When will Video Card prices come down lower?

The money is there. And the ability to use GPUs as a second processor is only more tempting now. I have hope, as long as Intel doesn't pull the plug too soon.

I think they're giving it all they have, by Intel standards. By that I mean if this launch doesn't gain a foothold, they'll drop it. But at the same time, they will pay whatever it takes to get that foothold, and it's clear that they are all in so far, from their node selection to their segmentation to their stated additional features.

How weird is it to root for Intel to knock back AMD and Nvidia? Like it's 20 years ago, except it's also like taking on then-ATI.
 
I'm very close to people who work at Intel. Intel also has a bad habit of not seeing things through and dropping projects before they ever bear fruit, spending literally millions only to break even. Larrabee is a PERFECT example of something that should have received continued investment, given that they are doing Alchemist now.

The money is there. And the ability to use GPUs as a second processor is only more tempting now. I have hope, as long as Intel doesn't pull the plug too soon.
Intel continued developing Larrabee for 3 generations after. They made them into decent x86 processors after developing past the coprocessor stage.

I still run the successors to Larrabee and they hold their own for many HPC loads.
 
Intel discrete graphics is -- debatably -- two or three generations out after its first failure to precipitate.
I was talking specifically about the revisions to the arch that made Larrabee.

An actual dGPU for gaming is something Intel hasn't seriously tried for a very long time. However, they have been developing iGPUs for long enough that I wouldn't say they are completely foreign to it.
 
An actual dGPU for gaming is something Intel hasn't seriously tried for a very long time.

They've been trying it non-stop for like, seven or eight years? From concept to where we are, that feels about right. Plus or minus a year or so.

This is the big, last hurrah. Unless there's a sparkle, a gleam of hope, the same way that Pentium M paved the way for Core. But they're already way more invested, and I don't think it's a glitter, but rather, a beacon.
 
They've been trying it non-stop for like, seven or eight years? From concept to where we are, that feels about right. Plus or minus a year or so.

This is the big, last hurrah. Unless there's a sparkle, a gleam of hope, the same way that Pentium M paved the way for Core. But they're already way more invested, and I don't think it's a glitter, but rather, a beacon.
Nothing was stopping them from fabbing a capable piece of silicon for years. Intel can make a GPU; there just wasn't any market for an underperforming consumer GPU up until now. Past attempts were software solutions on HPC accelerators, fairly different from anything a consumer would want. (And those HPC accelerators were fairly capable, even making their way into a handful of supercomputers.)
 
Nothing was stopping them from fabbing a capable piece of silicon for years.

Uh, time was. It takes five to seven years to take silicon from concept to production.
 
Intel's gonna have to screw up bad to not sell GPUs in this market. If performance lags, they can just drop prices til they sell. Their timing is perfect.
 
Exactly 3 more years: 2025. Covid caused 3 years of stuff to become harder to get. If it ends at the end of 2022, then it will take 2023 and 2024 to produce the stuff, so by 2025 everything will be on store shelves as it should be.
Don't think the crypto crash will resolve the gaming card shortage. Cards are needed everywhere. Everyone on this planet wants to upgrade and is waiting for prices to become normal. So take away the crypto miners using graphics cards and you will still have very high demand for cards. Even if the crypto market crashes a little, the miners will keep going, because they know prices always rebound after a drop. Crypto is here to stay forever. And the semiconductors and other parts needed to make gaming cards won't get 100% of manufacturing capacity, because other stuff needs to be manufactured too. You can't have a world with factories only focusing on gaming cards.
 
I'm very close to people who work at Intel. Intel also has a bad habit of not seeing things through and dropping projects before they ever bear fruit, spending literally millions only to break even. Larrabee is a PERFECT example of something that should have received continued investment, given that they are doing Alchemist now.

The money is there. And the ability to use GPUs as a second processor is only more tempting now. I have hope, as long as Intel doesn't pull the plug too soon.
Interesting about Intel. What with their product roadmaps, you might have thought that they had a 'strategic long-range' plan. Maybe not. I will make the controversial observation that tech companies run by techies don't always make the best business decisions, just the best technology decisions. (I saw this up close and personal at Novell.) As long as the best technology decisions produce a good outcome, those leaders look good. Otherwise, you have companies like Novell as a sad example.
 
Interesting about Intel. What with their product roadmaps, you might have thought that they had a 'strategic long-range' plan. Maybe not. I will make the controversial observation that tech companies run by techies don't always make the best business decisions, just the best technology decisions. (I saw this up close and personal at Novell.) As long as the best technology decisions produce a good outcome, those leaders look good. Otherwise, you have companies like Novell as a sad example.

Plans have to adapt to match marketplace demand. Long-term roadmaps are kind of a sick joke, because who could have predicted this shortage in the first place? There are obvious bad ideas like Itanium; the world wasn't ready then, and is probably only barely ready now, to drop x86 compatibility.

I'll put it this way. I have no strong desire to ever work for intel.
 
What with their product roadmaps, you might have thought that they had a 'strategic long-range' plan. Maybe not.

They are attempting to salvage what would have been a pretty significant set of contracts; this originally started out as a dedicated video streaming card for, I want to say, Amazon, IIRC. They were going to build server farms for streaming video services.

But it's obviously a flexible design, since they're clearly going to tailor it to a totally different market and purpose.

Intel is in a pretty unique position to do stuff like this every once in a while. Most of the time, the way these things are locked into place, if a product stinks, companies just have to live with their decisions. Like AMD with Bulldozer.
 
'Gamers' will not like this, but the answer is NEVER. Any other answer is a waste of time.

That's not a realistic answer whatsoever. It's whatever the market will accept. Saying "NEVER" is almost guaranteed to be incorrect, as you don't have any evidence to support your claim. The chip shortage isn't going to be around forever. Prices should come back to some sort of reason. It's gotten to the point where the US government is interceding and saying "We need more chips."
 
That's not a realistic answer whatsoever. It's whatever the market will accept. Saying "NEVER" is almost guaranteed to be incorrect, as you don't have any evidence to support your claim. The chip shortage isn't going to be around forever. Prices should come back to some sort of reason. It's gotten to the point where the US government is interceding and saying "We need more chips."
There is not only institutional but also public-sector backing of crypto now, and that means never... just more words to arrive at the same conclusion. You no longer 'buy' a GPU but 'aspire to acquire' one at a decent price. Nuff said, merry Xmas.
 
 
There is not only institutional but also public-sector backing of crypto now, and that means never... just more words to arrive at the same conclusion. You no longer 'buy' a GPU but 'aspire to acquire' one at a decent price. Nuff said, merry Xmas.

Bullshit. Based on what? You need to cite evidence if you want to say "never". You're making insane remarks with very little supporting evidence. "Crypto will keep pushing" is not evidence. You are trying to backtrack without any actual proof that this is a "never" scenario.

I cited actual things, if you had the patience to actually read: https://www.windowscentral.com/chipmakers-share-data-us-government-combat-global-chip-shortage, "Congress could fund the bipartisan CHIPS for America Act".

How many years would your "never" scenario last for? What market influences have to come and go to prove your idea?

If you want to talk trash, that's fine. But talking trash to me? I will verbally shred you if you can't put up any hard evidence.

Game, Set, match.
 
Bullshit. Based on what? You need to cite evidence if you want to say "never". You're making insane remarks with very little supporting evidence. "Crypto will keep pushing" is not evidence. You are trying to backtrack without any actual proof that this is a "never" scenario.

I cited actual things, if you had the patience to actually read: https://www.windowscentral.com/chipmakers-share-data-us-government-combat-global-chip-shortage, "Congress could fund the bipartisan CHIPS for America Act".

How many years would your "never" scenario last for? What market influences have to come and go to prove your idea?

If you want to talk trash, that's fine. But talking trash to me? I will verbally shred you if you can't put up any hard evidence.

Game, Set, match.
Someone clearly needs to chill on Christmas 😂 Or do you imagine yourself a real 'Viscount'? :) Feel free to verbally shred into a void, for there is a very handy ignore button... activated.
 
 


Intel expects to beat Hopper MCM with DG3 tiles in 2023. That means that not only do they expect to do well in 2022, they are using 2022 to develop a foothold in terms of market share and mindshare, focusing on drivers, not hardware, for the time being. That sounds like a super-confident place to be.

Also, if this is right, they don't even think of AMD as a competitor, not because of product quality, but because of Intel's belief that they can out-produce.

The only way Intel can beat out AMD and Nvidia in terms of production is if they beat them in terms of price. Because if AMD and/or Nvidia have better products at lower prices, people will walk on Intel unless they're desperate. That's not how Intel rolls, never has been, never will be.

Expect prices to drop as soon as Intel starts shipping desktop parts, so maybe late Q2, early Q3?
 


Raja Koduri is saying Arc isn't intended to make a profit, possibly confirming that they're selling at a loss or just breaking even.
 
I like to think that Nvidia is trying to be like OPEC. When the price of oil gets too low, they agree on cutbacks in production.

And Nvidia and AMD don't have to meet in some hotel room to "fix" prices. They don't have to. They just watch each other's behavior, just like the US airlines.

KazeoHin is right on all points in this post. These new prices may put a crimp on gamers' ability to upgrade, but the game software companies would have to pressure Nvidia (and AMD) to reduce prices. In other words, hahahahaha.
That really doesn't make much sense considering their cards are selling at nearly 3 times what they priced them at when they launched. The 3070 was touted as the greatest card of all time because it had elite performance at a mid-level price point. They literally cannot manufacture the cards fast enough to bring down the price.
 
That really doesn't make much sense considering their cards are selling at nearly 3 times what they priced them at when they launched. The 3070 was touted as the greatest card of all time because it had elite performance at a mid-level price point. They literally cannot manufacture the cards fast enough to bring down the price.
Q3 sales numbers showed the highest production numbers ever for AMD and Nvidia. Yet we still have people, even here, who think that the world gained that many PC enthusiasts and gamers overnight, which explains the demand.
 

What is considered "normal" these days?

Aside from a bit of a chip shift compared to Turing (i.e. the 3080 getting the 102 die instead of the 104), the MSRPs themselves are largely still the same crap from the Turing era that nobody liked. So even if we eventually go back to having products available at their MSRPs, they are not great and will still be overpriced compared to what we used to get. I am not sure anything really goes back to "normal" ever again.

If you are asking when prices will go back to MSRPs, I am just going to say don't hold your breath. Companies like Nvidia are perfectly happy selling their cards to miners for as much as they can until that ship sails. I feel like I have heard that ETH was supposed to move away from proof of work for years now, and yet here we still are. I'll believe it when I see it...

Between miners, supply chain, etc., there's no easy answer here, or an easily perceivable timeframe to give one.
 
With inflation, the increasing complexity of chip designs year after year, and the rising cost of the equipment to make said chips, I do not think prices are ever going to go down.
 
Intel Arc 512/16GB and 384/12GB are expected to launch in March next year, to compete with the 3070 series and 3060 series, respectively.
 
What is considered "normal" these days?

Aside from a bit of a chip shift compared to Turing (i.e. the 3080 getting the 102 die instead of the 104), the MSRPs themselves are largely still the same crap from the Turing era that nobody liked. So even if we eventually go back to having products available at their MSRPs, they are not great and will still be overpriced compared to what we used to get. I am not sure anything really goes back to "normal" ever again.

If you are asking when prices will go back to MSRPs, I am just going to say don't hold your breath. Companies like Nvidia are perfectly happy selling their cards to miners for as much as they can until that ship sails. I feel like I have heard that ETH was supposed to move away from proof of work for years now, and yet here we still are. I'll believe it when I see it...

Between miners, supply chain, etc., there's no easy answer here, or an easily perceivable timeframe to give one.
I think that gamers are going to face the fact that games are no longer the central focus for GPU makers. It's the miners, the AI developers, the datacenters, the advanced "creative" programs, etc., and all those are business uses which have pretty deep pockets for what they need to run their business.
 
I think that gamers are going to face the fact that games are no longer the central focus for GPU makers. It's the miners, the AI developers, the datacenters, the advanced "creative" programs, etc., and all those are business uses which have pretty deep pockets for what they need to run their business.
Sadly you're right
 
I think that gamers are going to face the fact that games are no longer the central focus for GPU makers. It's the miners, the AI developers, the datacenters, the advanced "creative" programs, etc., and all those are business uses which have pretty deep pockets for what they need to run their business.
I've seen Ubisoft, Bethesda, and similar companies promote Nvidia's and AMD's high-profile cards. Real-time ray tracing was improved a lot in the 3000 series, so there is work being done for games. How much focus it gets, I don't know.
LHR cards proved ineffective at what they were meant to do, since their prices skyrocketed anyway. LHR methods should be improved in RDNA3 and the 4000 series so that gamers get better products.
 
As long as fab capacity is constrained Intel is not going to make any difference. Intel is using TSMC, not its own fabs for GPU.
 
As long as fab capacity is constrained Intel is not going to make any difference. Intel is using TSMC, not its own fabs for GPU.
Right, and we (the free world) should invade and conquer China to ensure they don't attack Taiwan. Or tow the island to California. Either one should work.

Seriously hoping I can get an upgrade for at least one of my 580s that doesn't cost over 450 USD. When I bought them I figured mGPU would be a thing in the near future; seems not, but that would have been acceptable as well.
 
As long as fab capacity is constrained Intel is not going to make any difference. Intel is using TSMC, not its own fabs for GPU.

Not the same node. And while there are substrate bottlenecks, that's not hurting the small nodes, it's hurting the big ones.
 
Removing crypto from the picture, we still have supply chain issues, worker shortages, inflation, and increasing costs in chips as the node process shrinks. Should things get back on track in 2023, $1500 for a 4080 Ti is the absolute best-case scenario I'd expect. But realistically I'm expecting something more along the lines of $1800 MSRP for one.

Seeing the internal state of the USA also leads me to believe we are beyond peak consumerism - as in, we are not getting cheap stuff at the expense of the rest of the world much longer.
 
As long as fab capacity is constrained Intel is not going to make any difference. Intel is using TSMC, not its own fabs for GPU.
Intel is building 2 new factories in Arizona, so maybe they will double or triple their capacity.

 
Intel is building 2 new factories in Arizona, so maybe they will double or triple their capacity.


For more 13.9nm +++++++++

Honestly I'd still put a 400W mid-range GPU in my PC if they could keep the bloody price down.
 
Intel Arc 512/16GB and 384/12GB are expected to launch in March next year, to compete with the 3070 series and 3060 series, respectively.
And does that include a robust program to assure compatibility with games and other applications benefiting from a GPU? You might think that if Intel were working with lots of game software companies on this issue, then we would have heard about it already. NDAs are never 100% airtight.
Seeing the internal state of the USA also leads me to believe we are beyond peak consumerism - as in, we are not getting cheap stuff at the expense of the rest of the world much longer.
The US economy is more wobbly than anyone in the government wants to admit: still-high unemployment, now inflation, and persistent supply chain issues. I'm not sure that the government has a grip on all that. But the "culture wars" surely are a handy way to avoid all these issues.
 
Pretty sure Nvidia doesn't care if you can get a card or not. They are selling everything they have in stock and making a killing.

You can't get one? Oh well. Sub to GeForce Now.

It's a win-win for them.
 
Economically, lack of supply can cause people to move to alternatives, even radical ones (like not gaming at all, for example).

So, a boon for Nvidia and AMD for now, but if it goes on, eventually gaming will be left to the very elite and even then, a small portion thereof.

But, things will probably correct before that occurs (hoping).
 