Hardware video transcode on Sandy Bridge?

Valset

Supreme [H]ardness
Joined
Apr 4, 2008
Messages
6,533
If this should go somewhere else, please move it. Just read about this at SA. As it's a pretty obviously anti-Nvidia article, I was hoping someone somewhere else may have heard about it? It seems a bit far-fetched to me for Intel to implement this given how valuable die space is. Then again, it would make for some awesome benchmarks, as video encoding is one of the bigger benchmarks that people use.

Edit: here is the link if anyone is interested: http://www.semiaccurate.com/2010/09/11/sandy-bridge-has-dark-secret/
 

IDF is going on right now and yes SB can do it. See, e.g.: http://www.anandtech.com/show/3916/intel-demos-sandy-bridge-shows-off-video-transcode-engine Hell, the title of that article is currently: "Intel Demos Sandy Bridge, Shows off Video Transcode Engine." I would pair a SB CPU with a high-end GPU that transcodes even faster, so I personally don't care that much, but I could see SB being enough for many users who only do some video editing and playing back movies, without actually gaming.
 
Not a bad feature, so long as it's not locked into a single codec (although h.264 wouldn't be a bad choice at this point). GPU-based transcoders have the advantage of being software, so you can reconfigure them when a new codec becomes popular.

I can't wait to get my hands on a Sandy Bridge CPU :)
 
Thanks, that is what I was looking for. No wonder Charlie was dancing with glee. As far as the consumer market is concerned, this is a pretty sharp stick up Nvidia's ass on the CUDA side.

Edit: to add to that, Boxee just kicked Nvidia to the curb for an Intel SoC. Maybe opening that can of whoop-ass was not advisable; Intel did not take kindly to it.
 

Yeah I saw that too, Tegra 2 may end up winning designs early on but then losing the sale, much as Charlie predicted months ago (albeit for different reasons). Charlie isn't a bad source of info, because he is well-connected in the industry; you've just got to account for his bias.

Edited to add: There will be dedicated logic for Blu-Ray decoding, and the SB GPU does NOT run DX11. Not that such a slow GPU is likely to run games that could take advantage of DX11-only features like tessellation anyway. http://www.pcmag.com/article2/0,2817,2369092,00.asp
 

"We also got confirmation that Sandy Bridge will support both CPU and GPU turbo modes."

What's the point? :p
 
"We also got confirmation that Sandy Bridge will support both CPU and GPU turbo modes."

What's the point? :p

They are independent turbo modes so it won't hurt you. Basically the combined CPU and GPU TDP can't be over the specifications for more than a minute or so. If you have a discrete video card then the GPU will presumably be turned off and thus won't drag down the Sandy Bridge total TDP. For those without discrete video cards, the GPU can get a nice boost so long as the CPUs aren't very active. SB Turbo boost is great and a reason why I am waiting for SB and Ivy Bridge. (Earlier versions of turbo boost weren't as effective.)
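The shared-budget idea above can be sketched in a few lines. This is a minimal illustration of a proportional CPU/GPU power split, with invented numbers and an invented function; it is not Intel's actual power-management algorithm, and it ignores the time-windowed part (where the package may briefly exceed TDP for a minute or so).

```python
PACKAGE_TDP = 95.0  # watts; hypothetical package limit, not a real SKU spec

def turbo_budget(cpu_demand_w, gpu_demand_w, package_tdp=PACKAGE_TDP):
    """Split a shared package power budget between CPU and GPU.

    Whichever side is idle frees headroom for the other, which is why
    the integrated GPU can boost hard when the cores are quiet (and is
    simply powered off when a discrete card handles graphics).
    """
    total = cpu_demand_w + gpu_demand_w
    if total <= package_tdp:
        # Under budget: both sides run at full demand.
        return cpu_demand_w, gpu_demand_w
    # Over budget: scale both sides down proportionally to fit.
    scale = package_tdp / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# CPU-heavy load with a near-idle GPU: the CPU gets almost the whole budget.
cpu_w, gpu_w = turbo_budget(cpu_demand_w=90.0, gpu_demand_w=2.0)
```

The point of the proportional split is just to show that neither side has a fixed ceiling; the ceiling is the package, and idle blocks donate their headroom.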
 

Yes, but for us enthusiasts who overclock, crap is still just that: crap. Granted, going by the previews at Anand they are actually decent for once, but for the [H] demographic it is still pretty much irrelevant.
 
Charlie is excited about something, but I just don't get it.

GPU-assisted transcoding has so far been crap. x264 is still the king of quality transcoding, and that isn't about to change anytime soon.

And assuming SB has something new about it, well, it is just the same situation as with Nvidia: AMD won't use it, and you will have the fragmented situation all over again.

Then again, anything that harms the liars' and cheaters' business at Nvidia is good for tech.
 

Well, Nvidia is not doing well, so that explains Charlie's glee to start with. But this in particular removes one of the few things Nvidia has actually been successful with, at least in marketing, even on lower-end hardware and integrated graphics. Whether or not we (that is, you and me along with the [H] crowd) think much of it isn't the issue: it has been, to date, the only real consumer CUDA application that's gotten anywhere (read: the only one that is actually useful to the common user, or can actually be used as a real selling point). So now Intel comes along (maybe remembering that can of whoop-ass) and decides they need to look better in benchmarks, because their next-gen CPU doesn't actually do anything the last one didn't, sans being a little faster, and transcoding is an important benchmark. So they look better, add features, AND shove a sharp stick up Mr. Whoop-Ass. Nvidia is now having to try to market something that people will get for free with their new, faster CPU.

Just my take on it: it kills a couple of years of marketing push by Nvidia. Never insult someone whose neck is bigger than your head.

Edit for Cliffs:
GPU transcoding may not work well in practice, but it looked good in marketing, and Intel just killed that for Nvidia.
 
And, once transcode assist becomes a part of all Intel processors sold (80% of the x86 market), suddenly the folks maintaining x264 might be enticed to use this acceleration feature. This assumes the feature is flexible enough to be tweaked for the highest quality.

Accelerated transcoding is not inherently lacking in quality, it's just that the early tools are more focused on maximum performance rather than maximum quality.
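For what that choice looks like in practice: modern ffmpeg builds (much later than this thread) expose both the software x264 encoder and Intel's Quick Sync fixed-function encoder, the descendant of Sandy Bridge's transcode engine, so the quality/speed tradeoff is a per-job flag. The filenames are placeholders, and the hardware path requires an ffmpeg built with QSV support:

```shell
# Software encode with x264: slower, with fine-grained quality control (CRF mode).
ffmpeg -i input.mkv -c:v libx264 -crf 20 -preset slow out_software.mp4

# Hardware encode via Intel Quick Sync: much faster, but the
# rate/quality controls are coarser than x264's.
ffmpeg -i input.mkv -c:v h264_qsv -global_quality 20 out_quicksync.mp4
```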
 
Intel integrated graphics have always sucked and always will compared to discrete graphics, and they drive up the heat on the chipset... great for cheap systems, but never good for overclocking and performance...
 