Gamers Promised Better Graphics Thanks To Nano Technology Breakthrough

rgMekanic

[H]ard|News
A new technology from the Australian National University promises ultra-fast rendering on gaming consoles. Senior ANU researcher Professor Dragomir Neshev says that currently, graphics are being "bottlenecked" by the copper wires used to transmit the data. So an international team of scientists has designed a tiny antenna, 100 times thinner than a human hair, to transmit data between the processors in a gaming console.

So that's the problem. Stop upgrading now, it isn't your hardware, it's the 3" of copper traces between your CPU and GPU. I'm honestly at a loss for words about every part of this article.

In a similar way to the NBN (National Broadband Network), one can increase data transmission to improve the communication between the different chips in the computer, so we can see better high-resolution rendering and, in general, a better gaming experience.
 
It might be that the issue is the article itself and not the technology; it seems it might be a new optical interconnect.
 
Don't pander to gamers to push interest in science while knowing damn well that interconnect bandwidth is nowhere near the main bottleneck holding back console performance.

Custom CPU-APU setups and shared system/GPU memory mean consoles aren't really starved for bandwidth, and going forward, HBM2 using plain old copper interconnects becomes an option (although a very expensive one).

No, consoles are held back by price constraints and are not the market for high-end interconnect innovations, and they are never going to be running on the most cutting-edge process tech anyway.

By the time consoles have 7nm CPUs and GPUs, desktop components might start to need a better interconnect fabric, but it isn't accurate to say consoles need this tech to move graphics forward. Consoles need UHD 4K, HDR, and variable-refresh screen compatibility, and that will be possible soon enough as the standard on high-end PCs without moving to optical or RF interconnects.
 
If they could go all optical, that'd be cool. Not sure how I feel about RF and potential interference. The speed of electricity in copper is anywhere from 0.7 to 0.97 of the speed of light (it varies based on several considerations). At the distances inside a computer or console, honestly, signal propagation delay is a problem even lower on the totem pole than interconnect bandwidth, and as mentioned above, that's not even really a problem.
 
How much of a delay are we really seeing between GPU/CPU/memory/output? Does anyone know the real times we are dealing with, which they would be improving on? I don't see how a computer would be any different from current-generation console platforms. It would be nice to have a reference to start the discussion other than the 0.7-0.97 speed-of-light figure. The number of times a GPU is queried by the CPU to perform work in a minute would also be helpful.
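For a concrete starting reference, here's some quick napkin math in Python; the trace length, velocity factor, and the 2 GHz clock are assumptions picked just to put numbers on it:

```python
# Back-of-envelope: propagation delay over a short PCB trace vs. one
# clock cycle. All figures are illustrative assumptions.
c = 299_792_458          # speed of light in a vacuum, m/s
velocity_factor = 0.7    # low end of the 0.7-0.97c range quoted above
trace_length_m = 0.0762  # the infamous 3 inches of copper, in meters

prop_delay = trace_length_m / (c * velocity_factor)
cycle = 1 / 2e9          # one cycle of an assumed 2 GHz clock

print(f"propagation delay: {prop_delay * 1e9:.2f} ns")  # ~0.36 ns
print(f"one 2 GHz cycle:   {cycle * 1e9:.2f} ns")       # 0.50 ns
print(f"delay in cycles:   {prop_delay / cycle:.2f}")   # ~0.73
```

So the trip over the trace costs less than one clock cycle, which is why latency over a few inches of copper sits so far down the list of problems.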
 
So we can expect interference across the monitor whenever we get a call on a cell phone or start up the microwave, just like on old televisions when power-hungry appliances were used? That's awesome!!
 
Great, so all I need to do is get ahold of this tiny antenna & my 386 can run Crysis?......Sweet! :cool:
 
Inductance and capacitance do impact the max bandwidth of a copper trace. I don't know if current chips/boards are anywhere near that limit.
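For anyone curious where that limit sits, here's a crude RC sketch; the resistance and capacitance are textbook ballpark assumptions, not numbers for any real board:

```python
import math

# Model the driven trace as a simple RC low-pass filter.
r_ohm = 50        # assumed driver + trace series resistance, ohms
c_farad = 2e-12   # assumed total load capacitance, 2 pF

tau = r_ohm * c_farad            # RC time constant
f_3db = 1 / (2 * math.pi * tau)  # -3 dB bandwidth of the RC low-pass

print(f"tau  = {tau * 1e12:.0f} ps")    # 100 ps
print(f"f3dB = {f_3db / 1e9:.2f} GHz")  # ~1.59 GHz per pin
```

That single-digit-GHz per-pin ceiling is why high-speed buses lean on equalization and wide parallel links rather than raw copper speed.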

Given some of the mixed terminology in the article, I have to wonder if it is a bunch of BS designed to induce some eager venture-capital investment, so the authors can try to implement a solution only to announce they have run into an insurmountable problem, declare bankruptcy, and retire on the millions they paid themselves in salary during the effort.
 
If the travel time adds to the time instructions take to process before the next batch is forced out over copper, then I think removing that travel time would be good.

I wonder how large the physical communication footprint is as a share of total execution time. 5%? 20%?
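As a starting point, here's a toy estimate; the per-frame upload size and the bus bandwidth are pure guesses for illustration:

```python
# Share of a 60 FPS frame budget spent on CPU->GPU transfers.
frame_budget = 1 / 60          # ~16.7 ms per frame
bytes_per_frame = 16 * 2**20   # assume 16 MiB of uploads per frame
bus_bandwidth = 16e9           # assume ~16 GB/s (PCIe 3.0 x16 class)

transfer_time = bytes_per_frame / bus_bandwidth
print(f"transfer time:  {transfer_time * 1e3:.2f} ms")        # ~1.05 ms
print(f"share of frame: {transfer_time / frame_budget:.1%}")  # ~6.3%
```

Under those guesses the answer lands closer to the 5% end than the 20% end, but it scales directly with how much data you push per frame.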
 

In a similar way to the NBN (National Broadband Network), one can increase data transmission to improve the communication between the different chips in the computer, so we can see better high-resolution rendering and, in general, a better gaming experience.

This is partially true. Intel gave up on copper interconnects in favor of germanium alloys to speed up connections.

However, the bus between memory and the CPU has rarely ever been the bottleneck, even on consoles.
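One way to put numbers on that is a roofline-style check; the throughput and bandwidth figures below are assumed console-class ballparks, not specs for any real machine:

```python
# At what arithmetic intensity does a workload stop being limited by
# the memory bus and start being limited by compute?
peak_flops = 6e12       # assume ~6 TFLOPS of shader throughput
mem_bandwidth = 320e9   # assume ~320 GB/s of shared GDDR bandwidth

ridge = peak_flops / mem_bandwidth  # FLOPs per byte
print(f"ridge point: {ridge:.1f} FLOPs/byte")  # ~18.8
# Work below ~19 FLOPs/byte is memory-bound; above it, extra bus
# bandwidth buys nothing because compute is the limiter.
```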
 
Quantum computing really can't get here fast enough...

I can't wait to see the adoption rate of quantum computing for games. I realistically don't see it happening in the next 30 years. We've been stuck at a realistic maximum of 3 threads for a decade now. Lol.
 
Nanotechnology will transmit the data faster than compute? Big deal, compute is now the bottleneck. That reasoning, plus cost alone (smirks), is why we are still on copper (which is highly conductive and offers minimal resistance, right?). This bit of info seems to offer nothing but buzzwords, and for me, not much logical information to digest.
 
Maybe someone should tell them that electricity travels at the speed of light (it's a wave). The electricity generated at your nearest power plant traveled at the speed of light to your house. The copper traces on the circuit board are not slow.
 
Reading that article was fucking painful. It was like listening to my Grandma try to explain a new technology she heard about from a friend on Facebook.
 
Interconnects are a huge deal no matter the application. Otherwise, why bother having specifications and distances all mapped out for everything on a computer motherboard? Memory interconnects, PCIe interconnects, interconnects off the CPU and GPU: they are all super critical to the performance of a device, regardless of what device they are in. *Smirks* These complaints are why we cannot have nice advanced technology, such as hoverboards, flying cars, and speaking to whatever it is we need to have function.

We have advanced very little in the last 25 years, other than making things smaller, prettier, and faster. Otherwise, it is all the same as it was.
 
Nanotechnology will transmit the data faster than compute? Big deal, compute is now the bottleneck. That reasoning, plus cost alone (smirks), is why we are still on copper (which is highly conductive and offers minimal resistance, right?). This bit of info seems to offer nothing but buzzwords, and for me, not much logical information to digest.

This is true. And silver is even better than that. HOWEVER, the problem is when you go from a semiconductor to copper: it causes conductivity problems and heating at the junction because the metals have different carrying potential. This is why Intel went with germanium interconnects.
 
That article was terrible to read and seems to have missed the point of what they were proposing.

Optical has been looked at for future PCIe and other interconnect signaling. I didn't dig into the article that much, but what they are likely proposing is transmitting multiple signals over the optical link at different wavelengths, similar to multi-mode fiber or coaxial cable carrying many channels. It keeps devices electrically isolated, ideally simplifies the traces, and in turn provides far more bandwidth. The concern is being able to actually send/receive the data at the rates being discussed with relatively low-power devices on chips while being able to decode the channels in a traditional fabrication process.
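A rough sketch of why the wavelength-multiplexing angle is attractive; the channel count, symbol rate, and modulation below are all assumptions for illustration:

```python
# Aggregate bandwidth of a WDM-style optical link scales linearly
# with the number of wavelength channels.
channels = 16          # assumed wavelengths on one waveguide
symbol_rate = 25e9     # assumed 25 Gbaud per channel
bits_per_symbol = 2    # assumed PAM-4 style signaling

aggregate = channels * symbol_rate * bits_per_symbol
print(f"aggregate: {aggregate / 1e9:.0f} Gb/s")  # 800 Gb/s on one link
```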
 
This is true. And silver is even better than that. HOWEVER, the problem is when you go from a semiconductor to copper: it causes conductivity problems and heating at the junction because the metals have different carrying potential. This is why Intel went with germanium interconnects.

Yep. And differing metals have differing leakage, drain, and capacitance issues that an optical or RF interconnect would not. This affects operating power immensely and dramatically lowers bus speeds due to power. Look at the immense power drain of GDDR5, or for that matter HBM, interconnects. The interface alone, at that bandwidth, costs a relative fortune in TDP.

The article is grossly under-researched and misleading, but the speed of light has nothing to do with bus speeds. Lower heating and lower power draw = higher bandwidth or more packets.
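To put the power point in numbers: interface watts are just energy per bit times bits per second. The pJ/bit figures below are commonly cited ballparks, treated here as assumptions:

```python
# Interface power = energy per bit * bits per second.
bandwidth_bps = 320e9 * 8   # a 320 GB/s memory interface, in bits/s

for name, pj_per_bit in [("GDDR5-class", 20e-12), ("HBM-class", 7e-12)]:
    watts = bandwidth_bps * pj_per_bit
    print(f"{name}: {watts:.1f} W for the interface alone")
# GDDR5-class: ~51 W; HBM-class: ~18 W at the same bandwidth.
```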
 
This is true. And silver is even better than that. HOWEVER, the problem is when you go from a semiconductor to copper: it causes conductivity problems and heating at the junction because the metals have different carrying potential. This is why Intel went with germanium interconnects.
Even so, this data works through an interconnect that is... wireless? Ummm...? Please explain how this can travel other than via magnetic poles, as this "REBELS" against my reasoning of physics. As electrons and most teeny particles are "chaotic" in their natural state, charged or not, they need to be shuttled properly. How could this work like Australia's NBN? "The National Broadband Network (NBN) is an Australian national wholesale open-access data network project with both wired and radio communication components, being rolled out and operated by NBN Co Limited (nbn™)" (Wikipedia definition). :) Reading further into the "network", THEY AREN'T EVEN ON FIBER YET... This whole article is full of holes.
 
Even so, this data works through an interconnect that is... wireless? Ummm...? Please explain how this can travel other than via magnetic poles, as this "REBELS" against my reasoning of physics. As electrons and most teeny particles are "chaotic" in their natural state, charged or not, they need to be shuttled properly. How could this work like Australia's NBN? "The National Broadband Network (NBN) is an Australian national wholesale open-access data network project with both wired and radio communication components, being rolled out and operated by NBN Co Limited (nbn™)" (Wikipedia definition). :) Reading further into the "network", THEY AREN'T EVEN ON FIBER YET... This whole article is full of holes.

The tech in the article specifies wired connections; the word "antenna", I assume, refers to some mechanism to detect photons/light in the interconnect.
 
The tech in the article specifies wired connections; the word "antenna", I assume, refers to some mechanism to detect photons/light in the interconnect.
So this "nanotechnology" (https://en.wikipedia.org/wiki/Nanotechnology): "Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, microfabrication, molecular engineering, etc.[4] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale."

Which, by the way, is VERY expensive research, and it wants to be fueled by gaming too? I don't see how nanotech has anything to do with speeding anything up, even controlled wirelessly within an interconnect. It still comes back to the same point: nanotechnology will transmit the data faster than compute? Big deal, compute is now the bottleneck.

Here's some more stuff to keep the people wearing tin hats busy: https://en.wikipedia.org/wiki/Grey_goo

Grey goo (also spelled gray goo) is a hypothetical end-of-the-world scenario involving molecular nanotechnology in which out-of-control self-replicating robots consume all biomass on Earth while building more of themselves,[1][2] a scenario that has been called ecophagy ("eating the environment", more literally "eating the habitation").[3] The original idea assumed machines were designed to have this capability, while popularizations have assumed that machines might somehow gain this capability by accident.

Self-replicating machines of the macroscopic variety were originally described by mathematician John von Neumann, and are sometimes referred to as von Neumann machines or clanking replicators. The term gray goo was coined by nanotechnology pioneer Eric Drexler in his 1986 book Engines of Creation.[4] In 2004 he stated, "I wish I had never used the term 'gray goo'."[5] Engines of Creation mentions "gray goo" in two paragraphs and a note, while the popularized idea of gray goo was first publicized in a mass-circulation magazine, Omni, in November 1986.[6]

I'd put the Doomsday Clock up a whole minute with this tech ready to be hacked in its infancy. And you KNOW there's an idiot just waiting to manipulate it... Anyway, here's more information to digest, via market promotion.

 