Nintendo's Backward Compatibility Problem... | MVG

The NES Classic is just an emulation machine. All the classic NES, SNES, and N64 games are running on emulators on the Switch.


I'm sure anyone can, but is it because it looks worse on original hardware or because of some magical experience that you can only feel?

Again, you say this but what evidence do you have besides feels?

If it's emulating the transistors then that would explain why it's really slow. Also the end result won't change, just the frames per second. This argument breaks down quickly when you discuss 3D rendering as newer hardware is just fantastically better at it.

subjective feeling is important, and it's just my opinion that emulation blows

on the last point about the transistor-level implementation, this is referred to as Simulation

it's nearer to the hardware simulation done on the big machines from Cadence, Synopsys, etc. that Nvidia / AMD and others use

if it were merely "emulation" then that'd be an unreliable means of design verification for their ASICs, but no, it's Simulation at the transistor level
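To make the emulation-vs-simulation distinction a bit more concrete, here's a toy sketch in C (a made-up half adder built from NAND gates, nothing to do with any real netlist or EDA tool): gate-level simulation means evaluating every individual gate for every input change, which is why it's faithful enough for design verification and also painfully slow compared to just reimplementing the chip's observable behaviour in an emulator.

#include <stdio.h>
#include <stdbool.h>

/* Toy gate-level simulation of a half adder built only from NAND gates.
   A real simulator walks a netlist of millions of gates (or transistors)
   every cycle, which is why it is orders of magnitude slower than
   behavioural emulation. */
static bool nand2(bool a, bool b) { return !(a && b); }

static void half_adder(bool a, bool b, bool *sum, bool *carry)
{
    /* XOR and AND expressed purely as NAND gates. */
    bool n1 = nand2(a, b);
    bool n2 = nand2(a, n1);
    bool n3 = nand2(b, n1);
    *sum   = nand2(n2, n3);   /* a XOR b */
    *carry = nand2(n1, n1);   /* a AND b */
}

int main(void)
{
    for (int a = 0; a <= 1; a++) {
        for (int b = 0; b <= 1; b++) {
            bool s, c;
            half_adder(a, b, &s, &c);
            printf("a=%d b=%d -> sum=%d carry=%d\n", a, b, s, c);
        }
    }
    return 0;
}

Scale that inner loop up to millions of gates per clock cycle and you can see why nobody ships a transistor-level simulator as a consumer emulator.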
 
Doesn't matter; it matters how the code is built. If the code is built such that it is highly architecture-specific, it won't run on the new system. Try to make a Maxwell-only Nvidia driver work on an Ampere chip; it won't happen. The games would have to be ported. Could they be? Of course, but it would require a rebuild, presuming the binaries are actually how he says they are, which I can believe because:
This is solvable in a couple of ways.

At the least, Nvidia could develop a translation layer. Tegra is their hardware, so they know all of the registers, etc., so at the least they could translate it over to a newer chip. And the newer hardware will be so much faster that the overhead shouldn't keep Switch games from running at least as well as native, and there should probably even be room for performance improvements.
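Purely as an illustration of the translation-layer idea (the opcodes and function names below are hypothetical, not real Tegra or NVN commands), the core of such a layer is just decoding the old chip's command stream and re-issuing each command through the new chip's driver:

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Hypothetical opcodes from the "old" GPU's command stream. */
enum old_cmd { OLD_SET_TEXTURE = 1, OLD_DRAW_TRIS = 2, OLD_PRESENT = 3 };

/* Stand-ins for the "new" chip's driver entry points. */
static void new_bind_texture(uint32_t handle) { printf("bind tex %u\n", handle); }
static void new_draw(uint32_t tri_count)      { printf("draw %u tris\n", tri_count); }
static void new_present(void)                 { puts("present"); }

/* Walk a recorded command buffer and translate each packet.
   Each packet here is just an opcode word followed by one argument word. */
static void translate(const uint32_t *cmds, size_t n)
{
    for (size_t i = 0; i + 1 < n; i += 2) {
        switch (cmds[i]) {
        case OLD_SET_TEXTURE: new_bind_texture(cmds[i + 1]); break;
        case OLD_DRAW_TRIS:   new_draw(cmds[i + 1]);         break;
        case OLD_PRESENT:     new_present();                 break;
        default:              printf("unknown opcode %u\n", cmds[i]); break;
        }
    }
}

int main(void)
{
    const uint32_t frame[] = { OLD_SET_TEXTURE, 7, OLD_DRAW_TRIS, 128, OLD_PRESENT, 0 };
    translate(frame, sizeof frame / sizeof frame[0]);
    return 0;
}

The real work is in covering every command and matching the old chip's timing and quirks, but structurally it's this kind of dispatch.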

Even better, Nvidia could do like AMD did for the PS5 and build a new, custom chip that includes a hardware fallback mode for compatibility with software meant for the previous hardware.

It remains to be seen if the next Nintendo console will have a more or less off-the-shelf solution, like the Switch did, or if they will work with Nvidia to create a new, custom piece. With the massive success of the Switch, as well as the industry and editorial buzz about the Steam Deck, I would posit that Nintendo will more or less make a Switch 2, and Nvidia would probably be more willing to do something custom for that than for a wildly different console or handheld idea.
Nintendo, Microsoft, and Sony have all done emulation, and not always with great results.
I think it's probably due to a couple of things:

1. Lack of internal interest in really investing in it

2. Existing talent at these companies may not be great at doing emulation. But, related to point #1, these companies may not be interested in seeking out talent that could single-handedly improve their internal emulation efforts. Microsoft, of course, is the exception here, as their effort to make Xbox 360 games compatible with newer hardware was pretty fantastic.
 
Even better, Nvidia could do like AMD did for the PS5 and build a new, custom chip that includes a hardware fallback mode for compatibility with software meant for the previous hardware.
It sort of is a custom chip. Because of the Nvidia hack, we can all be relatively sure it's using the Nvidia Tegra T239, a variation of the NGX Orion TE980-M, but with the CPU configuration taken from 8,1,4 down to just 8, with the extra juice used to clock up the GPU side.
That is the same tweak used by the Tegra X1 variant in the Switch, where they took the 4,4 CPU configuration and disabled the four A53 cores to put the saved power into clock-speed increases, which is why the Switch SoC clocks higher than the Jetson Nano kits.
It's not guaranteed that this is the chip Nintendo is using, but it was one of the more interesting things learned from the Nvidia hack.
 
The funny thing about these comparisons is that literally 100% of the time I prefer the look of the sharp edged pixels over the blurry mess that used to be CRTs.

CRTs were awful then, CRTs are even more awful now.
even the GDM-FW900 Trinitron monitor ??

 
Trying out new gimmicks.

Seriously, Nintendo has ALWAYS been a gimmick company, sometimes it has worked, sometimes it has failed. The Switch is an example of a gimmick that worked: A portable game system you can dock and make a TV game system. The Wii U is an example of a gimmick that failed: A tablet in your controller. But this goes all the way back to the OG NES and its light gun, robot, power glove, and so on. They just love trying out new gimmicks and do so each generation.

While the smart money would be to make a Switch 2 that is just a next-gen, more powerful, better handheld, I could totally see them not doing that and instead doing something radically different, just because that is how Nintendo do.

They also don't plan well for this shit, as is indicated by this video. Smart design these days is to abstract your hardware from your software, even on an embedded device. This means if you want to, or have to, change the hardware later you can and it isn't a big deal. It also easily allows for things like backwards compatibility. It isn't hard to do with a modern embedded OS either. However, Nintendo didn't, and you have binaries tied to this specific chip as a result.
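For what it's worth, here's a minimal sketch in C of what that kind of abstraction looks like (the interface and backend names are made up for illustration): the game only ever calls through a small driver interface, so supporting a new chip means writing a new backend instead of recompiling every game.

#include <stdio.h>

/* A minimal graphics "driver" interface the game is written against. */
typedef struct {
    void (*init)(void);
    void (*draw_sprite)(int x, int y, int w, int h);
    void (*present)(void);
} gfx_driver;

/* Backend for chip A. On real hardware these would poke chip A's
   registers; here they just log what they would do. */
static void a_init(void)                       { puts("chip A: init"); }
static void a_draw(int x, int y, int w, int h) { printf("chip A: %dx%d sprite at (%d,%d)\n", w, h, x, y); }
static void a_present(void)                    { puts("chip A: present"); }
static const gfx_driver chip_a = { a_init, a_draw, a_present };

/* Game code never touches the hardware directly. */
static void game_frame(const gfx_driver *gfx)
{
    gfx->draw_sprite(32, 48, 16, 16);
    gfx->present();
}

int main(void)
{
    const gfx_driver *gfx = &chip_a;  /* a chip-B backend would slot in here */
    gfx->init();
    game_frame(gfx);
    return 0;
}

Statically linking game code directly against one chip's register map, as described below, is exactly what this kind of indirection avoids.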


It's not a matter of the OS forbidding it, but rather of them using DOS-era coding practices, as the video talked about. Basically, each game binary is statically linked with the graphics driver, and as such compiled for the specific graphics chip. It's not compatible with a new chip, even a fairly similar one, in the same way an old GPU driver won't work on a new GPU. This is similar to how DOS games did it back in the SVGA days before VESA: there was no abstraction layer, so you had to code the game for every graphics chip you wanted to support. A new chip wouldn't work with the game unless the game got redone to support it.

Blows my mind that anyone still does things this way in 2023.

Maybe they could use some sort of OS level emulation?

(If they have enough extra power to pull it off, that is)
 
Yep.
100% of the time.
Fuck CRTs. Heavy, blurry, flickering bullshit. All of 'em.
Always have been.
heh, to each their own, and while I do prefer 1:1 pixels for modern games from the last 20+ years, I wouldn't prefer it for games from the 1990s and earlier.
For classic low-res games from the 8-bit to early 64-bit era, CRT TVs were the best, and the scanline interleave 'filter' effect made games look so much better back in the day than they do on 1:1 pixel displays under emulation.
 
The funny thing about these comparisons is that literally 100% of the time I prefer the look of the sharp edged pixels over the blurry mess that used to be CRTs.

CRTs were awful then, CRTs are even more awful now.
I agree most of the time.

There are a handful of old titles that actually look better on real CRTs (especially those that used the additional CGA color modes that only worked on TVs), but yeah.

There was an argument for CRTs being superior before the industry adopted G-Sync/FreeSync. There isn't anymore.
even the GDM-FW900 Trinitron monitor ??


No, not even with the GDM-FW900.

That, and I don't think I ever want to experience the CRT tan / hot ears ever again.

The only exception would probably be if you are playing old 8bit titles.
 
what about Plasma
 
I still have my LG 7th-gen plasma, and it is a goddamned rock star!
Yeah, it doesn't have 4K, and because plasma's timings are different than LCD's, devices with cheap or crappy HDMI implementations don't work right on it (the image shifts), but for most content (not high-FPS gaming) it still looks great.
 

I liked my 65" Panasonic plasma as a home theater screen, but I don't think I would have used it for anything else.

I had it from 2012 to 2021, and then gave it away because it was just taking up space.

I was planning on setting up my home theater system with a new large LG OLED screen, but I just haven't gotten around to it yet, as the room in the finished basement I want it in is still full of crap from when we moved in that I don't know what to do with :p
 
For movies and TV, plasma looks great, and the color accuracy and brightness are excellent, especially for the average living room where you have windows and adjoining rooms and such. It works fine for console gaming, but for PC gaming, not a chance, and my ODroid N2+ hates it.
But for video content it is king; you need to spend a lot of money to buy an OLED that looks better than it overall for that, and the wife isn't interested. And since it's wall-mounted, it stays there until she says so, because I need the help to take it down... it is not light.
 
CRTs were crap then. To me, using a CRT nowadays is like a hipster thing. Current tech has so many advantages over CRT that it's crazy to think anyone would use one over something like an OLED. I find my old-school SNES and even NES games look better on my 65" OLED than on the dinky little 17" I used as a kid. Better even than the 36" Sony Trinitron TV I got eventually.
 

To be fair, early flat panels really sucked.

I too preferred CRTs to those god-awful '90s panels.

Input lag, ghosting, you name it.

But LCD screens have come a long, long way since then, and are far superior today.
 
I had an early LG one and it was beautiful for word processing and development. You could game, but only with VSYNC enabled and the frame rate locked to 30 fps; still, it fit in my apartment, which was barely 15'x12', so that was the deciding factor there.
 

Hmm. That certainly wasn't my experience, but I also didn't try every flat panel in the early days.

I recall liking them for desktop work, especially since they didn't exhibit the flicker at lower refresh settings like CRTs did, but I absolutely hated them for anything that involved movement, like games.

The first flat panel I didn't hate in games was my 1920x1200 Dell 2405FPW I bought in like 2005 or so.

I was really biased against flat panels for gaming back then, but I bought it after my old 22" Iiyama VisionMaster Pro 510, which I had spent like $1000 on in 2001, randomly died on me in 2005.

I was 25 back then, and had this notion that "I'm out of college, I'm an adult now, no way I'm going to be interested in games anymore", so I decided to try a widescreen LCD, and this Dell screen reviewed well at the time, using the same panel as Apple's screens at a fraction of the price.

To my surprise it actually wasn't terrible with games. They had improved a lot since the terrible little LCDs of the '90s. It wasn't as good as my Iiyama CRT had been, but it was - IMHO - totally usable.

Well, mostly. My 6800GT was woefully inadequate for 1920x1200 gaming. That, and me being a baby trying to be an adult, more or less caused me to withdraw from the PC hobby for almost 5 years. I didn't pick it back up again until late 2009.

In 2010 I made the same mistake again, getting a new high-resolution 30" 2560x1600 Dell U3011, and then struggled with finding a GPU that could handle it until I picked up the first Titan in 2013, but it was fine as I was mostly playing older titles I had missed in my 5-year absence :p

I found the U3011 to be fine at the time, but in retrospect I don't think I would be happy with anything lacking VRR (FreeSync/G-Sync) these days.
 
Work was my primary deciding factor: simple 800x600 made my coding work easy with no headaches, and it was a thing of beauty at the time, around 2003, but built for gaming it was not. I made it work, but it was certainly not optimal, to say the least.
 
I still have my LG 7th-gen plasma, and it is a goddamned rock star!
Yeah, it doesn't have 4K, and because plasma's timings are different than LCD's, devices with cheap or crappy HDMI implementations don't work right on it (the image shifts), but for most content (not high-FPS gaming) it still looks great.
See? ^^^ Zarathustra[H] KarateBob
 
That's because LCD/LED displays are more accurate, and thus you see the detail too well. Most emulators have filters on them to simulate that CRT look, and then some.

The artwork was designed specifically for that type of rendering. So one could argue that the true artist's intent is more accurately displayed on old analogue screens.

I can get behind that, but modern artwork designed for LCD grids is not best presented on those old CRTs.
 
Shaders can mimic the effect if need be. Me personally, I'd rather use something that enhances the image.



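For anyone curious what those CRT-style filters boil down to, here's a toy version in plain C rather than an actual emulator shader (the scale factor and brightness numbers are arbitrary): nearest-neighbour upscale a tiny grayscale framebuffer and dim one out of every three output rows to fake the scanline gaps. Real CRT shaders also model the phosphor mask, bloom and gamma, but this is the basic trick.

#include <stdint.h>
#include <stdio.h>

#define SRC_W 8
#define SRC_H 4
#define SCALE 3

int main(void)
{
    /* A tiny "framebuffer" of 8-bit grayscale pixels (a checkerboard). */
    uint8_t src[SRC_H][SRC_W];
    for (int y = 0; y < SRC_H; y++)
        for (int x = 0; x < SRC_W; x++)
            src[y][x] = (uint8_t)((x + y) % 2 ? 220 : 40);

    /* Upscale 3x and dim every third output row to imitate scanlines,
       then dump the result as an ASCII PGM image to stdout. */
    printf("P2\n%d %d\n255\n", SRC_W * SCALE, SRC_H * SCALE);
    for (int y = 0; y < SRC_H * SCALE; y++) {
        for (int x = 0; x < SRC_W * SCALE; x++) {
            int v = src[y / SCALE][x / SCALE];
            if (y % SCALE == SCALE - 1)
                v = v * 45 / 100;   /* "scanline" row at 45% brightness */
            printf("%d ", v);
        }
        putchar('\n');
    }
    return 0;
}

Redirect the output to a .pgm file and open it in an image viewer to see the effect; emulator shaders do the same kind of per-pixel math on the GPU.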
 
Actually, emulation does look poorer on LCD/LED monitors compared to the original hardware on CRT TVs back in the day.

Yes, that is why you play with a particular emulator/front end. I use GameEx for my cabinet because it makes it proper. No real old-skool gamer would choose garbage pixel-accurate bullshit for their emulator. Unless you're young and don't know better, or just young and stupidz.
 
Shaders can mimic the effect if need be. Me personally, I'd rather use something that enhances the image.




Was going to say. The sharp appearance of pixels on emulators is an odd thing to complain about when literally every emulator out there gives you all kinds of options to blur it up as much as you want, and even simulate the look of a CRT. I also prefer upscaling the image for a cleaner look to making it harder on the eyes.
 