What's the big deal with the Retina display on the iPad?

I agree. It's a pretty impressive screen without a doubt, but I think 1920x1440 would have been a more useful resolution for media consumption, which if I'm not mistaken is the primary purpose of the display.

Seems they felt the need to bump it up a bit so they could have "higher than HD" resolution and the highest on the market, as there are already several Android tablets with 1920x1080 screens.

These 1080p screens will do a better job of displaying (widely available) 1080p video than the Retina display because they won't have to scale or letterbox it, but the target audience for the iPad tends not to give a crap about things like that. The new iPad will primarily be compared to the old iPad, rather than its current competition.
 
Seems they felt the need to bump it up a bit so they could have "higher than HD" resolution and the highest on the market, as there are already several Android tablets with 1920x1080 screens.

What Android 1920x1080 tablets can you buy today? None.

It is pretty obvious why it is 2048x1536. Divide by 2 for a hint.

These 1080p screens will do a better job of displaying (widely available) 1080p video.

For a device that you use primarily at home, where you likely have an HDTV, optimizing the aspect ratio for video shouldn't be the primary concern. 16:9 is a poor tablet aspect ratio since in portrait it is really too narrow. 4:3 is better for web browsing and reading, which are likely the more common activities.
 
What Android 1920x1080 tablets can you buy today? None.

It is pretty obvious why it is 2048x1536. Divide by 2 for a hint.



For a device that you use primarily at home, where you likely have an HDTV, optimizing the aspect ratio for video shouldn't be the primary concern. 16:9 is a poor tablet aspect ratio since in portrait it is really too narrow. 4:3 is better for web browsing and reading, which are likely the more common activities.

For both media viewing and web surfing I liked 16:9 over the 4:3 of the iPad; I guess it's just personal opinion.
 
In all fairness, I suppose 2048x1536 is the standard for 2K resolution. And no...there are no 1080p Android tablets yet.
 
True, they're all in the pipeline for Q2 (Iconia Tab A700, Transformer Prime, etc.). Basically I was making the point that the resolution isn't anything special. Not that special anyway.

I didn't suggest a 16:9 resolution; I suggested 1920x1440, which is a 4:3 resolution, incidentally the same one I use on my computer monitor. I think plenty of people will use the iPad for HD video, as Apple is touting its abilities in this field. Plenty of their target market will happily sit on a couch in front of a giant HDTV and watch a movie on their iPad.

Bitmap based UIs are stupid and inefficient. Needing to scale the old display by factors of 2 in order to maintain UI consistency is clearly unsustainable. They should be able to scale the UI to any display resolution that is equal to or higher than the current one. Android also needs to work on this, especially with the range of hardware it expects to support.
 
Basically I was making the point that the resolution isn't anything special. Not that special anyway.

It is quite special today, totally outclassing the competition to an extreme degree. Who knows when competing tablets from manufacturers, who sometimes pre-announce and never deliver, will finally deliver something even in the ballpark.

I didn't suggest a 16:9 resolution; I suggested 1920x1440, which is a 4:3 resolution, incidentally the same one I use on my computer monitor. I think plenty of people will use the iPad for HD video, as Apple is touting its abilities in this field. Plenty of their target market will happily sit on a couch in front of a giant HDTV and watch a movie on their iPad.

Why would anyone choose to watch a movie on a tablet while sitting in front of an HDTV? That is just ridiculous. Sure, people will watch movies on tablets, but it will usually be when they can't access a bigger screen.

Also, you seem to be laboring under the naive assumption that video needs to run at native resolution. Video is extremely friendly to scaling. Locking a general-purpose device into a 1:1 relationship to video resolutions would be pointless and asinine.

Bitmap based UIs are stupid and inefficient. Needing to scale the old display by factors of 2 in order to maintain UI consistency is clearly unsustainable.

No, it is a pragmatic solution that Apple has proven works very well. Plus you only need to do this once. Now that they have moved to the point of essentially invisible pixels, there is little incentive to change it again, unless you build a different screen size.
 
Bitmap based UIs are stupid and inefficient.
I'd have to contest the claim that bitmaps are "inefficient". They are the fastest thing to render: rendering vector data requires that the vectors first be rasterized, then drawn. Rendering bitmap data, on the other hand, spares you a fairly lengthy step in the process.

Assuming vector assets are cached, the only performance hit would be in loading times, but load times are already an issue on the iPad.
 
I didn't suggest a 16:9 resolution; I suggested 1920x1440, which is a 4:3 resolution, incidentally the same one I use on my computer monitor. I think plenty of people will use the iPad for HD video, as Apple is touting its abilities in this field. Plenty of their target market will happily sit on a couch in front of a giant HDTV and watch a movie on their iPad.

This would require dropping proper scaling with *everything else* just to get pixel perfect HD video. And while I think plenty of people will watch HD video on their iPad, I think they would rather have perfect scaling with the majority of the UI and live with a bit of scaling in video (which most people wouldn't be able to notice, since video takes to scaling extremely well).
 
I disagree with the assertions that video scales well. It's true that it can scale well if there is a clean pixel ratio between source and destination, like 1280x720 to 1920x1080 (2:3 on both axes; a block of 4 pixels at the source can be represented by a block of 9 pixels at the destination).

Dirty scaling like 1920x1080 to 2048x1152 doesn't have a clean solution, and at least to my eyes this type of scaling always results in a considerable reduction in definition and considerable artifacting. If I were to watch 1080p or 720p video on that display I'd much prefer to have a screen border than a slightly larger version run through a scaler.
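To put numbers on the clean/dirty distinction, here's a quick sketch of just the ratio arithmetic (illustrative only):

```python
from fractions import Fraction

# "Clean" scale: 1280x720 -> 1920x1080 is exactly 3:2 on both axes,
# so every 2x2 block of source pixels maps onto a 3x3 block of the target.
print(Fraction(1920, 1280), Fraction(1080, 720))   # 3/2, 3/2

# "Dirty" scale: 1920x1080 -> 2048x1152 is 16:15 on both axes,
# so source and target pixel grids only line up every 15 source pixels,
# and everything in between has to be interpolated.
print(Fraction(2048, 1920), Fraction(1152, 1080))  # 16/15, 16/15
```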

As for Apple's decision not to use a vector-based UI, I think it's got a lot more to do with ease of development and making it look nice (because a fully bitmapped UI will win the pretty contest for sure) than it has to do with performance. If the GPU on board the iPad is anything near as powerful as claimed, there shouldn't be any meaningful performance hit rasterizing vector UI elements. Any 3D game is going to require this to be done on a much larger scale than a UI will. I see the current system running into its limits because of the insane resolution of the new display, and the unnecessarily bloated software that's coming down the pipe as a result.
 
I disagree with the assertions that video scales well. It's true that it can scale well if there is a clean pixel ratio between source and destination, like 1280x720 to 1920x1080 (2:3 on both axes; a block of 4 pixels at the source can be represented by a block of 9 pixels at the destination).

Dirty scaling like 1920x1080 to 2048x1152 doesn't have a clean solution, and at least to my eyes this type of scaling always results in a considerable reduction in definition and considerable artifacting. If I were to watch 1080p or 720p video on that display I'd much prefer to have a screen border than a slightly larger version run through a scaler.

This is total nonsense.

Take a video player that resizes on window size, grab the corner, and go nuts. No artifacts and no noticeable difference in quality between small changes that produce your so-called "dirty scaling".

Video scaling is trivial and was perfected years ago.
 
I know exactly what dirty vs clean scaling is, and dirty scaling is far from "perfected". It can be made pretty good if you can throw a ton of processing power at it, but that's not a great option on a portable device. I watch a lot of digital video on a very accurate display, and dirty scaling does produce artifacts and a noticeable difference in quality. There are some pretty good software scalers now (Lanczos3), but they use a ton of CPU cycles to get the job done. The simple bilinear scaling that most players use by default is nowhere near as accurate.
 
I know exactly what dirty vs clean scaling is

You should, since you made it up. ;)

It can be made pretty good if you can throw a ton of processing power at it, but that's not a great option on a portable device. I watch a lot of digital video on a very accurate display, and dirty scaling does produce artifacts and a noticeable difference in quality. There are some pretty good software scalers now (Lanczos3), but they use a ton of CPU cycles to get the job done. The simple bilinear scaling that most players use by default is nowhere near as accurate.

Again, total nonsense. The main scaling algorithm in the industry is bicubic, and it produces smooth, clean, artifact-free scaling with little computational overhead. Lanczos and other more computationally expensive scaling algorithms are primarily used by tweakers trying to create an HD effect in low-res video (like 352p) upscaled to 1080p. It has nothing to do with needing perfect scaling ratios for bicubic.

There is zero need for exact 2.0 or 1.5 video scaling factors like you state.

Scale some 1920x1080 video to 1280x720 (your magic scaling factor) and 1312x738 (oddball scaling factor) and play them both back on a 1080p device.

There will be no visually detectable difference at all. By your statements, the 738p video should have artifacts because it is "dirty" (sic) scaling. This is total nonsense.
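For anyone who actually wants to run that test, here's a rough Python/Pillow sketch (the file name is a placeholder; any 1920x1080 frame grab will do, and Pillow >= 9.1 is assumed for Image.Resampling):

```python
# Rough sketch of the suggested test: downscale a 1080p frame to the "magic"
# 1280x720 and to an oddball 1312x738, then upscale both back to 1920x1080
# so they can be compared side by side on a 1080p display.
from PIL import Image

frame = Image.open("frame_1080p.png")  # hypothetical 1920x1080 frame grab

for w, h in [(1280, 720), (1312, 738)]:
    small = frame.resize((w, h), Image.Resampling.BICUBIC)
    back = small.resize((1920, 1080), Image.Resampling.BICUBIC)
    back.save(f"rescaled_{h}p.png")
```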

I run an HTPC and capture 1080p video daily, scaling it and compressing it for long-term archiving. I have done extensive comparisons of scaling factors and resize algorithms.

Nothing you are saying makes any sense. 1080p video will look amazing on an iPad Retina screen. This is just a silly argument against the Retina screen.

Only an Apple advancement brings out this much sour grapes on [H]. I can't imagine anywhere else where I would see so many "oh no, it has too much resolution" whining. :rolleyes:
 
I know exactly what dirty vs clean scaling is, and dirty scaling is far from "perfected". It can be made pretty good if you can throw a ton of processing power at it, but that's not a great option on a portable device. I watch a lot of digital video on a very accurate display, and dirty scaling does produce artifacts and a noticeable difference in quality. There are some pretty good software scalers now (Lanczos3), but they use a ton of CPU cycles to get the job done. The simple bilinear scaling that most players use by default is nowhere near as accurate.

Yep, to me this is a huge problem. You can do a quick experiment to see this issue firsthand.

1. Take a screenshot of your display if it's 1920x#### ("print screen" for Windows users) or just download one off the web.
2. Paste the image into an image editor like Photoshop or Paint.
3. Resize the image horizontally from 1920 to 2048 pixels with the aspect ratio locked (a scripted version of this experiment is sketched below).

The first thing you'll notice is that everything is softened by the "dirty" scaling. For images without distinct sharp objects, the softening isn't a huge issue since you won't notice it. However, sharp objects (e.g. lines, text, etc.) lose a lot of image quality from the softening.

One possible solution is for the iPad video player to allow 1:1 pixel mapping and crop out the edges. This would actually result in the best picture quality at the cost of screen size.
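A scripted version of the experiment above, for anyone who'd rather not do it by hand (the file name is a placeholder; assumes Pillow >= 9.1):

```python
# Upscale a 1920-wide screenshot to 2048 wide with the aspect ratio locked,
# using bicubic, then compare it against the original.
from PIL import Image

shot = Image.open("screenshot_1920.png")   # hypothetical file name
w, h = shot.size                           # e.g. 1920 x whatever your height is
upscaled = shot.resize((2048, round(h * 2048 / w)), Image.Resampling.BICUBIC)
upscaled.save("screenshot_2048_bicubic.png")
# Text and other hard edges are where any softening will be most obvious.
```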
 
The first thing you'll notice is that everything is softened by the "dirty" scaling. For images without distinct sharp objects, the softening isn't a huge issue since you won't notice it. However, sharp objects (e.g. lines, text, etc.) lose a lot of image quality from the softening.

Which is exactly why this is not an issue. We are talking about video, not computer interfaces with text and lines.

Do the test I suggested above, on video, to show that magic scaling factors have absolutely nothing to do with this.

In fact, good job explaining exactly why Apple did a perfect double on the important element, the computer interface, while using scaling on the part that handles it best: video.

It would be moronic to do it the other way around: worry about pixel perfection on the video, then do offset scaling on the interface.
 
You should, since you made it up. ;)
Again, total nonsense. The main scaling algorithm in the industry is bicubic, and it produces smooth, clean, artifact-free scaling with little computational overhead.

Uhhh, there are artifacts with bicubic scaling, so you're clearly wrong there. Bicubic results in haloing of edges and other sharp details.

Again, you can experience these issues in the experiment that I proposed above, since most image editors default to bicubic scaling.
 
Uhhh, there are artifacts with bicubic scaling, so you're clearly wrong there. Bicubic results in haloing of edges and other sharp details.

Again, you can experience these issues in the experiment that I proposed above, since most image editors default to bicubic scaling.

Bicubic is tunable. The neutral tuning used as the main default for video does not produce halos in VIDEO.

Your experiment is irrelevant, because you are talking about an interface screenshot, not video. As I said, your "example" supports my case as to why Apple went for the perfect double on the interface and left video to perfectly adequate (for video) bicubic scaling.

Do you think it makes more sense to do bicubic scaling on the interface, and then perfect whole number scaling on the video? That would be exactly backwards. Apple made the smart decision here.
 
Bicubic is tunable. The neutral tuning used as the main default for video does not produce halos in VIDEO.

Your experiment is irrelevant, because you are talking about an interface screenshot, not video. As I said, your "example" supports my case as to why Apple went for the perfect double on the interface and left video to perfectly adequate (for video) bicubic scaling.

Do you think it makes more sense to do bicubic scaling on the interface, and then perfect whole number scaling on the video? That would be exactly backwards. Apple made the smart decision here.

I don't really understand your angle. The "haloing" is always there, since that is how the bicubic interpolation algorithm works for upscaling. You just might not be able to notice the problem since most images don't have sharp edges. (There are other cases where haloing is very noticeable, so it doesn't just affect sharp edges; however, many are shape-dependent and thus much rarer.)

You can record a perfect line or objects with clear sharp edges using a video camera, and that perfect line and the other sharp edges you recorded are going to experience very noticeable haloing artifacts with bicubic upscaling.

I recommended using the desktop UI to highlight the issue since the desktop image is at the extreme end of images with sharp edges.

It really boils down to your video source. Sure, for probably 80% of image content, bicubic upscaling will result in few noticeable artifacts. On one extreme you have video of a person recording their desktop session, where the image is perfectly focused with clearly defined UI elements. On the other hand you can have a completely out-of-focus video where you just see a blur of lights and objects.

Just to clear things up: bicubic interpolation is NOT artifact-free (fact, you're just wrong here). Bicubic is a very, very good non-adaptive scaler for most cases where you don't have sharp edges, and it is currently computationally feasible for video.
 
Show me an example from a movie video frame. Theoretical, non-video, worst-case examples serve no real practical purpose. It is like feeding blue/red test patterns to Bayer cameras: they show the theoretical breakdown, but that has zero practical application in the real world.

Show a real practical example. Surely you can come up with a real frame of video from a movie to demonstrate these obvious halos?
 
I consider myself quite the video snob and have a 1080P projection theater setup to watch stuff. I always find myself bitching about black levels and poor audio instead of barely noticeable artifacts caused by scaling. I am pretty content with any video 720P and higher, so normally things are getting scaled 720P -> 1080P and I rarely complain about the scaling. Guess everyone has different stuff they fixate on.
 
Uhhh, there are artifacts with bicubic scaling, so you're clearly wrong there. Bicubic results in haloing of edges and other sharp details.
Not so much in applications dealing with video.

I think this is a rather pointless debate anyway. What testing has anyone performed with a third-generation iPad which confirms that any theoretical artefacting or image softness is visually apparent, without magnification, due to the upscaling of 1920x* video to 2048x*? Any? None at all?
 
Dirty scaling like 1920x1080 to 2048x1152 doesn't have a clean solution, and at least to my eyes this type of scaling always results in a considerable reduction in definition and considerable artifacting.


Here is a crop of a 1080p video frame: both the original, and scaled from 1920 to 2048 with bicubic. I took pains to find something stationary and in focus; most video is soft from focus/motion issues. In this case I found an interesting pattern on the desk, some screen credits (not actual video), and a chair edge that is likely clipping from the reflection. This is about as challenging as real video gets.

I see no artifacts at all, and bear in mind this is leaning in to my 24" screen, closer than I would even use a tablet, on a still image looking for them.

It is utterly ridiculous to suggest there is some issue scaling video from 1920 to 2048, especially for a 10" screen.

http://img855.imageshack.us/img855/6456/vidframe.png
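If anyone wants to reproduce this kind of comparison themselves, it goes roughly like this (my file name and crop box are placeholders, not necessarily how the linked image was made; assumes Pillow >= 9.1):

```python
# Scale a 1080p frame to 2048x1152 with bicubic, then crop the same region
# out of the original and the scaled version for 1:1 inspection.
from PIL import Image

frame = Image.open("frame_1080p.png")                    # hypothetical frame grab
scaled = frame.resize((2048, 1152), Image.Resampling.BICUBIC)

box = (600, 400, 1100, 700)                              # arbitrary crop (left, top, right, bottom)
frame.crop(box).save("crop_original.png")
# Scale the crop box by 2048/1920 so both crops show the same content.
sx = 2048 / 1920
scaled.crop(tuple(round(v * sx) for v in box)).save("crop_scaled.png")
```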

This is really just people who don't really have a clue, making up silly things to moan about.
 
First off, screen size shouldn't be that relevant, as the tablet screen will fill a similar portion of the viewer's field of vision as most HDTVs will at normal viewing distance. Give or take.

Second, I'm not sure what you mean by no artifacts at all; even on quick inspection I see a number of visual discrepancies between those images. The original doesn't in fact clip on the reflection you mentioned, but the scaled one most definitely does. This is a known effect of bicubic interpolation. The gradient in the chair back is much blotchier in the scaled version; it's not perfect in the original, probably due to compression, but the scaling seems to amplify this loss. In the text I see more haloing than in the original, particularly where the right-hand V of the W dips into the darker grey in front of whatever object is on the table. Again, there is some haloing in the original, maybe due to lighting, sensor error and/or compression, but it's amplified by scaling. Again, this is a known effect of bicubic scaling.

I could probably spend more time picking it apart, but those three things caught my eye almost immediately on my 24" CRT. I think I have a little more than a clue, and very good eyesight. It's mathematically impossible to arbitrarily scale video without loss; it will look different afterward no matter what you do to it, and it will not be for the better. I realize some people, even most people, either won't notice or won't care, but to say that there is no issue is simply incorrect.
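To put rough numbers on the field-of-vision point above, here's a back-of-the-envelope sketch (the sizes and viewing distances are my own assumptions, not measurements):

```python
# Horizontal field of view subtended by a display: 2 * atan(width / (2 * distance)).
import math

def fov_deg(width_in, distance_in):
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

# 9.7" 4:3 iPad in landscape is ~7.76" wide; assume ~16" viewing distance.
print(fov_deg(7.76, 16))        # ~27 degrees
# A 40" 16:9 HDTV is ~34.9" wide; assume ~7 feet viewing distance.
print(fov_deg(34.9, 84))        # ~23 degrees
```

So under those assumptions the tablet actually fills a slightly larger slice of your view than the TV does.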
 
First off, screen size shouldn't be that relevant, as the tablet screen will fill a similar portion of the viewer's field of vision as most HDTVs will at normal viewing distance. Give or take.

Second, I'm not sure what you mean by no artifacts at all; even on quick inspection I see a number of visual discrepancies between those images. The original doesn't in fact clip on the reflection you mentioned, but the scaled one most definitely does. This is a known effect of bicubic interpolation. The gradient in the chair back is much blotchier in the scaled version; it's not perfect in the original, probably due to compression, but the scaling seems to amplify this loss. In the text I see more haloing than in the original, particularly where the right-hand V of the W dips into the darker grey in front of whatever object is on the table. Again, there is some haloing in the original, maybe due to lighting, sensor error and/or compression, but it's amplified by scaling. Again, this is a known effect of bicubic scaling.

I could probably spend more time picking it apart, but those three things caught my eye almost immediately on my 24" CRT. I think I have a little more than a clue, and very good eyesight. It's mathematically impossible to arbitrarily scale video without loss; it will look different afterward no matter what you do to it, and it will not be for the better. I realize some people, even most people, either won't notice or won't care, but to say that there is no issue is simply incorrect.

This is all I can think of...

[attached image: Shopped.jpg]
 
I looked at the images on an actual third-generation iPad, at approximately two times the ordinary scale, and the only meaningful difference I could see is that the lower image has greater scale. There is nothing else notable.

Played back at actual pixel scale on the iPad, at 264 pixels per inch, at 30 frames per second, I don't think the odds of you being able to discern between up-sampled 1920x* video and native 2048x* video in a blind test would be greater than 50%, unless the interpolation is extremely poor. It's all well and good to make claims that you can see differences in a crop, on a 24" display, at ~100 ppi, at greater than two times scale, but that has absolutely no bearing on what you would see on an iPad. Nada. Zip.
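The numbers behind that are easy to sanity-check. A quick sketch, using the usual ~1 arcminute rule of thumb for 20/20 visual acuity (an assumption for illustration, not a measurement of anyone's eyes):

```python
# Angular size of one pixel at 264 ppi versus the ~1 arcminute resolving
# limit commonly quoted for 20/20 vision.
import math

ppi = 264
for distance_in in (15, 20):
    pixel_arcmin = math.degrees(math.atan((1 / ppi) / distance_in)) * 60
    print(distance_in, round(pixel_arcmin, 2))   # ~0.87' at 15", ~0.65' at 20"
```

At normal tablet viewing distances a single Retina pixel is already at or below that acuity limit.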
 
So basically what you're saying is that the iPad display is sooooo great, that a lower quality image displayed on it will look the same as if it weren't of a lower quality. Impressive stuff.

As for you not seeing the same errors I see: the iPad display has the highest pixel density currently available. That does not make it the best display in the world. My CRT smacks the hell out of it in terms of contrast ratio and a bunch of other things that make it a lot more accurate in some respects.
 
The original doesn't in fact clip on the reflection you mentioned, but the scaled one most definitely does. This is a known effect of bicubic interpolation. The gradient in the chair back is much blotchier in the scaled version; it's not perfect in the original, probably due to compression, but the scaling seems to amplify this loss. In the text I see more haloing than in the original, particularly where the right-hand V of the W dips into the darker grey in front of whatever object is on the table. Again,

You are seriously off your rocker (or meds).
 
You are seeing things that aren't there. I just re-examined the areas you claim show all these differences, at 10" from a 40" LCD HDTV (a lot more detail-revealing than your CRT).

And I took readings of the actual pixels in the reflection on the chair. There is no clipping. That is objective fact, so this is more BS you are spewing.

I can only conclude that you are either bonkers or lying through your teeth. You may just be deluding yourself because of your claims of all these artifacts, so now you are convinced you see them. But there isn't anything present in the scaled version that isn't in the original.
 
Your LCD TV has a better contrast ratio than my recently calibrated professional grade CRT monitor?

I looked at the reflected area just now at 400% in GIMP, and there is in fact a large area of pixels of the same color present in the scaled image that is not present in the original. I'm not sure how you define clipping, but that meets my definition. It's a bitmap; this isn't a matter of opinion. Again, it's mathematics. The area of like pixels is much larger in the bottom one than in the top one; the original has some, but that is probably due to some DNR on the source and/or compression, and it's very minor. Not so in the bottom one, where this area is actually several pixels wide. If they both look the same to you, it's your display or your eyes. Once again I'm not expressing my opinion here; it's right there in the map of the bits. Sorry for being so fact-based. I know it pains you people.
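If anyone else wants to check rather than take either of our words for it, something along these lines works (the file names and region are placeholders, not the exact crop in question):

```python
# Count near-maximum pixels in the same region of the original crop and the
# bicubic-scaled crop. The region below is a placeholder, not the precise
# area of the chair reflection being argued about.
from PIL import Image
import numpy as np

region = (100, 50, 200, 150)                 # placeholder (left, top, right, bottom)
for name in ("crop_original.png", "crop_scaled.png"):   # hypothetical file names
    gray = np.asarray(Image.open(name).convert("L"))
    patch = gray[region[1]:region[3], region[0]:region[2]]
    print(name, "pixels at >= 254:", int(np.count_nonzero(patch >= 254)))
```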
 
So basically what you're saying is that the iPad display is sooooo great, that a lower quality image displayed on it will look the same as if it weren't of a lower quality. Impressive stuff.
Basically what I'm saying is exactly what I said. There's no need to infer anything from what I said beyond what the words convey.

As for you not seeing the same errors I see, the iPad display has the highest pixel density currently available. That does not make it the best display in the world.
I did not say it was the best display in the world. I will only say that it's the display the iPad has. If you are to argue that visual artifacts resulting from interpolation of video are a con of the iPad's 2048x1536 display, then you must argue that such artifacts are actually perceptible on the iPad's display. Do you disagree or do you not?

My CRT smacks the hell out of it in terms of...
That's terrific. Does the iPad have an equivalent CRT display?
 
Your LCD TV has a better contrast ratio than my recently calibrated professional grade CRT monitor?

Contrast ratio doesn't make up for the inherent softness of CRT. Magnification and sharpness were the point of checking it on the 40" LCD. I also didn't see what you are ranting about on my 24" calibrated NEC 2490.

I looked at the reflected area just now at 400% in GIMP, and there is in fact a large area of pixels of the same color present in the scaled image that is not present in the original. I'm not sure how you define clipping, but that meets my definition. It's a bitmap; this isn't a matter of opinion. Again, it's mathematics.

Clipping means pixels that maxed out because they went over the limit. There are none present. Redefining the term won't cover up your previous BS claiming there was clipping.

FWIW I don't have an iPad; I just do a lot of video compression and scaling. You plainly don't know what you are talking about when it comes to video scaling, and now you are doubling down on ridiculous to cover your previous nonsense.

What you are doing is the equivalent of seeing Jesus in burnt toast, because that is what you want to see (or outright lying, hard for me to tell delusions from lies).
 
Those pixels are on some kind of limit; how else would you end up with a bunch of adjacent pixels being the same color? It isn't a cartoon. Here is a quick quote from Wikipedia. I will dig deeper if you insist.

"The bicubic algorithm is frequently used for scaling images and video for display. It preserves fine detail better than the common bilinear algorithm.

However, due to the negative lobes on the kernel, it causes overshoot (haloing). This can cause clipping, and is an artifact (see also ringing artifacts), but it increases acutance (apparent sharpness), and can be desirable."

So, other people share my mental condition. Who'da thunk?

I would stand by the fact that arbitrarily scaled video will not look as good on ANY display as natively displayed video. I do not see anything special about pixel density that would change this. It will have less detail intact, and the more accurate the display the more apparent this will be. An iPad at 24" from my eyes will have the same apparent size and pixel density as a larger display further away from them. Mathematics again.

I'm not speaking of any specific displays here, because the same rules apply regardless. A smaller screen will be viewed closer to the eyes, because it is smaller, and therefore needs a higher pixel density to attain the same resolution. I'm not picking on the iPad's display specifically; I've seen one and it's an awesome screen for a tablet, but the pitfalls of video scaling are well documented and do exist outside of my imagination.
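The overshoot the Wikipedia passage describes is easy to reproduce numerically. A minimal sketch using the common Keys cubic kernel with a = -0.5 on a hard 0-to-255 edge (this illustrates the kernel's behaviour, not any particular player's scaler):

```python
# Keys bicubic kernel (a = -0.5) evaluated across a hard 0 -> 255 step.
# Interpolated values just past the edge overshoot 255 and undershoot 0 on
# the dark side; clamping them back to 0..255 is exactly the clipping and
# halo described above.
def keys(s, a=-0.5):
    s = abs(s)
    if s <= 1:
        return (a + 2) * s**3 - (a + 3) * s**2 + 1
    if s < 2:
        return a * s**3 - 5 * a * s**2 + 8 * a * s - 4 * a
    return 0.0

samples = [0, 0, 0, 255, 255, 255]          # a hard edge between index 2 and 3

def interp(x):
    i = int(x)
    return sum(samples[i + k] * keys(x - (i + k)) for k in (-1, 0, 1, 2))

print(interp(1.75))   # ~ -17.9  -> clamped to 0   (dark-side halo)
print(interp(3.25))   # ~ 272.9  -> clamped to 255 (bright-side halo)
```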
 
Yes, in theory it can cause artifacts: if it is tuned for higher sharpness, if you are doing a large upscale, and if you have very high-contrast source material with sharp edges.

In practice none of that really applies to typical tuning on a very small upscale (like this one) of video sources.

Your condition is seeing it, even when it isn't there, because you expect to.
 
I saw it, and then to confirm it, I opened the image in an image editor, zoomed it to 400%, and checked the color values of the pixels in question. Numbers tend not to lie. Look for yourself; it has nothing to do with what display it's on, or which eyes are looking at it. You understand what a bitmapped image is, right?
 
I would stand by the fact that arbitrarily scaled video will not look as good on ANY display as natively displayed video.
"Not look as good" indicates that the artifacts are perceptible. You argue that the iPad should have a 1920x1440 display despite the fact that you have no idea whatever these scaling artifacts are perceptible, on the iPad's display, given normal video viewing conditions. That means video being played back at 24/25/30 frames per second, with the display 15 to 20 inches away from your eyes.

It's all well and good to zoom in by a factor of four on a worst-case frame grab and say "look, there's a fucking discolored pixel there!", but again, that says absolutely nothing about what you will actually see on an iPad with video being played back at a normal frame rate at full scale. Nothing at all.
 
I saw it, and then to confirm it, I opened the image in an image editor, zoomed it to 400% and checked the color values of the pixels in question. Numbers tend not to lie. Look for yourself, it has nothing to do with what display it's on, or which eyes are looking at it. You understand what a bitmapped image is right?

I did that, as I reported earlier, and it only confirmed your lies.

There was no clipping, there were no significant artifacts even at 400%.

Naturally there are differences, but there is nothing significant.

Small amounts of bicubic expansion on a video source produce excellent outcomes. This tangent is a complete red herring in this topic, but I will continue because scaling has been an interest of mine since I bought my first digital camera in the 1990s. I have experimented heavily in this area.

It is an ignorant falsehood to claim small bicubic expansions will create visible artifacts in video. It quite simply will not.
 
"Not look as good" indicates that the artifacts are perceptible. You argue that the iPad should have a 1920x1440 display despite the fact that you have no idea whatever these scaling artifacts are perceptible, on the iPad's display, given normal video viewing conditions. That means video being played back at 24/25/30 frames per second, with the display 15 to 20 inches away from your eyes.

It's all well and good to zoom in by a factor of four on a worst-case frame grab and say "look, there's a fucking discolored pixel there!", but again, that says absolutely nothing about what you will actually see on an iPad with video being played back at a normal frame rate at full scale. Nothing at all.

Well, as I've never seen a display that didn't show the effects of this type of scaling, and as I explained above I do not see this particular display as being unique in this regard, then apparently I do have an idea that those scaling artifacts will be visible to me.

Just because video is moving doesn't make the artifacts disappear. I'm pretty sure similar artifacts are to be found in the frames before and after the one we examined above, and as the video you see is made up of that series of frames, the quality of the individual frames is the ONLY factor determining the final output quality.

There is nothing about this display that changes the well-established rules. Saying I can't perceive the difference on that screen from 15" away is like saying I can't perceive the same difference on my 34" TV from, say, 42-48" away (about where I sit to watch a movie), and I can tell you for certain that I can.

I'm running two identical streams right now, a BBC Planet Earth Blu-ray dump: one on my PC monitor at 2048x1152 at 72 Hz and the other on my TV at 1920x1080 at 60 Hz. Normally the PC monitor would outclass the TV by a pretty obvious margin in terms of clarity and smoothness (partly due to running at the correct refresh rate, but for lots of other reasons), but with even that tiny bit of scaling (using bicubic in XBMC) it does lose its edge. It doesn't look bad by any stretch, but it just doesn't look as tight: some things are too fuzzy, some edges appear kind of false and over-enhanced, and the finer motion details seem to blend into the background instead of catching the eye.

Without a side-by-side it would be very hard to fault the 2048 picture; it looks wonderful. At first glance it might even look better, like the picture you see in big-box stores on the TVs that are set up to look nice in the bright showroom but are actually quite inaccurate. This may not be to that extent, but the principle is the same. It may look just fine, even good, but that doesn't mean it's an accurate representation of the original.
 
I did that, as I reported earlier, and it only confirmed your lies.

There was no clipping, there were no significant artifacts even at 400%.

Naturally there are differences, but there is nothing significant.

Small amounts of bicubic expansion on a video source produce excellent outcomes. This tangent is a complete red herring in this topic, but I will continue because scaling has been an interest of mine since I bought my first digital camera in the 1990s. I have experimented heavily in this area.

It is an ignorant falsehood to claim small bicubic expansions will create visible artifacts in video. It quite simply will not.

You're calling me a liar without any detailed contradiction of what I stated. You fail to elaborate on your definition of clipping versus mine. You know nothing of my background in this field, yet you feel your "heavy experimentation" puts you in a position to attempt insulting my intelligence and knowledge.

You, sir, meet enough of the necessary criteria for me to comfortably assume that you are in fact a douchebag of low to mid calibre. You did not win this argument; you just ended it, because you've shown yourself incapable of productive discussion.
 