What is SED?

ar2005 said:
Will SED give off radiation like CRTs?
Will SED have sharp text/graphics like LCDs?
When will they release the first SED, and how much is it going to cost initially?

Radiation should still be there, because you've still got electrons being emitted. Not sure about sharpness. It might be as sharp as LCD, since you've got many electron beams now instead of one. But then they might get scattered around like in a CRT and make things blurry. We'll see.
According to one of the articles, the first monitors should appear later in 2006, maybe 2007. But to me that sounds iffy; they haven't even released a mass-production TV to the stores yet.

__________

I agree with Hurin. Look how much bump-mapping and pixel shading have done. Even at 1024x768 they make a huge difference. Or water effects. Sometimes in Far Cry I'm still amazed and just look at the water for a while because it's so pretty.
Now we've got HDR. Resolution isn't everything; at 1600 or 2048 it should be more than enough, and the focus should instead be on effects.

I think the focus will also be on polygon count, as no bump-mapping or other simulation of 3D can be as true as a real 3D model.
 
XxDaRkReAp3rxXdOtCoM said:
OLED > all :D
Agreed, but only in the small and portable category (cellphones, PDAs, notebooks, etc.). SEDs will rule in the (stationary) desktop arena if everything turns out to be as good as promised. :cool: Personally, I'm looking forward to seeing what OLED will offer in terms of battery savings...it's about damn time someone did something to address energy-wasting displays. :)
 
About the radiation: you're not going to have very much. With a CRT, you're firing a very strong electron beam at a very small target (a pixel) for a laughably short time in terms of duty cycle.

Think of it like a laser light show. A five-milliwatt laser pen will put a bright spot on the wall, but it isn't going to do much if you're painting a raster on a 30-foot dome by scanning it. Use a 3-watt laser like the older laser show systems commonly used, and you'll have a show that covers that 30-foot dome; but leave it in one place, and not only will you burn a hole into your surface, you'll likely have scatter radiation messing things up as well.

To put it sloppily: to get the same luminance out of an SED display at 1600x1200, with roughly 5.8 million subpixel segments (1600 x 1200 x 3), you only really need about one 5.8-millionth of the energy for each emitter. But you need 5.8 million emitters.
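Putting rough numbers on that (a quick Python sketch; the 1 W total beam figure is just an assumed placeholder, not a real spec):

[code]
# Back-of-the-envelope: same total energy on the phosphors, split across
# N emitters instead of one gun. The 1 W figure is an assumed placeholder.
TOTAL_BEAM_POWER_W = 1.0

sed_emitters = 1600 * 1200 * 3   # one emitter per subpixel = 5,760,000
per_emitter_w = TOTAL_BEAM_POWER_W / sed_emitters

print(f"Emitters: {sed_emitters:,}")
print(f"Power per emitter: {per_emitter_w:.2e} W (vs {TOTAL_BEAM_POWER_W} W for one CRT gun)")
[/code]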

Other stuff comes into play, though. The figure above is about the energy actually hitting the phosphors. The beam itself carries more power than that, because the aperture grille or shadow mask blocks part of it on the way.

Alpha particles can't get through paper (or your skin).
Beta particles can't get through aluminum foil (or a really generous coating of zinc oxide).
Gamma rays (and X-rays; the only difference is the source) can't get through lead.

The front glass of a CRT is partly lead crystal, to block the X-rays that may be emitted under extremely remote circumstances; the electron beam is right on the edge of having enough energy to produce them. SED doesn't really present this problem. You're not likely to need much more than alpha-level radiation.

The point (finally, after a 24oz Rockstar): you're not going to have much radiation leaking out of those phosphors, because:
1. the emitters don't have to throw as far;
2. they don't need the massive punch CRTs require;
3. they spread the charge out over a very large area, emitting persistently at very low levels rather than being kicked in the nuts at a ~0.00005% duty cycle.
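A quick sanity check on that duty-cycle figure, assuming the single beam spends equal time on every pixel location of a 1600x1200 frame:

[code]
# Sanity check on the ~0.00005% duty-cycle figure: a CRT's one beam
# visits 1600x1200 pixel locations per frame, lighting each only briefly.
pixels = 1600 * 1200                # 1,920,000 locations
duty_cycle = 1 / pixels             # fraction of the frame spent on one pixel

print(f"Per-pixel duty cycle: {duty_cycle:.2e} = {duty_cycle * 100:.5f}%")
# -> 5.21e-07, i.e. about 0.00005%
[/code]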
 
1600x1200, I admit, is not enough for some. But for most, it's still a "holy grail" they're still trying to reach.
 
Hurin said:
Realism isn't going to come from more resolution; it'll come from more realistic 3D effects and engines. We're finally at the point where NVIDIA and ATi are concentrating on realism over resolution, because they've finally reached the point (around 1600x1200) where they have the resolution they need to make things look truly realistic via other means.

I've had a monitor capable of 2048x1536 for years. My eyes detect very little difference between that and 1600x1200 with all other things remaining constant. But I sure noticed the DX9 effects in COD2! Those made a difference, and things remained playable at 1600x1200 with everything turned on. At 2048x1536, it didn't look any better, but became a slide-show.

I could see the need for more resolution if displays suddenly grew in size. But at 1600x1200 (which, I too have been using for years), we've reached a sweet spot where it's time to start concentrating on other visual shortcomings.

H
I'll just say I disagree completely. I can certainly see the pixels on my 20" 1600x1200 LCD (screen-door effect). The smaller the pixels, the smoother and more lifelike you can make the image; surely you can understand this. What I'm saying has nothing to do with gaming, 3D technology or anything like that, just image quality.
 
Well, yeah, of course we all agree that higher res is always better, but I think the aforementioned effects add more quality than just running stuff at higher res does. I mean, imagine these games without pixel shaders, bump mapping, dynamic lighting and the rest, but at high res. They wouldn't be as attractive.
 
M'ichal said:
Radiation should still be there, because you've still got electrons being emitted. Not sure about sharpness. It might be as sharp as LCD, since you've got many electron beams now instead of one. But then they might get scattered around like in a CRT and make things blurry. We'll see.
As MisterDNA points out, the generation of electrons is much more controlled and local. Radiation should be negligible, and if I had to guess, I'd say bleed-over would be minimal.

I'd like to see more resolution so the pixels can finally "vanish". Sheesh, resolutions have hardly increased since LCDs were introduced; development ground to a halt, where it had been moving at a good pace with CRTs. 3D is just one aspect; how about really smooth text?
 
Unknown-One said:
SED is looking like it's going to be awesome, but I'm surprised that the simplest fix for LCD contrast ratios hasn't been applied yet: layering LCD panels one on top of another and using a brighter backlight (or LED array backlight). The second layer of pixels could even be grayscale-only, as color isn't needed for a brightness-reduction layer.

You would get darker blacks due to two layers of pixels blocking light, and brighter whites due to the allowance of a brighter backlight. Using an LED array for the backlight would allow an even greater contrast ratio, since areas of the screen's backlight could be turned completely off.
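A quick back-of-the-envelope on the stacking idea, assuming idealized panels whose contrast ratios simply multiply (made-up numbers, ignoring scatter, alignment, and brightness loss through the extra layer):

[code]
# Idealized stacking math: in the "black" state each panel passes roughly
# 1/CR of the light behind it, so two stacked panels multiply their ratios.
panel_cr = 500         # assumed single-panel contrast ratio (500:1)
backlight_boost = 4    # assumed brighter backlight, 4x the usual

stacked_cr = panel_cr * panel_cr   # ideal case: 250,000:1
print(f"Single panel: {panel_cr}:1")
print(f"Stacked (ideal): {stacked_cr:,}:1, with {backlight_boost}x brighter whites")
[/code]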

I have tested out something similar, aka XBrite from Sony. You can't see anything except your own reflection on the screen under regular lamp light. And if it's under indirect sunlight, well, forget about working.
 
Cost would be a major issue with the stacking, as would using LEDs. It could work, but it's not really feasible for mass production.
 
kleox64 said:
Cost would be a major issue with the stacking, as would using LEDs. It could work, but it's not really feasible for mass production.
I get that stacking would be an issue, but what makes using LEDs as a backlight hard?
 
Maybe not hard, but pricey. There are already LED-lit LCDs out there; Tom's Hardware had an article a few months ago about a NEC one. The 19" was over a grand, if I remember correctly.
 
Yeah. I don't think displays with LED backlights are fundamentally hard or expensive to produce, it's just that they'd have to change their manufacturing process - which requires a sizable investment. This investment into LCDs would be hard to justify when newer technologies like SED, NED and OLED are looming.
 
Araanor said:
Yeah. I don't think displays with LED backlights are fundamentally hard or expensive to produce, it's just that they'd have to change their manufacturing process - which requires a sizable investment. This investment into LCDs would be hard to justify when newer technologies like SED, NED and OLED are looming.
NED? Links?
 
Haha, yeah, exactly: NED? Looks like every week I'm gonna have to learn about a new display technology :p
 
Araanor said:
Yeah. I don't think displays with LED backlights are fundamentally hard or expensive to produce, it's just that they'd have to change their manufacturing process - which requires a sizable investment. This investment into LCDs would be hard to justify when newer technologies like SED, NED and OLED are looming.
The Sony TX series has LED-backlit LCDs; granted, it's a notebook, but still... :)
 
This HDR seems to improve the actual image drastically.
But then it kinda sux that it's still an LCD, so the stupid viewing-angle issues will be back. And how about blurring???
The HDR referred to here isn't a whole new display technology; it's just a new method of backlighting. So yes, all of LCD's other problems will still be here.

Comparing costs, SED should in theory be slightly cheaper to manufacture than LCDs...once you've reached economies of scale and amortized research costs. OTOH, HDR will be roughly twice as expensive as standard LCDs.

The article linked to on HDR is a bit misleading. They cannot have an "infinite" contrast ratio, because of interpixel bleeding. The claimed figure of "200,000:1" is just a claim, and would depend on how well manufacturing can reduce bleeding. In this regard, SED and HDR are equal...but SED wins out big on viewing angles, response time, and color rendition.
 
There are already LED-lit LCDs out there...
Oops, he's talking about LED **array** lit LCDs. HDR, in other words.

SED doesn't really present this problem. You're not likely to need much more than alpha-level radiation...
Err, neither SEDs nor old-school CRTs release alpha radiation. None.

Beta only, and a very tiny amount of gamma. Nothing dangerous...you likely get far more just from walking around your home.
 
Uhh...

Behold...

Current OLED technology. It's only sold through Wal-Mart here in the USA.

[Image: Mobiblu DAH-1500i in blue]

http://www.mobibluamerica.com/dah1500.html#

130 bucks for a removable mass-storage device that's 1 gigabyte, isn't an iPod, and is less than a cubic inch all around. Not to mention, you don't install ANY software or drivers; XP/2000 sees it as a removable storage device.

Cool huh? :cool:

It will store any file type, but will only play MP3, WMA and the digital-rights-management crapola. Other files the Mobiblu will just ignore until you dump them onto your HDD. So, TAKE THAT, iPooP! :p

A few drawbacks... the headphone jack is where the USB cable plugs in :rolleyes:
No charging while listening at work.
Also currently in question is whether or not you can replace the battery once it can no longer hold a full charge. The current battery gives 8 hours of play time, which some people say is too little, while others say the player is physically too small to operate well. Well, it was designed by the Japanese, and Asians aren't exactly known for being the largest people on the planet.

All in all it's a cool device.
It's an Organic Light-Emitting Diode, and it's here and now, not years away. Just thought I'd bring that up.
 
With SED, each phosphor has its own electron gun. So, it could conceivably be firing at all times. Hence, no flicker...

I seriously doubt SEDs will be built with latching electron guns, so you'll still have to refresh the screen. Unlike a conventional CRT, though, you'll be able to strobe a column and update an entire row at once, so the "flicker" effect should still be considerably reduced.

But I guess since each pixel will have its own emitter, the problem of resolution scaling will still be present...
Except that those emitters can be made a lot smaller than the current LCD pixels. SEDs should be easily manufacturable at 300-400 dpi. That works out to roughly a 6400x4800 display on a 20" monitor.
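The math on that checks out for a 4:3 panel (a 20" diagonal is a 3-4-5 triangle, so 16" x 12"):

[code]
# Verifying the "6400x4800 on a 20-inch monitor" figure at 400 dpi.
import math

diagonal_in = 20.0
aspect_w, aspect_h = 4, 3
dpi = 400

diag_units = math.hypot(aspect_w, aspect_h)      # 5.0 for a 4:3 panel
width_in = diagonal_in * aspect_w / diag_units   # 16.0 inches
height_in = diagonal_in * aspect_h / diag_units  # 12.0 inches

print(f"{width_in * dpi:.0f} x {height_in * dpi:.0f}")   # -> 6400 x 4800
[/code]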
 
Holy shit, 6400x4800 on a 20" monitor? SED probably won't be as bad as I thought... Man, I can't wait until they get released :D
 
SED will definitely be a better display than these HDR monitors. Still, don't misunderstand me...I doubt the first ones off the line will be at 6400x4800. They'll get there fast, though. It's a technology far easier to shrink than active-matrix LCD.
 
masher said:
I seriously doubt SEDs will be built with latching electron guns, so you'll still have to refresh the screen. Unlike a conventional CRT, though, you'll be able to strobe a column and update an entire row at once, so the "flicker" effect should still be considerably reduced.

Latching? What?
http://www.behardware.com/articles/593-1/close-encounters-of-the-third-kind-sed.html
"In fact, SED seems to be the natural son of TFT and CRT monitors. It combines the thinness of the first and the qualities of the second and improves them. Like cathode-ray tube TVs, SED technology is based on the collision of electrons and phosphoric monitor to emit light. Still, unlike cathode-ray tubes, there isn’t a single gun for the monitor, but a mini electron gun behind each sub-pixel! 1920 x 1080 x 3 = 6.2 million of guns."

Due to the nature of how SED works, it would only be logical to have all the pixels turned on simultaneously, which would mean there's no "refresh rate". Now, whether there will be a rise and fall time due to changing the voltage on the phosphor to change color is a different question entirely.
 
NED (nano-emissive display) is a technology similar to SED. There hasn't been much buzz about it yet, so presumably it's further off than SED.

http://nanotechweb.org/articles/news/4/5/11/1
http://www.motorola.com/mediacenter/news/detail/0,,5484_5474_23,00.html

UnknownSouljer said:
Now, whether there will be a rise and fall time due to changing the voltage on the phosphor to change color is a different question entirely.
The delay should be negligible with modern phosphors. In fact, SED is usually cited as having sub-1ms response times.
 
UnknownSouljer said:
Latching? What? Due to the nature of how SED works, it would only be logical to have all the pixels turned on simultaneously, which would mean there's no "refresh rate"...
You don't understand. Matrix-addressable elements cannot all be accessed simultaneously. You can address the elements of any single line at once, but to address the entire screen, you have to strobe through the array line by line, firing each line in turn. Make sense so far?

Now, once you move to a new row, the pixels in the previous row are now all "off" and will start to fade. Unless, of course, you build a bistable latching mechanism into each pixel, allowing it to "remember" its on/off state.

The first LCD screens were "passive matrix" displays, with no latching circuitry at the pixels. Due to fading between row strobe refreshes, they were much dimmer than their replacement, "active matrix" displays, which added a transistor at each pixel, allowing them to latch state information. Still with me?

For reasons which I won't explain here, but which should be obvious with a little thought, consumer SED displays will almost certainly be built with a passive design, for at least the first two generations. So they will need to overdrive each pixel and continually refresh the screen. However, since they can address an entire line of pixels at once (instead of the single pixel at a time that current CRTs can), the amount of "flicker" should be several orders of magnitude less.
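Here's a toy model of that passive-vs-latching difference in Python (the row count and per-step fade factor are made up purely for illustration):

[code]
# Toy model: in a passive (nonlatching) matrix, a row is driven only while
# strobed, then fades until the next pass. A latched pixel would hold 1.0.
ROWS = 4                # tiny "screen" so the output stays readable
DECAY_PER_STEP = 0.5    # assumed fade per row-time; purely illustrative

brightness = [0.0] * ROWS
for step in range(ROWS * 2):              # two full refresh passes
    active_row = step % ROWS
    for r in range(ROWS):
        if r == active_row:
            brightness[r] = 1.0           # driven (overdriven in practice)
        else:
            brightness[r] *= DECAY_PER_STEP   # undriven rows fade
    print(f"strobe row {active_row}: " +
          " ".join(f"{b:.2f}" for b in brightness))
[/code]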
 
Didn't Tom's Hardware Guide or someplace already review an SED display that turned out to be just as shitty as an LCD, only with a better contrast ratio?
 
masher said:
For reasons which I won't explain here, but which should be obvious with a little thought, consumer SED displays will almost certainly be built with a passive design, for at least the first two generations. So they will need to overdrive each pixel and continually refresh the screen. However, since they can address an entire line of pixels at once (instead of the single pixel at a time that current CRTs can), the amount of "flicker" should be several orders of magnitude less.
It's cheaper, I would assume. But I don't see why this should make it so obvious.
 
Big Fat Duck said:
Didn't Tom's Hardware Guide or someplace already review an SED display that turned out to be just as shitty as an LCD, only with a better contrast ratio?
Do you have a link to the review?
 
masher said:
Oops, he's talking about LED **array** lit LCDs. HDR, in other words.


Err, neither SEDs nor old-school CRTs release alpha radiation. None.

Beta only, and a very tiny amount of gamma. Nothing dangerous...you likely get far more just from walking around your home.

What? Well then why would they make low-radiation monitors, then? On an old TV show called Computer Chronicles, they said a regular CRT monitor causes cancer or eye cataracts.
 
It's cheaper, I would assume. But I don't see why this should make it so obvious.
Consider. CRTs update one pixel at a time, yet, at high refresh rates, their flicker is indistinguishable to most people. An SED display can update an entire row, say 2000 pixels, at once. So the same scanning rate will generate 1/2000th the flicker. Totally invisible.

Consider also that the first LCDs were passive, even though the image quality differences were, in that case, quite large. And there, the difference between the two was only one extra transistor at each pixel. In the case of SED, though, you'd need two extra, meaning the price differential would be even larger.

Consider still further: a nonlatching SED display requires *no* transistor layer at all. To add flip-flops to each pixel, you need to add a whole new layer. Now you can no longer screen-print the surface emitters; you need semiconductor fabrication techniques. The price differential rises more.

Consider finally: that transistor layer requires physical space. That increases the minimum size of each pixel (limiting maximum resolution) and may increase the interpixel gap. It could well be that a latching SED display would have *lower* overall image quality than a nonlatching one, even if it did eliminate flicker entirely.
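To put that scaling argument in numbers (screen size picked to match the 2000-wide example above):

[code]
# At the same scan rate, addressing a full line at once means each pixel
# waits far fewer "steps" between refreshes than with a one-pixel CRT beam.
width, height = 2000, 1500

crt_steps = width * height   # CRT: one pixel per step -> 3,000,000 steps/frame
sed_steps = height           # SED: one row per step   -> 1,500 steps/frame

print(f"CRT steps per frame: {crt_steps:,}")
print(f"SED steps per frame: {sed_steps:,}")
print(f"Refresh interval shrinks by {crt_steps // sed_steps}x")  # -> 2000x
[/code]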
 
What? Well then why would they make low-radiation monitors, then?
For the same reason people visit palm readers, buy useless homeopathic "medicines", or believe in the Loch Ness Monster. People believe what they want to believe, and radiation is a nice, scary buzzword.

On an old TV show called Computer Chronicles, they said a regular CRT monitor causes cancer or eye cataracts.
It does...if you believe in a linear exposure model for radiation, and ignore inconvenient facts like background radiation levels, or that people living in naturally high-radiation areas (any Rocky Mountain state, for instance) already get thousands of times the ionizing radiation that you'd receive from any monitor. Even in a low-radiation area, you're getting several hundred times as much.

Go to any beach in the world, and you'll see people paying money to intentionally sit for hours under a massive nuclear reactor...the sun. Or visit any mineral spring spa, where you'll find people paying even more money to sit in naturally-radioactive water. And even buying it to drink!

The granite used to build NYC's Grand Central Station makes the interior more radioactive than NRC regulations allow for nuclear reactors: 120 mrem/year worth. Yet people use the station daily for their entire lives and survive just fine.

Use a monitor daily and you'll get roughly 1 mrem/year extra. Sleep next to another person nightly and you'll get 2 mrem/year extra...just from the radioactive potassium in their body. Should we ban sex and marriage as health hazards, then?
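Lining those figures up (the ~300 mrem/year natural background average is my added assumption; the rest are the numbers above):

[code]
# Rough annual-dose comparison, in mrem/year. The background figure is an
# assumed ballpark; the others are the numbers quoted in this thread.
doses_mrem_per_year = {
    "CRT monitor, daily use": 1,
    "sleeping next to another person": 2,
    "Grand Central interior (granite)": 120,
    "typical natural background (assumed)": 300,
}
for source, dose in sorted(doses_mrem_per_year.items(), key=lambda kv: kv[1]):
    print(f"{source:40s} {dose:>4} mrem/yr")
[/code]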

Believe what you want. If people can buy into Scientology and Faith Healing, they can certainly have no problem believing monitors are killing them.
 
I'll take your word for it.
Don't take my word for it...think about it and come to your own conclusions. You might stumble across an angle I missed.
 