test tt
Forum Replies Created
-
AuthorPosts
-
test tt
Participant
Also, the only time HDR will be “right” is when viewing HDR content. There is no consumer (or, as far as I know, cinema) HDR content now. So purchasing an HDR-capable display and turning HDR on for viewing SD or HD content is going to be as “wrong” as using Dynamic mode or any of the other settings that marketing departments demand, but that do nothing except make images more inaccurate and unnatural. Ditto for wide color gamut… without wide color gamut sources, applying a wide color gamut feature to SD or HD content will just make images look less accurate, because the TV can only GUESS at what color was intended in the original content.
Maybe someone from THX can comment on HDR for cinema… today’s illumination technology and cinema screen sizes make it difficult to get more than 16 fL of white on a cinema screen. How will cinema projectors ever produce 160 or more fL in order to support HDR? Are we getting to a point where home video is going to outperform cinema in this regard? Or will cinema get rid of projectors and switch to giant flat-panel displays in order to move into “HDR” imaging? It is difficult to imagine 160 fL of white on a screen the size of a cinema screen! The theater would be flooded with light.
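For reference, the foot-lambert figures above convert to the cd/m² (“nits”) units HDR discussions are usually quoted in. A quick sketch (the 3.426 factor is the standard fL-to-nits conversion; the function name is mine):

```python
# Rough luminance comparison: cinema screen brightness vs. a 10x "HDR" level.
# 1 foot-lambert (fL) = 3.426 cd/m2 (nits) -- standard conversion factor.
FL_TO_NITS = 3.426

def fl_to_nits(fl):
    """Convert foot-lamberts to cd/m2 (nits)."""
    return fl * FL_TO_NITS

cinema_white = fl_to_nits(16)   # typical cinema peak white: about 55 nits
hdr_level = fl_to_nits(160)     # the 10x figure discussed above: about 548 nits

print(f"16 fL  = {cinema_white:.0f} nits")
print(f"160 fL = {hdr_level:.0f} nits")
```

Even 160 fL is only about 548 nits, which puts the scale of the problem for cinema-sized screens in perspective.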
test tt
Participant
You will find that all up-conversion from 1080p to UHD (3840×2160) is NOT created equal. The Sony projectors and Oppo disc players DO NOT up-convert to UHD as well as Lumagen’s Radiance processors that have UHD support. AVRs and other devices that offer UHD up-conversion may not even be as good at it as the Sony and Oppo products. If you want to see 1080p at its very best, turn off UHD up-conversion in all other devices and use a Lumagen Radiance UHD processor to do the up-conversion. It takes the UHD experience at least one level higher. AND… the Lumagen UHD Radiance processors also include Darbee Visual Presence processing. Adding THAT on top of Lumagen’s superior up-conversion raises the final results yet another level. 1080p images get so good you wonder if you really need native UHD sources. And right now, lacking native UHD sources (other than Sony’s movie offerings), UHD video displays are all about how much magic they can inject into 1080p sources.
test tt
Participant
YES — MAYBE. There is a phenomenon that becomes increasingly likely as the bandwidth of the light source decreases. Projection lamps, phosphors, CCFLs, and “regular” LEDs (including “regular” OLEDs) all produce relatively wide-bandwidth light. Their blue light is made up of a wide range of wavelengths, as are their green and red. In many cases (probably most cases), the blue and green wavelengths overlap each other, and the green and red wavelengths overlap each other as well.
A laser light source is the ultimate “opposite” of a conventional light source… there is ONE frequency of blue light, ONE frequency of green light, and ONE frequency of red light. All colors are derived from combinations of these single frequencies. Cyan light is composed of 2 frequencies, as are yellow and magenta. All other colors are composed of 3 frequencies.
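As a rough sketch of how single-wavelength primaries behave, the chromaticity of a monochromatic source can be computed directly from the CIE 1931 color-matching functions. The wavelengths and table values below are approximations pulled from standard tables for illustration, not the lines of any particular laser projector:

```python
# Sketch: xy chromaticity of single-wavelength (laser-like) primaries, using
# a few approximate CIE 1931 2-degree color-matching-function values.
CMF = {  # wavelength (nm): (x_bar, y_bar, z_bar) -- approximate table values
    450: (0.3362, 0.0380, 1.7721),  # blue line
    530: (0.1655, 0.8620, 0.0422),  # green line
    640: (0.4479, 0.1750, 0.0000),  # red line
}

def chromaticity(nm):
    """xy chromaticity of a monochromatic source at the given wavelength."""
    x_bar, y_bar, z_bar = CMF[nm]
    total = x_bar + y_bar + z_bar
    return x_bar / total, y_bar / total

for nm in sorted(CMF):
    x, y = chromaticity(nm)
    print(f"{nm} nm primary: x={x:.3f}, y={y:.3f}")
```

These points sit on the spectral locus itself, which is why an all-laser display can cover a gamut no broadband source can match.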
Quantum Dot technology is a reduced-bandwidth light source. How much the bandwidth is reduced varies quite a bit (so far), but suffice it to say QD displays are much narrower in bandwidth (in terms of light source spectrum) than conventional displays, yet considerably wider in bandwidth than laser light sources.
Why is this an issue? Because of something called metamerism. As the bandwidth gets smaller and smaller, there is more and more variation in what different people see on the screen. The meter may measure a specific shade of yellow as being perfectly accurate, but people may report that the yellow is muted, or too green, or too red, or too dark or too light (along with a perceived color shift). You ask 100 people what color they see and you can get 80 different answers even though they could all be viewing the display at the same time in the same viewing conditions. You cannot calibrate around this issue. It is something that simply exists with no “workaround”. If you produce a PERFECT calibration on a narrow-bandwidth video display, everybody who ever sees it might see something different. That doesn’t mean your calibration is bad, but it is pretty difficult to defend calibration knowing the viewers may disagree about what they see.
For calibrators, if this turns out to be a significant issue, people may decide to never bother with calibration if everybody sees something different AFTER calibration. Everybody sees something different before calibration so there’s (perhaps) no improvement in image accuracy after calibration because everybody will STILL see something different.
This is a PERCEPTUAL issue with how human vision works when presented with “unnatural” (i.e. narrow-bandwidth) light sources that don’t exist in nature. Our eyes adapted to “white” light produced by the sun, which is a very wide-bandwidth light source. The more you reduce the bandwidth, the more perceptual errors you get. And these aren’t “predictable” perceptual errors. People don’t all see the same error(s) in a narrow-bandwidth color of yellow; you get a “scattershot” effect when you have people select the reference color of yellow they see from the narrow-bandwidth source. Like shooting at a target with a shotgun… more “hits” will be towards the center of the pattern, but the pattern itself can have quite a wide spread.

So how do you deal with an owner of a QD display (or laser display of some sort) who sees a less accurate (to them) video display after calibration versus before calibration? You can’t see what the customer sees, and your meter cannot be “corrected” to see what the owner sees either. All a calibrator can do is make the video display accurate… but if the owner sees less accurate images when the display is accurate, how has the calibrator helped anything? There is nothing the owner can do to change what they see. As calibrators, we may question our work because WE may not see accurate images when our meter says they are accurate. This is not an optical illusion, it is something real that’s part of the human vision system.
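The mechanism behind that “scattershot” effect can be sketched numerically. Everything below is a made-up toy (sensitivity curves, spectra, the wavelength grid); it only illustrates the principle: two sources matched for one observer stop matching for an observer whose sensitivity curve is slightly shifted, and the narrow-bandwidth source is hit much harder than the broad one:

```python
# Toy illustration of observer metamerism (all numbers made up).
samples = [560, 570, 580, 590, 600]            # wavelengths in nm (toy grid)
std_obs     = [0.99, 0.95, 0.87, 0.76, 0.63]   # one observer's sensitivity (toy)
shifted_obs = [0.95, 0.99, 0.95, 0.87, 0.76]   # same curve shifted ~10 nm (toy)

broad  = [1.0, 1.0, 1.0, 1.0, 1.0]             # wide-bandwidth "yellow"
narrow = [0.0, 0.0, 0.0, 0.0, 1.0]             # narrow-bandwidth "yellow"

def response(spectrum, observer):
    """Observer's total response: spectrum weighted by sensitivity."""
    return sum(p * s for p, s in zip(spectrum, observer))

# Scale the narrow source so both spectra MATCH for the first observer.
scale = response(broad, std_obs) / response(narrow, std_obs)
narrow = [p * scale for p in narrow]

print(response(broad, std_obs), response(narrow, std_obs))          # identical
print(response(broad, shifted_obs), response(narrow, shifted_obs))  # they diverge
```

A meter plays the role of one fixed “observer,” which is exactly why it can read a narrow-bandwidth display as accurate while individual viewers disagree.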
I haven’t seen “anybody” in the calibration “industry” weigh in on this issue yet. Nor have I seen anything that confirms metamerism is an issue with QD or laser displays. So this is something more than speculation but less than accepted reality for our video calibration efforts. There is POTENTIALLY a big problem here, but I haven’t seen anything that confirms it yet.
test tt
Participant
What exactly does “provided by JVC” mean in this case? This calibration setup did not come with an RS67/6710 and was not offered or mentioned by JVC when I reviewed the projector last year. Spyder meters have a bad reputation for color accuracy, though they have done the “light”/gamma measurements OK in the past. It has never been easy to get gamma right on JVC projectors over the last 5 years or so. Lumagen Radiance processors have been the best solution so far. Otherwise you are in for an endless session of messing with the gamma sliders in the JVC projectors.
test tt
Participant
NEVER do a reset in the service menu of ANY video display unless you know EXACTLY, and I mean you must POSITIVELY KNOW, what will happen when you do it. Some service menu resets will return a display to a state where there are NO settings applied anywhere, and that will completely disable the projector, with the only way to recover being to send it back to the factory. A customer asking for a service menu reset is a bit strange. The first thing it makes me think of is that the customer went into the service menu, changed settings, ruined something, and is now going to leave it to YOU to do something he didn’t want to try on his own. That way, if something goes wrong, he can blame you for it. The only reset I would consider, unless I got specific instructions from a factory technician or engineer, is a reset in the user menu. Most things in most service menus are NOT easy to figure out. And MOST “modern” displays have nothing useful in their service menus anyway. I’ve not had any reason to use the service menu in any JVC projector in the last 5 years or so. I’m not sure why your customer thinks a service menu reset needs to be done unless he did something bad in there and cannot figure out how to un-do what he did.
test tt
Participant
BTW – for those interested, the difference between the RS67 and RS6710 is that the 6710 is priced $500 higher and has a 5-year warranty and a spare projection lamp. The RS67 is $11,995 and has a 3-year warranty and no spare lamp. The projectors are otherwise identical. Only one distributor carries the 6710 and 4910 (different extras on the 4910 than the 6710); all other distributors carry the 67 and 49. You should be able to get either one in the US, but any given store may have access to only one of the 2 models.
test tt
Participant
???? Why is the backlight called blue-green when there is clearly red light (as there would have to be) in the emission spectrum? Are they creating red light with the green or blue (or both) LEDs by flickering them into some other mode that causes them to change color, like multi-color LEDs that can change to any primary or complementary color?
test tt
Participant
I find that Sony’s 1000ES 4K projector produces the best-looking 1080p images I’ve ever seen. There is NO softness whatsoever. In fact, edges are cleaner and clearer than I’ve seen on any 1080p display. There is no more visible detail, but the detail present in the HD images is just about perfection. I haven’t seen a 4K flat panel yet. There is certainly room for poor-quality upconversion of HD to UHD. Sony’s projector uses anti-aliasing to reduce aliasing you didn’t even realize was there until you see images without it. That is going to be one of, if not THE, keys to making HD images look great on UHD displays. Lumagen’s new 204X processors employ their patented No-Ring upconversion technology to convert HD to UHD and make images cleaner. People are mistaking the ringing you get from more careless upconversion (like that in the current Oppo disc players, surprisingly) for sharper images. Upon getting comments from customers that their Oppo disc players produce sharper-looking images than their Radiance 204X processors, Lumagen investigated, and it turned out the Oppo images contain ringing that acts a lot like a Sharpness control turned up too high, producing edges that are not present in the images. The Radiance 204X images had no ringing along edges, so they looked less sharp to the Oppo owners.
If you think about it, HD images CAN’T HELP but look better on a 4K display that is doing a good job of upconversion. Consider a 45-degree edge or line. The HD pixels form a staircase… unavoidably. If you move the HD image to a UHD display, the display can remove the “tip” of each stair-step and also fill in the inside corner of each stair-step, making that diagonal line MUCH smoother than it could be displayed on the same-size HD display. That, of course, assumes there’s some intelligence in the upconversion. If the upconversion simply makes 4 pixels out of each individual pixel, the UHD version should look essentially identical to the HD original: the pixels on the 4K display end up the same size as the pixels on the HD display, so there should be essentially no difference between HD images on an HD display and that very simple upconversion to UHD. The more complex upconversion Sony is doing (and presumably Lumagen’s No-Ring upconversion is at least as complex, if not more so) really makes HD images look fantastic. I was lucky enough to live with a 1000ES for about 3 months and was still amazed at the end of those 3 months at how great HD images looked.
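The “4 pixels out of each individual pixel” case is easy to sketch. Nearest-neighbor replication preserves the staircase exactly, which is why it gains nothing over the HD display (a minimal sketch; the function name and the tiny test image are mine):

```python
# Sketch of the "simple" upconversion described above: each HD pixel becomes
# a 2x2 block of identical UHD pixels (nearest-neighbor), so a 45-degree
# edge keeps exactly the same staircase it had on the HD display.
def replicate_2x(image):
    """Nearest-neighbor 2x upscale of a 2-D list of pixel values."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each pixel
        out.append(doubled)
        out.append(list(doubled))                   # duplicate each row
    return out

# A tiny 45-degree edge: 1 = white, 0 = black.
hd = [
    [1, 0, 0],
    [1, 1, 0],
    [1, 1, 1],
]
for row in replicate_2x(hd):
    print(row)
```

Smarter upconversion would instead reshape those 2×2 blocks along the diagonal, which is where the smoothing described above comes from.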
xvYCC is fully defined… completely and totally defined and identified. But the only xvYCC sources are a few digital video cameras that have to be placed in that expanded color space mode in order to create images with the larger gamut of xvYCC. But xvYCC is a kludge that has no place in a new video standard that can be accomplished WITHOUT all the strangeness (like negative coordinate values) required to make xvYCC’s larger gamut work. You can turn xvYCC on in any video display but NOTHING will happen because you aren’t likely to be feeding that video display anything but Rec 709 video. The video stream itself has to be xvYCC in order for the video display to use the larger gamut. If the video stream is simply conventional HD video, you only see conventional Rec 709 color space.
Current 4K displays get past the limitations of HDMI 1.3/1.4 by limiting their 4K ability to 24p or 30p. To do 60p at 4K, you need a whole new transmission standard that will require new cables, new HDMI senders, and new HDMI receivers. So far HDMI 2.0 is looking like a lame duck even before it gets adopted, because “they” want something compatible with existing “high speed” HDMI cables, and that’s not looking like it will be possible with 4K60p, especially at 10 or 12 bits. And some would argue that you want even more bandwidth for 4K 3D so you can get the refresh rate up to 144 Hz, where it seems possible to enjoy 3D with zero eyestrain and zero flicker.
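The bandwidth arithmetic behind that is straightforward. A back-of-the-envelope sketch using the standard 4K60 timing totals, which include blanking (variable names are mine):

```python
# Back-of-the-envelope bandwidth check for 4K60p at 10 bits per color.
h_total, v_total = 4400, 2250   # active 3840x2160 plus standard blanking
fps = 60
bits_per_pixel = 3 * 10         # 10 bits per color, RGB / 4:4:4

pixel_clock = h_total * v_total * fps           # Hz
raw_gbps = pixel_clock * bits_per_pixel / 1e9   # Gbit/s before any encoding

print(f"pixel clock: {pixel_clock / 1e6:.0f} MHz")       # 594 MHz
print(f"raw 10-bit data rate: {raw_gbps:.2f} Gbit/s")    # 17.82 Gbit/s
```

HDMI 1.3/1.4 “high speed” tops out around 10.2 Gbit/s, so 4K60p at 10 bits needs well over half again as much as those cables were ever certified for.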
test tt
Participant
There’s no denying the Darbee processing makes 2D images look like they have a better sense of depth, and it does make detail more obvious (the increase in contrast at work). I have experienced handshake problems with older HDMI components, but it seems fine with newer HDMI components. Also, I don’t think it will even pass 3D unless there has been some kind of update to the original model I have. But there’s no USB or serial port for updating, so I’m not sure how you would update it at home… it does not appear to be update-able to me. The photos they run in ads for the Darbee processor are “fair” IMO… what they show in the with/without split screen fairly well mimics what I see. I find it easier to over-do the Darbee processing than to keep it from being way too obvious. When you use the Full Pop setting, for example, and crank it up to a high value, I’ve seen it produce the same artifacts as too much sharpening… edges appear around things that did not have edges. A shot of an older building with a lot of identical-size large windows was a good example… with Full Pop mode turned up fairly high, every window had multiple sharpening-like artifacts around it. Switch to HD mode and drop down to 45% or so, and the artifacts disappear; there are no obvious artifacts anywhere in the images, but images do appear to have more depth, and detail stands out a bit better.
test tt
Participant
You’re not forced to use the Darbee processing in the new Radiance processor(s); you can use it or not. There have been recent (unsubstantiated, but possibly true) claims that Darbee processing has been used by (pick 1 or more) directors / cinematographers / mastering houses during preparation of 1 or more Blu-ray releases. I don’t have a problem with that if the images look the way the director/cinematographer wants them to look after the processing is applied. But if a Blu-ray release was produced with Darbee processing, you would NOT want to Double Darbee the disc by applying more Darbee processing at home. How do you know which disc titles have already been Darbee’d? I’ve never seen anything listed on a disc package (so far, but I don’t see even half the Blu-ray discs that are released) that would alert you to the disc already being Darbee’d. Unless there’s some metadata in a Darbee’d release that would temporarily disable Darbee processing at home, it would be difficult to avoid the Double Darbee problem. I find it difficult to watch cable/satellite programming without Darbee processing now… it helps a LOT as long as you don’t use more than 45% and stick with the HD user mode. For Blu-ray, it can often be set to 65% without seeing any problems in the images, but I can’t stop thinking that the director/cinematographer didn’t see their movie this way, and I get antsy to turn it off. In the Lumagen processor(s) that have Darbee processing, you can turn the processing on for all inputs or only for selected inputs, so if you choose to stay a Blu-ray “purist” you can still do that very easily. Many times I have had the Darbee box upstream of the Radiance processor, connected only to the cable/satellite box, where it does the most good and doesn’t bother me psychologically at all, since cable/satellite programming is a bit off and doesn’t look much like real/full HD anyway.
test tt
Participant
The most visible problem I saw after using Epson’s multi-point alignment (I forget how many points can be adjusted… more than 128 if I recall correctly) was that after aligning the entire screen to be perfect, when you displayed anything full-screen that wasn’t too dark (say from 15% or 20% white and brighter), an oval in the center of the screen had a visible green tint, and the areas above and below the oval had a magenta tint. That wasn’t there before using the multi-point alignment. Full-screen single-pixel black/white alternating line patterns did not seem to suffer from using the multi-point alignment (no apparent localized moiré or loss of resolution), but it could certainly happen… perhaps I didn’t have to make moves large enough for it to become apparent. The green/magenta discoloration was fairly distracting and not difficult to see, yet I wouldn’t call it huge either. The problem I had with the 6020 was that the factory alignment was imperfect enough that at my normal viewing distance you could see red fringes from the alignment fairly easily. There were also some visible blue misalignments, but those were harder to see from the main seat than the red misalignments (not surprising). I didn’t think Epson’s optical system was very good either. No matter what you do, the images have a soft look to them… not de-focused as much as softened by internal reflections in the optical system (including the lens).
test tt
Participant
It appears that 70″ Elite production units dated November 2012 may all have a very peculiar uniformity problem… divide the panel into 3 equal-size vertical bars, then split the screen down the center so there are 6 equal-size rectangles… 3 of the rectangles will be off-color compared to the others… green tints and magenta tints seem to dominate the problem. The boundaries of the discolored areas are very sharp, not like “clouds”, more like “hard” edges. 3 units unboxed and tested all showed the same issue, and all had the same manufacture date.
test tt
Participant
I’ve calibrated three, maybe four displays over the years where the owner had an i1Pro and was either using it for calibration or using it to profile a colorimeter. In each of those cases, the owner wasn’t happy with their calibration and wanted to see if what they were doing was procedural, software, or meter related. When I calibrated with a Konica Minolta CS-200 (a hybrid meter… some characteristics of a colorimeter, some characteristics of a spectrometer), the result was perfect, visually. My measurements of their calibrations easily showed the problems they were seeing, while their own re-measurements of the displays looked good with their software. So the problem was clearly meter-related, since they weren’t doing anything but taking measurements and comparing those to my measurements. Two of them were pretty pissed-off that they had spent somewhere over $1100 for meters that would not calibrate their displays correctly. At the time, I didn’t think to run a spectral distribution of the light emitted by those displays to see if there was anything unusual, but my guess would be that the pro monitors mentioned in this thread have a backlight source with something different/unusual about the spectral distribution of the light being measured… for example, there could be too many long wavelengths of light (towards infrared) or too few. If the meter measures those wavelengths, it affects the readings, but the long wavelengths are invisible, so you would have a condition where the display measures right but seems to be lacking red when the meter is too sensitive to infrared (or when the display emits too much infrared and the meter is “normally” sensitive to infrared).
Another possibility is that the meter does not see the longer red wavelengths, even though those wavelengths haven’t drifted into the “invisible zone” quite yet. (As an example, CD players use near-infrared lasers, but the wavelength of light produced by the laser is close enough to the visible band that it can still be faintly seen, so it’s “visible infrared” rather than the invisible infrared we expect.) The same sort of thing can happen at the blue end of the spectrum where, when wavelengths get short enough, they go invisible and become ultraviolet… there’s a transition zone there also, where the wavelengths are short enough to border on ultraviolet but are still visible. My guess is that if you use a high-end meter to measure that pro display’s light spectrum, you’ll find it has either much more or much less light in the longest and/or shortest visible wavelengths. And when you measure consumer displays of the same type to see their spectral distribution (how the wavelengths are distributed), you’d get a very different result. I’d also guess that somewhere in the specs of the i1Pro there’s some sort of spec that defines the meter’s accuracy in terms of the wavelengths included in and excluded from that accuracy statement. Somewhere below about 400 nm you are into ultraviolet, and somewhere above about 700 nm you’re into infrared. The better meters used for video display calibration will mimic human visual sensitivity and respond to neither the shorter wavelengths nor the longer ones, while less expensive meters… who knows?
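The out-of-band sensitivity problem described above can be sketched with toy numbers (everything here is made up purely to show the mechanism, not measured from any real display or meter):

```python
# Toy sketch: a meter that responds to near-infrared inflates its "red"
# reading for a display whose backlight leaks energy beyond ~700 nm, even
# though that energy is invisible to the viewer.
bands = [620, 660, 700, 740, 780]            # nm, red through near-IR
display_output = [1.0, 0.9, 0.5, 0.4, 0.3]   # toy spectral power, with IR leakage

human_red = [1.0, 0.8, 0.2, 0.0, 0.0]        # eye response falls off past ~700 nm
bad_meter = [1.0, 0.8, 0.4, 0.3, 0.2]        # meter still responding in near-IR

def reading(spd, sensitivity):
    """Weighted sum of spectral power against a sensitivity curve."""
    return sum(p * s for p, s in zip(spd, sensitivity))

seen = reading(display_output, human_red)
measured = reading(display_output, bad_meter)
print(f"what the eye gets:   {seen:.2f}")
print(f"what the meter says: {measured:.2f}")  # reads high, so calibration pulls red down
```

Calibrating to the inflated reading would pull red down, leaving the display looking red-deficient to the viewer even though it “measures right.”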
test tt
Participant
Why ControlCal? Does the direct connection (DDC) option in CalMAN not work?
test tt
Participant
First, a colorimeter will calibrate 3D fine as long as the meter’s viewing angle is not so wide that you can’t get accurate readings through the lens of the 3D glasses. If the colorimeter will calibrate 2D well for the display you are working on, it will calibrate 3D equally well. But if the colorimeter has problems with the display tech you are calibrating in 2D mode, it will be just as bad in 3D mode. Some colorimeters have problems with LED light sources… so those won’t work well with LED/LCD in 2D or 3D modes, for example. If the colorimeter has a huge angle of view (picks up light to the sides), it’s going to be difficult to work with for 3D calibration, since you only want the profile to correct for light passing through the 3D glasses.
SpectraCal says profiling works for 3D calibration — but I haven’t had time to dig into this to determine whether my 2D profile would be good, or whether I would need to put the TV in 3D mode, calibrate that to D65, and measure that as my reference for the profile.
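For what it’s worth, colorimeter profiling of this sort commonly boils down to a 3×3 correction matrix derived from paired reference/colorimeter readings of the display’s primaries. A minimal sketch with made-up readings (for 3D, the pairs would presumably be measured through the glasses):

```python
# Sketch: derive a 3x3 correction matrix M from paired XYZ readings of the
# red, green, and blue primaries, then apply it to later colorimeter
# readings. All readings below are invented for illustration.
import numpy as np

# Columns are XYZ of red, green, blue as measured by each instrument.
reference = np.array([[41.2, 35.8, 18.0],
                      [21.3, 71.5,  7.2],
                      [ 1.9, 11.9, 95.0]])    # reference spectrometer (toy)
colorimeter = np.array([[43.0, 34.9, 19.1],
                        [22.5, 69.8,  7.9],
                        [ 2.4, 12.6, 91.3]])  # colorimeter on same patches (toy)

# Solve reference = M @ colorimeter for the correction matrix M.
M = reference @ np.linalg.inv(colorimeter)

corrected = M @ colorimeter          # applying M to the training readings...
print(np.allclose(corrected, reference))  # ...recovers the reference values
```

Whether one matrix built in 2D mode remains valid with the glasses in the light path is exactly the open question in this post: the glasses act as a spectral filter, so a separate through-the-glasses profile seems like the safer bet.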