# [VC] ASUS announces Swift PG27UQ 4K IPS 144Hz G-Sync HDR monitor



## juano

Quote:


> ASUS Republic of Gamers (ROG) today announced Swift PG27UQ, a 27-inch *G-SYNC HDR* gaming monitor that is the very first to offer *4K UHD (3840 x 2160)* gaming visuals at an ultra-fast *144Hz* refresh rate for the gaming experience with incredible contrast, deep saturated colors, and stunning brightness.


Source

EDIT: More information from Asus including a Q3 release window and price of $1199.
EDIT 2: Asus removed the $1,199 price from their article.


----------



## Sempre

Finally. Wish it was 32" though.


----------



## Seyumi

I would have bought this if it wasn't 27". That's just too small for 4K. Those 34" ultrawides with lower resolutions are more appealing than this. I don't think I can go from 40" to 27" even if I jump from 60Hz to 144Hz. If this was at least 32" I would have bought it on day one.


----------



## mtcn77

I enjoy the Magic Color (gamma) and Dynamic Contrast extended features of my monitor very much, and I'm sure they will be coveted just the same by users here. You just cannot compare the reliability of normal LCDs to other technologies, however much we wish to improve upon them.
The joke is not on this monitor; it fulfills the HDR10 specification to the letter. It is actually a ploy on Nvidia users who thought they would get the full Dolby Vision standard (12-bit, 10,000-nit) with their would-be superior hardware. Standards are established when OEMs establish them, not the moment your virtual support level says so.


----------



## Iching

What's up with a thick bezel? That resolution is way too small for a 27". The design isn't bad overall but I still prefer PG279Q looks.


----------



## Angry-Hermit

Is the size thing just a preference? Same resolution, smaller screen, smaller pixels, higher DPI... is that a good thing as long as it's not too small for where you're sitting? Or is it just the fact that on a 36" you won't notice a quality loss and you get a bigger screen? I was measuring a 34" today and that was just huge for me.


----------



## TheCautiousOne

Quote:


> Originally Posted by *Angry-Hermit*
> 
> Is the size thing just a preference? Same resolution, smaller screen, smaller pixels, higher DPI... is that a good thing as long as it's not too small for where you're sitting? Or is it just the fact that on a 36" you won't notice a quality loss and you get a bigger screen? I was measuring a 34" today and that was just huge for me.


I use a 32" Crossover 324k and I love, Love, the Size.



32" Is on the Left, 27" 1440p Qnix is on the Right.

I've had 24" (Typing on it at work to you right now)

Used the 27" for a Good 2 years, and the 32" for over 6 months at this point.

TCO


----------



## jbmayes2000

Guru3D posted this article but then retracted it. I couldn't find it again until this one. This looks interesting. I guess the new cards will need HDMI 2.1, correct?


----------



## Wildcard36qs

I agree with others that the size is too small. 4K should be at least 32". The price is going to be outrageous. Sucks that my 65" HDR TV is cheaper than a 27" monitor...


----------



## GorillaSceptre

Finally, the HDR panels are arriving. I'm jumping on the ultrawide bandwagon myself, but 4K 144Hz is mental.


----------



## Xuvial

It'll take nothing short of 1070 SLI (minimum) to make the most out of this monitor.


----------



## un1b4ll

Well, crap. Time to fork out some dough.


----------



## Yvese

All I gotta say is if you pay anywhere near $1k for this you're crazy..

I bought my 65KS8000 for $1079. To pay near that price for a 27" monitor is madness. $600 is where I would draw the line, and even then that would be for 32" not 27.


----------



## Dagamus NM

These are somewhat interesting. I don't know about the 27", but my Asus 28" 4K 60Hz monitors are great. Granted, I can see how this size would annoy many users depending on their vision, but I am OK with it.

I know a single Titan XP would not be able to consistently drive this thing; maybe the next arch's flagship will.


----------



## jbmayes2000

Quote:


> Originally Posted by *Yvese*
> 
> All I gotta say is if you pay anywhere near $1k for this you're crazy..
> 
> I bought my 65KS8000 for $1079. To pay near that price for a 27" monitor is madness. $600 is where I would draw the line, and even then that would be for 32" not 27.


This is a legitimate question, but how close are you guys sitting to your monitors?

We used to have a 34" TV as our main TV for years... I can't imagine putting it 2-3 feet from my face.


----------



## xarot

I currently have a 27" PG279Q and I have to sit really close to the monitor due to apartment constraints; it's actually a bit too big because I need to turn my head a bit. I'll probably get this one.


----------



## TheCautiousOne

Quote:


> Originally Posted by *Yvese*
> 
> All I gotta say is if you pay anywhere near $1k for this you're crazy..
> 
> I bought my 65KS8000 for $1079. To pay near that price for a 27" monitor is madness. $600 is where I would draw the line, and even then that would be for 32" not 27.


You should hop in the KS8000 Thread! We would love to have you.

I bought the 49"


----------



## duckweedpb7

I guess I will post in this thread too. 27" makes me sad.

The X34 and a 40" 4K (Samsung TV) have spoiled me.


----------



## zehoo

New gsync module, maybe some kind soul can slap one of these on a pwm free 40-43" HDR va/oled panel and sell it to us.


----------



## Angry-Hermit

Yeah, I can't imagine sitting a foot and a half away from a TV; this monitor would be great for my uses.


----------



## Silent Scone

No 21:9 HDR panels yet.


----------



## Vesimas

I was hoping for a 34" wide


----------



## ChaosAD

Please, PLEASE make this 32"-34" and it's an instant buy for me!

PS. I think the price will be insane though.


----------



## Swolern

Price? My guess is arm leg and left nut.


----------



## Dagamus NM

Yep, pricing will be high and it will take a pair of Titan XPs or 1080 Tis in SLI to drive it.

But by 2018 the second iteration of this should see a larger panel, with some of the issues this one will present resolved. Yes, make it in 32" or 34" and I will need to get a wider room with a wider desk to run three of them.


----------



## Nightbird

A lot of pixels at a lot of Hz; I still prefer VA for the contrast, though.


----------



## FattysGoneWild

Price. If you're wondering and asking, you can't afford it. Move along; nothing to see here. This thing will be well over $1k with specs like that. Awesome specs, though.


----------



## Malinkadink

The 32" 4k 60hz Gsync Acer was $1200 at launch i believe, i actually see it going for more than that at some places. In any case this monitor will be between $1-2k for sure. I won't be buying because IPS, and 27" is too small for 4k.


----------



## darealist

IPS is the better all-around LCD screen type. VA only has better contrast, and even that is still mediocre at 3000-7000:1. Everything else, IPS does better.


----------



## denman

I wish someone would make a 25", IPS, G-Sync (or FreeSync 2 at this point), 144Hz, 1440p monitor.


----------



## EniGma1987

Quote:


> ASUS announces Swift PG27UQ 4K IPS 144Hz G-Sync HDR monitor


OMGOMGOMGOMGOMGOMGOMGOMGOMGOMGOMGOMGOMGOMG
Need it.


----------



## loader963

If this had been 32" it would have been perfect for me. As is, I'm sticking with the PG348Q.


----------



## decoy11

What is G-Sync HDR?

The press release states it has HDR, but which HDR standard does it support? I only know of HDR10 and Dolby Vision, but Wikipedia lists a few more.

I would also like to know what black level it can go down to, since it is IPS and those have glow problems.

They didn't specify the monitor's response time either.


----------



## Robilar

I don't get the appeal of this at all frankly... 4K on a 27" is massive overkill.

Now if they brought that out in a 34" or 35" I could definitely see why it would be popular.

Also would prefer to see it in a VA panel over IPS. Viewing angles don't really matter much in gaming where you are going to be sitting front and center.


----------



## EniGma1987

Quote:


> Originally Posted by *Wildcard36qs*
> 
> I agree with others that the size is too small. 4K should be at least 32". The price is going to be outrageous. Sucks that my 65" HDR TV is cheaper than a 27" monitor...


One thing that drives up cost though is backlight dimming zones. Most TVs use 20-60 zones at most; some of the brand-new $2,000+ TVs use something like 256-384 zones. This monitor uses 384 dimming zones, which is way better for computer use and also costs a somewhat substantial amount for the number of LEDs, the circuits they go on, and the controllers for that many zones. On top of which, the controllers have to be the latest and greatest to have any kind of decent input lag for computer use. So that, plus DCI-P3 color gamut support, the quantum dot stuff, being the first and only 4K 144Hz monitor, plus the cost of the G-Sync module, means it actually does somewhat justify a price of around $1,200. That is way more tech than any TV has under $2,000. This is really the first time I have seen a computer monitor with more tech than the TVs in those high price ranges, so for once the price will probably be justified.


----------



## Yvese

Quote:


> Originally Posted by *TheCautiousOne*
> 
> You should hop in the KS8000 Thread! We would love to have you.
> 
> I bought the 49"


Nice. I have the 65" for my living room, and the 49" as well in my game room for Pro/Xbone/PC gaming.

As for that OCN thread, I'll check it out. I actually lurk on the AVS forums for the KS8000 just to keep up on any new firmware/news.

Back OT though: I don't care how much tech this has. No 27" monitor should cost the same as a high-end TV that's more than twice the size.

The only 'premium' this has is the overpriced G-Sync module. FreeSync > G-Sync in this regard. Another reason I hope Vega does well, so I can stop supporting Ngreedia.


----------



## TAr

Release date and price?


----------



## alexp247365

32-inch 4K screens with the same tech incoming... in 6 more months.


----------



## juano

Quote:


> Originally Posted by *alexp247365*
> 
> 32-inch 4K screens with the same tech incoming... in 6 more months.


Yep. Here's the PCPer article on that monitor with some of the same tech (though without G-Sync) for anyone interested. That monitor is expected in Q3 for $1,799-$1,999, but based on it being a ProArt I'd expect the PG27UQ to be less expensive, though $1,300-$1,500 probably isn't out of the question, unfortunately.

I'm still interested in this monitor, though. Assuming QC is OK and the FALD looks good (I haven't seen it in person), this checks everything I'm looking for in a new monitor.


----------



## c0nsistent

Now I'll be waiting for the Korean generic version of this monitor to hit eBay, and I'll be all over it for ~$400 USD.


----------



## Leopardi

Quote:


> Originally Posted by *Iching*
> 
> What's up with a thick bezel? That resolution is way too small for a 27". The design isn't bad overall but I still prefer PG279Q looks.


There's this thing called scaling in Windows 10; you can make it look like how 1440p looks. 163 PPI would be perfect for playing without AA, and a 4K monitor allows the native 1080p option before GPUs get strong enough.
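
For what it's worth, the "native 1080p" idea rests on 3840x2160 being exactly 2x 1920x1080 in each dimension, so an integer (nearest-neighbor) upscale just duplicates each pixel into a 2x2 block with no interpolation blur. A minimal sketch of the operation, assuming NumPy; whether the GPU or OS actually scales this way is a separate question:

```python
import numpy as np

def integer_upscale(frame, factor=2):
    """Nearest-neighbor upscale: each source pixel becomes a factor x factor block."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Toy 2x2 "frame"; scaling 1920x1080 -> 3840x2160 is the same operation at factor 2.
frame = np.array([[1, 2],
                  [3, 4]])
print(integer_upscale(frame))
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Since no new pixel values are invented, a 1080p image shown this way on a 4K panel looks exactly like 1080p on a native 1080p panel of the same size.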


----------



## JackCY

27" for what probably is going to be an insane price, no thanks even if I was going to spend that much money on a monitor.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Xuvial*
> 
> It'll take nothing short of 1070 SLI (minimum) to make the most out of this monitor.


I would like to slap a flagship GPU behind a ~30" 4K 144Hz panel. Although I am heavily leaning towards wanting a VA panel over IPS, even though I have had amazing luck/success with IPS panels.


----------



## meowth2

Too bad this is another 27". I remember going from 24" to 27" and feeling no difference; the only difference I felt was smaller, harder-to-read text.


----------



## un1b4ll

I would like this to be 30", but even at 27", 4K is not too much. It's a regular occurrence in Squad at 1440p to be firing at a target that's ~5 px wide, so this bad boy will mean big gains for my marksmanship.


----------



## KenjiS

Quote:


> Originally Posted by *meowth2*
> 
> Too bad this is another 27". I remember going from 24" to 27" and feeling no difference; the only difference I felt was smaller, harder-to-read text.


Yeah, I wish it was a 32" as well.

Maybe also because I don't want to cry; I just got my PG279Q.

It really doesn't matter... this thing's gonna be $1,500, probably.


----------



## Murlocke

If this was OLED, I'd be willing to spend 10 grand or so. Sadly, it will never happen due to demand.


----------



## Benny89

What the hell is with this 27"??? I mean, this is way too small; I would prefer to grab a 1440p ultrawide.

It should be AT LEAST 32", up to 40", with such a resolution.

Close, Asus, but not close enough. 32" 4K or 144Hz 1440p ultrawide and we can maybe make a deal.


----------



## d4n0wnz

Good luck driving 144Hz on a 4K display in a game that isn't Minecraft.


----------



## loader963

Quote:


> Originally Posted by *d4n0wnz*
> 
> Good luck driving 144Hz on a 4K display in a game that isn't Minecraft.


I'd buy this for the long term (6+ years). It would be worth it to me, and probably to a whole lot of others as well, with all the features. Sadly, anything under 32" means no interest from me.


----------



## GoLDii3

Quote:


> Originally Posted by *EniGma1987*
> 
> One thing that drives up cost though is backlight dimming zones. Most TVs use 20-60 zones at most; some of the brand-new $2,000+ TVs use something like 256-384 zones. This monitor uses 384 dimming zones, which is way better for computer use and also costs a somewhat substantial amount for the number of LEDs, the circuits they go on, and the controllers for that many zones. On top of which, the controllers have to be the latest and greatest to have any kind of decent input lag for computer use. So that, plus DCI-P3 color gamut support, the quantum dot stuff, being the first and only 4K 144Hz monitor, plus the cost of the G-Sync module, means it actually does somewhat justify a price of around $1,200. That is way more tech than any TV has under $2,000. This is really the first time I have seen a computer monitor with more tech than the TVs in those high price ranges, so for once the price will probably be justified.


Full-array LED is nothing but a direct LED backlight with software that manages the LEDs, so it's just another type of backlight, and ultimately it's all up to the software, so it can end up being useless anyway.

DCI-P3 support, or "125% sRGB" as Samsung likes to call it on their monitors, is nothing new. The only novelty is 4K at 144Hz and HDR.


----------



## dukeReinhardt

Quote:


> Originally Posted by *GoLDii3*
> 
> Full-array LED is nothing but a direct LED backlight with software that manages the LEDs, so it's just another type of backlight, and ultimately it's all up to the software, so it can end up being useless anyway.
> 
> DCI-P3 support, or "125% sRGB" as Samsung likes to call it on their monitors, is nothing new. The only novelty is 4K at 144Hz and HDR.


And QD.


----------



## meowth2

Quote:


> Originally Posted by *Murlocke*
> 
> If this was OLED, I'd be willing to spend 10 grand or so. Sadly, it will never happen due to demand.


Yeah, I won't be buying any monitors until OLED ones come out at over 30".


----------



## Sem

4K at 27" won't be nice, and since all the manufacturers use the same panel, we are unlikely to see 30"+ versions from Acer etc.

And I don't like the stand; I prefer the Swift style over that.

Sigh.


----------



## meowth2

Quote:


> Originally Posted by *KenjiS*
> 
> Yeah, I wish it was a 32" as well.
> 
> Maybe also because I don't want to cry; I just got my PG279Q.
> 
> It really doesn't matter... this thing's gonna be $1,500, probably.


Yeah, hold on to that until they come out with a 32" OLED; anything else is not worth the money.


----------



## guttheslayer

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Price. If you're wondering and asking, you can't afford it. Move along; nothing to see here. This thing will be well over $1k with specs like that. Awesome specs, though.


The last thing you forget is that it takes Asus close to a year to get their monitors widely available on the market. Chill, guys!

Meanwhile, this monster should have been at least 30" to make 4K worth it.


----------



## DVLux

Quote:


> Originally Posted by *Wildcard36qs*
> 
> I agree with others that the size is too small. 4K should be at least 32". The price is going to be outrageous. Sucks that my 65" HDR TV is cheaper than a 27" monitor...


Actually, the price would be more outrageous if it was larger. Monitor price _always_ scales with size.

Also, bigger is _not_ better.


----------



## KenjiS

Quote:


> Originally Posted by *GoLDii3*
> 
> Full-array LED is nothing but a direct LED backlight with software that manages the LEDs, so it's just another type of backlight, and ultimately it's all up to the software, so it can end up being useless anyway.
> 
> DCI-P3 support, or "125% sRGB" as Samsung likes to call it on their monitors, is nothing new. The only novelty is 4K at 144Hz and HDR.


To be 100% fair, this is the first monitor I can think of with FALD... in fact, I had to reread that bit a few times.

If it works, and I imagine it should, then it should produce a panel with noticeably better shadows and a very respectable black level.

I may not want to buy it, but I DO want it to do well in the market.

I would love OLED obviously (heck, I'm clinging DESPERATELY to my Samsung F8500 right now and praying it lasts till I can afford an LG OLED), but I don't think OLED monitors will be coming very soon at this rate. It has nothing to do with burn-in; how long did we ALL have CRT screens? The more realistic issue is just the lack of panels on the market in the necessary sizes right now.


----------



## s1rrah

Quote:


> Originally Posted by *Leopardi*
> 
> There's this thing called scaling in Windows 10; you can make it look like how 1440p looks. 163 PPI would be perfect for playing without AA, and a 4K monitor allows the native 1080p option before GPUs get strong enough.


Are you saying that 4K screens can play 1080p with no resampling? If so, I did not know that... that would sure be nice.


----------



## s1rrah

Quote:


> Originally Posted by *Malinkadink*
> 
> At what settings?


100fps ultra, of course...


----------



## EniGma1987

Quote:


> Originally Posted by *GoLDii3*
> 
> Full-array LED is nothing but a direct LED backlight with software that manages the LEDs, so it's just another type of backlight, and ultimately it's all up to the software, so it can end up being useless anyway.
> 
> DCI-P3 support, or "125% sRGB" as Samsung likes to call it on their monitors, is nothing new. The only novelty is 4K at 144Hz and HDR.


You don't just say "turn these off now" and it happens; you have to have channels for the LEDs to be on. If you just plug all your LEDs into one driver they will all be on one channel; you need 384 driver channels to have 384 independently controlled zones.

Quantum dot tech is not brand new, but it does require additional work and materials to implement, and that stuff costs money.
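
To illustrate the channel-count point: with 384 independently driven zones, each pixel's backlight comes from exactly one driver channel. A hypothetical sketch, assuming a 24x16 zone grid on the 16:9 panel (ASUS hasn't published the actual zone geometry, so the grid shape here is a guess):

```python
# Hypothetical layout: 384 zones as a 24 x 16 grid on a 3840 x 2160 panel.
# The actual PG27UQ zone geometry is not specified in the announcement.
COLS, ROWS = 24, 16          # 24 * 16 = 384 independent driver channels
W, H = 3840, 2160

def zone_for_pixel(x, y):
    """Map a pixel coordinate to the backlight zone (channel) that lights it."""
    col = x * COLS // W       # 160-pixel-wide columns
    row = y * ROWS // H       # 135-pixel-tall rows
    return row * COLS + col   # channel index 0..383

print(zone_for_pixel(0, 0))        # 0: top-left zone
print(zone_for_pixel(3839, 2159))  # 383: bottom-right zone
```

Under this layout each zone covers a 160x135-pixel patch, and every one of the 384 patches needs its own driver channel to be dimmed independently.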


----------



## Derpinheimer

Quote:


> Originally Posted by *Iching*
> 
> What's up with a thick bezel? That resolution is way too small for a 27". The design isn't bad overall but I still prefer PG279Q looks.


Eh, I originally thought 1440p was too much for 27", but after using it for 4+ years, I want more. 4K would probably be perfect. Of course, I would never buy this, as I can't drive 165 fps at 1440p at ultra anyway.


----------



## Z Overlord

Quote:


> Originally Posted by *Leopardi*
> 
> There's this thing called scaling in Windows 10; you can make it look like how 1440p looks. 163 PPI would be perfect for playing without AA, and a 4K monitor allows the native 1080p option before GPUs get strong enough.


But I thought Nvidia scaling is only a really poor bilinear filter? AFAIK, neither Nvidia nor Windows has native nearest-neighbor scaling or anything fancy like that.


----------



## Leopardi

Quote:


> Originally Posted by *Z Overlord*
> 
> But I thought Nvidia scaling is only a really poor bilinear filter? AFAIK, neither Nvidia nor Windows has native nearest-neighbor scaling or anything fancy like that.


So "native" 1080p is not possible on 4K PC monitors, why have I been hearing otherwise?


----------



## ILoveHighDPI

Here we go!
As long as it's not horribly defective, I'd pay up to one grand for this. Of course I shouldn't say that, but given the current prices of similar monitors I'm not expecting any miracles.

The really interesting thing will be if there's a cheaper FreeSync version, or just something with good strobing.

Quote:


> Originally Posted by *Xuvial*
> 
> It'll take nothing short of 1070 SLI (minimum) to make the most out of this monitor.


I can run plenty of games at 4K and 100Hz+ with my 980 Ti. Of course not all of them, but the idea that you have to be able to run "everything" at "maximum" is, just like the idea of running all graphical effects at maximum all the time, a bit silly.


----------



## Swolern

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Price. If you're wondering and asking, you can't afford it. Move along; nothing to see here. This thing will be well over $1k with specs like that. Awesome specs, though.


That's the dumbest thing I have ever heard. Being conscious about pricing means you can't afford it? Lol.


----------



## CallsignVega

Interesting. So this would mean NVIDIA created an entirely new DisplayPort 1.3/1.4 G-Sync chip.

Not caring for 27" though; it doesn't really show off the benefit of 4K.

This new G-Sync chip may be the only way to get DP 1.3/1.4 speeds for a while, until generic T-CONs come out that will work with FreeSync. The G-Sync/FreeSync battle is just ramping up.
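
A back-of-envelope data-rate check shows what the new DP 1.3/1.4 module is up against at 4K 144Hz. This ignores blanking overhead, so real link requirements run somewhat higher than these figures:

```python
def data_rate_gbps(w, h, hz, bpc, components=3.0):
    """Uncompressed video data rate in Gbit/s.
    components: samples per pixel (3.0 for RGB/4:4:4, 2.0 for 4:2:2 subsampling)."""
    return w * h * hz * bpc * components / 1e9

DP14_PAYLOAD = 25.92  # Gbit/s: 4 lanes x 8.1 Gbit/s (HBR3), minus 8b/10b line-code overhead

for label, bpc, comp in [("8-bit RGB", 8, 3.0),
                         ("10-bit RGB", 10, 3.0),
                         ("10-bit 4:2:2", 10, 2.0)]:
    rate = data_rate_gbps(3840, 2160, 144, bpc, comp)
    verdict = "fits" if rate <= DP14_PAYLOAD else "exceeds"
    print(f"{label}: {rate:.1f} Gbit/s ({verdict} DP 1.4 payload)")
```

Even 8-bit RGB at 4K 144Hz comes to roughly 28.7 Gbit/s before blanking, which already exceeds the ~25.9 Gbit/s DP 1.4 payload, so some combination of chroma subsampling or a lower refresh rate has to come into play at the top end.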


----------



## FattysGoneWild

Quote:


> Originally Posted by *Swolern*
> 
> That's the dumbest thing I have ever heard. Being conscious about pricing means you can't afford it? Lol.


Right. That is why I game with a 4790K, GTX 1080, and a Dell S2716DG monitor. And I HATE this Dell because of the size. I listened to everyone: oh, 27" is awesome, blah blah. But having used a 24" for so long, I am used to that and much prefer it. So it's not about price or being able to "afford" it; I can't game with a monitor that big in front of my face. I am actually thinking of getting rid of this 27" and moving back down to a 24", the smaller version of the same monitor.


----------



## oxidized

Why are people hoping for bigger monitors? 27" will give you insane quality at 4K; what's the problem?


----------



## The Robot

I wonder who is supplying B-grade panels this time. The quality must be _spectacular_ to reach 144Hz at 4K. Also, you'll need 2x 1080 Ti to even use this thing properly.


----------



## HYPERDRIVE

NEED = 34" 3440x1440 HDR G-SYNC 144HZ go go go


----------



## Seyumi

Quote:


> Originally Posted by *oxidized*
> 
> Why are people hoping for bigger monitors? 27" will give you insane quality at 4K; what's the problem?


Because it's the same thing as announcing 1080p 144Hz G-Sync desktop monitors on a whopping 13" panel. It's too small, and we sure as hell know that games/Windows are nowhere close to figuring out scaling issues. With advancements in resolution should come advancements in size, with at least equal or better clarity than the last generation. A 40" 4K monitor has the same PPI as the 1440p 27" monitor I used for half a decade. Something like 32-37" would give me a bigger screen and greater clarity.
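
The PPI comparison above is easy to verify; a quick sketch computing diagonal pixel density from resolution and panel size:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The sizes discussed in this thread:
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # 163
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')  # 138
print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')  # 110
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # 109
```

So a 40" 4K panel does sit at roughly the same ~110 PPI as a 27" 1440p one, while 32" 4K splits the difference between that and the PG27UQ's 163 PPI.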


----------



## oxidized

Quote:


> Originally Posted by *Seyumi*
> 
> Because it's the same thing as announcing 1080p 144Hz G-Sync desktop monitors on a whopping 13" panel. It's too small, and we sure as hell know that games/Windows are nowhere close to figuring out scaling issues. With advancements in resolution should come advancements in size, with at least equal or better clarity than the last generation. A 40" 4K monitor has the same PPI as the 1440p 27" monitor I used for half a decade. *Something like 32-37" would give me a bigger screen and greater clarity.*


Yes, and lower PPI.


----------



## CallsignVega

http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/

Oh wow, not till Q3. Jeez. So in Asus timeline that means Q4.


----------



## WhiteWulfe

Definitely interesting... Hopefully this means 144Hz ultrawides soon ^_^


----------



## guttheslayer

Quote:


> Originally Posted by *CallsignVega*
> 
> http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/
> 
> Oh wow, not till Q3. Jeez. So in Asus timeline that means Q4.


See, I told ya! And worldwide availability will add another quarter: 2018.

And is this display G-Sync as well?


----------



## juano

$1199 is not too bad though.

The Asus link above has more information, but here's an article from Nvidia too that mentions an Acer monitor as well.

Ouch, some bad news from the Nvidia article:
Quote:


> Created in concert with *AU Optronics*, G-SYNC HDR displays are designed


----------



## CallsignVega

I'm just glad NVIDIA is making a faster G-Sync module. The Acer actually looks better IMO. Going to be a LONG wait for the new king of gaming displays at the end of this year.


----------



## jprovido

Give me 144Hz 3440x1440 and I will buy it in a heartbeat.


----------



## KenjiS

Quote:


> Originally Posted by *juano*
> 
> $1199 is not too bad though.
> 
> The Asus link above has more information, but here's an article from Nvidia too that mentions an Acer monitor as well.
> 
> Ouch, some bad news from the Nvidia article:


Agreed... $1,199 is actually not that hard to swallow; the PG279Q started at $999, IIRC, and this is 4K, quantum dot, HDR, local dimming, etc...


----------



## twitchyzero

Who's making the panel?
Quote:


> ///marketing: incredible contrast, deep saturated colors


IPS... I doubt it.


----------



## Kinaesthetic

Quote:


> Originally Posted by *twitchyzero*
> 
> Who's making the panel?
> IPS... I doubt it.


AU Optronics: an AHVA panel (IPS-type).

As for incredible contrast, it probably won't hit OLED levels, but with 384 zones of FALD it will have one heck of a contrast ratio for an IPS-type. And it is DCI-P3 compliant (which covers 100% of sRGB) with HDR10 support.

So why on earth do you doubt it? They flat-out gave the specs right there.


----------



## fleetfeather

Quote:


> Originally Posted by *twitchyzero*
> 
> Who's making the panel?
> IPS... I doubt it.


Quote:


> On top of that, it taps *quantum dots and HDR* to produce brighter colors and deeper blacks that make games and other content look more vibrant and lifelike.


http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/


----------



## twitchyzero

I'm an oddball who goes by Adobe RGB.
Kinda holding out for OLED, but this is a great leap forward from the Taiwanese manufacturer... this is going to sell well even with a prohibitive price tag.


----------



## Profiled

It's LG IPS...


----------



## caenlen

Quote:


> Originally Posted by *TheCautiousOne*
> 
> I use a 32" Crossover 324k and I love, Love, the Size.
> 
> 
> 
> 32" Is on the Left, 27" 1440p Qnix is on the Right.
> 
> I've had 24" (Typing on it at work to you right now)
> 
> Used the 27" for a Good 2 years, and the 32" for over 6 months at this point.
> 
> TCO


32" that close to my keyboard/eyes is a little overwhelming, personally I am happy with the 27"


----------



## bazh

Quote:


> Originally Posted by *twitchyzero*
> 
> Who's making the panel?
> IPS... I doubt it.


It has FALD, which really helps with the contrast ratio.


----------



## wizardbro

Give me all of this in an ultrawide, please. 16:9 is boring.


----------



## Defoler

Quote:


> Originally Posted by *jbmayes2000*
> 
> Guru3D posted this article but then retracted it. I couldn't find it again until this one. This looks interesting. I guess the new cards will need HDMI 2.1, correct?


I expect it will be DP 1.3 or 1.4. Nvidia already supports it, and it has the required bandwidth.

Edit: it is DP 1.4, and it has HDMI for UHD playback, so I expect that will be 2.0 only; HDMI 2.1 only just came out.


----------



## fleetfeather

Quote:


> Originally Posted by *wizardbro*
> 
> Give me all of this in an ultrawide, please. 16:9 is boring.


21:9 is great in 15% of the content I've encountered that uses it.

Nothing beats a bit of yellow BLB in all four corners of my black bars.


----------



## profundido

Shame it's gonna be 27" again; that's a dealbreaker for me. If it was 32"-34" I would have bought it immediately, but as long as Windows and program scaling remain as they are, I will prefer a bigger screen for more real estate with less or no scaling at 4K.

If gaming is your only concern then it doesn't matter, I guess.

Shame. This means I most likely will have to wait for Acer to make a 32" Predator version of this, since Asus never does.


----------



## n4p0l3onic

Quote:


> Originally Posted by *fleetfeather*
> 
> 21:9 is great in 15% of the content I've encountered that uses it.
> 
> Nothing beats a bit of yellow BLB in all four corners of my black bars.


Why are you complaining about free ambient light?


----------



## fleetfeather

Quote:


> Originally Posted by *n4p0l3onic*
> 
> Why are you complaining about free ambient light?


It's just like bias lighting, as long as every scene is pale yellow


----------



## JbstormburstADV

If only they'd announce the FreeSync 2 version... That would be a monitor I would be all over.


----------



## hrockh

27" is the perfect size for me.

Hope it goes into production.


----------



## Nestala

Will this have the astonishing quality control of their other monitors?
If so, BLB is gonna be horrible on these.


----------



## Lass3

Another AU Optronics abomination panel?


----------



## valentyn0

Quote:


> Originally Posted by *Lass3*
> 
> Another AU Optronics abomination panel?


Say again?


----------



## caenlen

I'm holding off buying anything until HBM2 GPUs hit; otherwise, it's just silly. Mmmm, gimme da power!!!!


----------



## Pragmatist

I would normally get very excited about this, but after having a lot of issues with the PG279Q I'll definitely not buy another ASUS IPS panel. Lesson learned, I guess.


----------



## Lass3

Quote:


> Originally Posted by *Pragmatist*
> 
> I would normally get very excited about this, but after having a lot of issues with the PG279Q I'll definitely not buy another ASUS IPS panel. Lesson learned, I guess.


AU Optronics panel.

I've seen at least 10 monitors with AUO panels in the last year, and all of them had major issues, especially bleed and glow. All 1440p/1xx Hz monitors. And the IQ is far from LG AH-IPS quality, IMO.

Better than TN, but worse than proper IPS and VA.


----------



## Profiled

Hnggg, an "IPS-like" AHVA panel. Love the marketing...

The HDR future is interesting. Which PC games support HDR? Mods?


----------



## Pragmatist

Quote:


> Originally Posted by *Lass3*
> 
> AU Optronics panel.
> 
> I've seen at least 10 monitors with AUO panels in the last year, and all of them had major issues, especially bleed and glow. All 1440p/1xx Hz monitors. And the IQ is far from LG AH-IPS quality, IMO.
> 
> Better than TN, but worse than proper IPS and VA.


Yes, I'm aware that they make the panel. However, ASUS in turn accepts these panels and sells them, and that to me is unacceptable as a long time ASUS customer. Everyone that has these AU panels can surely one way or the other attest to them being underwhelming, and that's mostly because of the plethora of issues they have.

If you RMA the monitor you'll just get another one that is just as bad, if not worse. It's supposed to be a "premium" monitor, and for $800 or more it should be.


----------



## Lass3

Quote:


> Originally Posted by *Pragmatist*
> 
> Yes, I'm aware that they make the panel. However, ASUS in turn accepts these panels and sells them, and that to me is unacceptable as a long time ASUS customer. Everyone that has these AU panels can surely one way or the other attest to them being underwhelming, and that's mostly because of the plethora of issues they have.
> 
> If you RMA the monitor you'll just get another one that is just as bad, if not worse. It's supposed to be a "premium" monitor, and for $800 or more it should be.


Acer's monitors have the same issues. All "IPS" (AHVA) 1440p monitors using these panels have them, some more than others, but I have yet to see a version that is even close to perfect.

Asus can sell the monitors, or not. I know many people that bought them even though they have issues, so obviously Asus sells them; the buyers are there.
Many owners speak highly of them, because they come from 1080p TN, and those are faaaaar from perfect.

People that want high Hz are forced to settle for worse IQ. It's a compromise they are willing to make. After all, there's no perfect monitor, or TV for that matter.


----------



## Sedolf

384-zone dimming means a direct-lit backlight array...
So that should help with uniformity issues, glow, and BLB.


----------



## TK421

waiting for freesync version


----------



## ToTheSun!

Quote:


> Originally Posted by *Sedolf*
> 
> 384-zone dimming means a direct-lit backlight array...
> So that should help with uniformity issues, glow, and BLB.


Especially glow. People in this thread are saying they would never buy another AHVA monitor, but dimming zones effectively fix the one problem they had with the previous batch of 1440p 144 Hz AHVA monitors.


----------



## Pantsu

Direct LED backlights have other issues, though: clouding and unevenness. We'll just have to see how well the 384-zone dimming works out. Personally I'm not interested in 27" displays, so hopefully we'll see something like this in 32" and bigger, although I'm thinking I'll start saving for an HDMI 2.1 HDR TV instead.

Looks like my BDM4065UC will have to suffice for another year at least.


----------



## TheCautiousOne

Quote:


> Originally Posted by *Yvese*
> 
> Nice. I have the 65" for my living room and the 49" as well for my game room for Pro/Xbone/PC gaming.
> 
> As for that OCN thread, I'll check it out. I actually lurk on the avsforums for the KS8000 just to keep up on any new firmware/news.
> 
> Back OT though, I don't care how much tech this has. No 27" monitor should cost the same as a high-end TV that's more than twice the size.
> 
> The only 'premium' this has is the overpriced gsync module. Freesync > Gsync in this regard. Another reason I hope Vega does well so I can stop supporting Ngreedia.




TCO


----------



## chrisnyc75

The combination of 4K, HDR, 120Hz, and G-Sync is a welcome advance a LONG time in the making, but personally I'll wait for the HDR + 120Hz + G-Sync combo on a 21:9 ultrawide. I went ultrawide last year, and while there is a content/compatibility problem with a lot of games, the games that work with it look fantastic. If the new hardware can finally push 4K @ 120Hz, then 3440 @ 120 w/ HDR can't be far behind.


----------



## velocd

I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?

27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


----------



## rvectors

Quote:


> Originally Posted by *velocd*
> 
> I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?
> 
> 27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


Pretty much sums it up perfectly.


----------



## l88bastar

Quote:


> Originally Posted by *Wildcard36qs*
> 
> I agree with others that the size is too small. 4K should be at least 32". The price is going to be outrageous. Sucks that my 65" HDR TV is cheaper than a 27" monitor...


Move it closer to your face

Pretend its a larger display cause its so close to your eyeballs

.............

Profit??????


----------



## Ha-Nocri

This must be using two DPs for 144Hz? I thought DP 1.4 was only enough for 4K 60Hz HDR.


----------



## TooBAMF

Quote:


> Originally Posted by *velocd*
> 
> I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?
> 
> 27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


Unlikely that TVs will be made with real 120 or 144 Hz refresh rates anytime soon. Also I doubt any of them would have GSync or FreeSync. Nobody would be complaining about 27" if there were viable TV alternatives.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Ha-Nocri*
> 
> This must be using two DPs for 144Hz? I thought DP 1.4 was only enough for 4K 60Hz HDR.


They probably switch between 8-bit and 10-bit HDR modes.

We're still going to have to wait a while before HDR is common in games.
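For context, a back-of-envelope bandwidth check shows why some compromise is needed at 144 Hz. The figures below are assumptions for the sketch: a DP 1.4 HBR3 payload of 25.92 Gbit/s (32.4 Gbit/s raw minus 8b/10b encoding) and roughly 5% blanking overhead for reduced-blanking timings.

```python
# Back-of-envelope DisplayPort 1.4 bandwidth check for 4K at 144 Hz.
# Assumptions: HBR3 payload of 25.92 Gbit/s and ~5% blanking overhead.

DP14_PAYLOAD_GBPS = 25.92

def required_gbps(width, height, hz, bits_per_pixel, blanking=1.05):
    """Approximate link rate a video mode needs, in Gbit/s."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

for label, bpp in [("RGB 10-bit", 30),
                   ("RGB 8-bit", 24),
                   ("YCbCr 4:2:2 10-bit", 20)]:
    need = required_gbps(3840, 2160, 144, bpp)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"{label}: {need:.1f} Gbit/s ({verdict})")
```

On these rough numbers, full RGB doesn't fit at 144 Hz even at 8-bit, which is consistent with the guesses in this thread about mode switching or chroma subsampling.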


----------



## ToTheSun!

Quote:


> Originally Posted by *l88bastar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wildcard36qs*
> 
> I agree with others that the size is too small. 4K should be at least 32". The price is going to be outrageous. Sucks that my 65" HDR TV is cheaper than a 27" monitor...
> 
> 
> 
> Move it closer to your face
> 
> Pretend its a larger display cause its so close to your eyeballs
> 
> .............
> 
> Profit??????
Click to expand...

No! Trigonometry is stupid!
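For what it's worth, the trig behind "move it closer" is a one-liner. A quick sketch (the 96" and 40" viewing distances are illustrative picks, not from any post):

```python
import math

def screen_width_in(diagonal_in, aspect=(16, 9)):
    """Width of a screen from its diagonal and aspect ratio."""
    w, h = aspect
    return diagonal_in * w / math.hypot(w, h)

def apparent_angle_deg(diagonal_in, distance_in):
    """Horizontal visual angle the screen subtends at the eye."""
    half_width = screen_width_in(diagonal_in) / 2
    return 2 * math.degrees(math.atan(half_width / distance_in))

# A 65" TV at 8 ft looks about as wide as a 27" monitor at ~40":
print(f"{apparent_angle_deg(65, 96):.1f} deg")  # ~32.9 deg
print(f"{apparent_angle_deg(27, 40):.1f} deg")  # ~32.8 deg
```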


----------



## Dhoulmagus

Quote:


> Originally Posted by *velocd*
> 
> I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?
> 
> 27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


My 27" looks gigantic sitting at the desk; it's hard to get used to. Granted, I come from a time when a 13" Commodore 1702 was a great monitor...

I wouldn't mind the increase to 4K, as I can still see some pixelation. I play games on a 40" 1080p screen and it looks totally fine, because I sit back at least 5-10 feet. 27" or 28" 4K 144Hz is where I'd like to be by 2020. MAYBE I'll go as high as 32", but I'll need a deeper desk.

All irrelevant talk for me until some graphics cards hit the market that I feel are worthy of my money. My 280X can't exactly handle 1440p at even 60 FPS as it is in modern games. Now I'll be waiting for something that can do 4K at 144 FPS... haha. I do love the feel of the desktop at 144Hz though; it almost seems as smooth as a decent CRT, but not quite there yet.


----------



## Excession

It isn't OLED, but other than that it's perfect. I'll buy this in an instant if the price tag has three digits on it. Or maybe the inevitable freesync monitor with this panel, depending on how big Vega turns out.

Hope AUO's managed to reduce their QC problems.
Quote:


> Originally Posted by *Seyumi*
> 
> Because it's the same thing as announcing 1080p 144Hz G-Sync monitors on a whopping 13" panel for your desktop. It's too small, and we sure as hell know that games/Windows are nowhere close to figuring out scaling issues. With advancements in resolution should come advancements in size, with at least equal or better clarity than the last generation. A 40" 4K monitor has the same PPI as the 1440p 27" monitor I used for half a decade. Something like 32-37" would give me a bigger screen and greater clarity.


The point of a monitor like this isn't to increase your screen real estate. The point is to make it so that it's less like you're looking through a screen door. This is a solid 50% improvement in PPI over a 27" 1440P monitor.
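That 50% figure checks out exactly, since 3840/2560 = 1.5 at the same diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

uhd_27 = ppi(3840, 2160, 27)  # ~163 PPI
qhd_27 = ppi(2560, 1440, 27)  # ~109 PPI
print(f"{uhd_27:.0f} vs {qhd_27:.0f} PPI: +{(uhd_27 / qhd_27 - 1) * 100:.0f}%")
```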

I'll pick up a 32" monitor when they come in 8K.


----------



## i7monkey

How long before a GPU can push ~80+ FPS on this at 4K? 3 years?


----------



## ahnafakeef

Quote:


> Originally Posted by *i7monkey*
> 
> How long before a GPU can push ~80+ FPS on this at 4K? 3 years?


Last time I checked, Titan XP SLI can quite handily do more than 80 FPS at 4K already. So I think we'll see a single GPU do that in 2018 at the latest. Heck, it might even come out this year if there's strong competition between Nvidia and AMD.


----------



## oxidized

Quote:


> Originally Posted by *velocd*
> 
> I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?
> 
> 27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


Finally, someone who knows what he's talking about.


----------



## shredzy

The general hate on monitors is bloody hilarious!

We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!

Now we get this and people say "It's too small, 32" please!"

Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


----------



## fleetfeather

Quote:


> Originally Posted by *shredzy*
> 
> The general hate on monitors is bloody hilarious!
> 
> We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!
> 
> Now we get this and people say "It's too small, 32" please!"
> 
> Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


Pretty sure people are seeing the rate of monitor tech progression and just want to try and buy something that will remain relevant for an extended period of time like we're all accustomed to...

Until a) we all start treating monitors as a yearly upgrade, b) monitor tech plateaus again, or c) the price of high end monitors drops significantly, this sort of super critical mindset will continue


----------



## shhek0

Quote:


> Originally Posted by *shredzy*
> 
> The general hate on monitors is bloody hilarious!
> 
> We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!
> 
> Now we get this and people say "It's too small, 32" please!"
> 
> Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


It is a tech forum, and people here are enthusiasts who just want to see something that they know would be a beast. For everything to keep being pushed toward innovation, yeah, there should be people saying that the new thing is just not enough.

I, myself, would never put more than a 24" so close to my eyes, and that is the reason I am just skipping many topics on this forum (as well as the ancient rig).


----------



## ILoveHighDPI

Quote:


> Originally Posted by *shredzy*
> 
> The general hate on monitors is bloody hilarious!
> 
> We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!
> 
> Now we get this and people say "It's too small, 32" please!"
> 
> Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


It's just remnants of old ways of thinking.

Four years ago half the people posting here would entirely reject the idea of 4K, firstly because they said it was "impossible" to see the difference, and then because it's "impossible" for any games to run well at that resolution.
Both points are easily proven wrong today, but that level of negativity doesn't disappear overnight.
Now we're at the point where most people who previously opposed 4K have at least developed some sort of idea as to what would make 4K acceptable to them.


----------



## 12Cores

Well, it is finally here: 4K@144Hz. I would expect that they will ask nothing less than $2,000 for this miracle of modern technology. I wonder how many years it will take for something like this to retail for $399?


----------



## Robilar

Quote:


> Originally Posted by *shredzy*
> 
> The general hate on monitors is bloody hilarious!
> 
> We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!
> 
> Now we get this and people say "It's too small, 32" please!"
> 
> Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


27" is fine but since going to a curved 35", there is no way i would go back. BF1 with the monitor and my 5.1 speakers is very immersive.

http://s1201.photobucket.com/user/RobilarOCN/media/DSC01674_zpsav9qgwq5.jpg.html


----------



## CallsignVega

Quote:


> Originally Posted by *12Cores*
> 
> Well, it is finally here: 4K@144Hz. I would expect that they will ask nothing less than $2,000 for this miracle of modern technology. I wonder how many years it will take for something like this to retail for $399?


Well, I don't think Q4 really qualifies as "finally here". As for $400, maybe 2022?


----------



## inedenimadam

Monitor tech is flying...

About dang time. I am going to sit on 4k 60hz for a while longer, but it is good to know that when I am ready to upgrade displays, I will be blown away.


----------



## Arizonian

This happened a lot sooner than I thought it would.

Quote:


> Originally Posted by *velocd*
> 
> I don't understand all the whining about it being 27". Hello, DPI scaling? Is it mostly coming from people still on Windows 7 with its crappy DPI scaling?
> 
> 27" is better for 4K gaming on a desk. You don't even need anti-aliasing in games with the pixel density. Monitors any bigger, that aren't wide screen, are silly for desk viewing. Get a damn TV.


Exactly.

DPI scaling on Windows 10 is a two-second fix for desktop view, and it doesn't degrade resolution. So you have the best of both worlds.

I read others' opinions on 27" and they're just that: opinions. I have no issues with 27" gaming. Pixels are so tight that objects very far away are very clear. I max FOV in every game. Not to mention streaming is brilliant on 27" with the pixel density of 3840x2160.

Not knocking wider monitors, just reiterating that there's nothing wrong with 27" 4K monitors, whatever others would have you think. Affordability while reaching more end users is a sound business choice.

Quote:


> Originally Posted by *i7monkey*
> 
> How long before a GPU can push ~80+ FPS on this at 4K? 3 years?


I'd concur, 3 years at best IMO.

It takes 1080s in SLI just to run BF4 at maxed settings in 4K to get 98 FPS at best, and that's not even taking advantage of the monitor's full capabilities.

Even though I'd love a FreeSync version of this, I don't have that kind of cash to keep up with two GPUs to fully enjoy it, and to keep up with gaming as it gets more demanding. So I'm holding off on 4K 144Hz until one GPU, or at the very least two mid-level GPUs, can make that happen.

I can't wait for Vega which may actually be able to run my 4K 60 Hz monitor in the 50-60 FPS zone.


----------



## DADDYDC650

27" for $1200? No thanks! I paid under $700 for my Samsung 49KS8000. Sure, it only does 60Hz, but it looks amazing, is twice as big, and cost half as much. I need a 34"-38" 1440p 144Hz ultrawide with HDR.


----------



## KGPrime

Better not suck.


----------



## Sem

Quote:


> Originally Posted by *shredzy*
> 
> The general hate on monitors is bloody hilarious!
> 
> We get a 27" 1440p 144Hz monitor and people say "Nah, 4K at least please, and then I'll buy"!
> 
> Now we get this and people say "It's too small, 32" please!"
> 
> Next a 32" will come out, and people will want 8K 144Hz; it's never-ending. 27" is PLENTY for games; it's borderline too big for FPS.


god forbid people want continual advances in technology

not like this is an enthusiast forum or anything


----------



## shredzy

Quote:


> Originally Posted by *Sem*
> 
> god forbid people want continual advances in technology
> 
> not like this is an enthusiast forum or anything


God forbid people that don't understand my post.


----------



## st0necold

Quote:


> Originally Posted by *DADDYDC650*
> 
> 27" for $1200? No thanks! *I paid under $700 for my Samsung 49KS8000*. Sure, it only does 60Hz, but it looks amazing, is twice as big, and cost half as much. I need a 34"-38" 1440p 144Hz ultrawide with HDR.


The point of this thread was to announce the first 4K 144Hz display. The reason your monitor was $700 was because it only does 60Hz.


----------



## st0necold

Quote:


> Originally Posted by *i7monkey*
> 
> How long before a GPU can push ~80+ FPS on this at 4K? 3 years?


980 Ti SLI can do 80+ at 4K, and so can 1080 SLI, Titan XP, and TXP SLI.

waaaaaaaaaaaaaaa


----------



## st0necold

Quote:


> Originally Posted by *Excession*
> 
> It isn't OLED, but other than that it's perfect. I'll buy this in an instant if the price tag has three digits on it. Or maybe the inevitable freesync monitor with this panel, depending on how big Vega turns out.
> 
> Hope AUO's managed to reduce their QC problems.
> The point of a monitor like this isn't to increase your screen real estate. The point is to make it so that it's less like you're looking through a screen door. This is a solid 50% improvement in PPI over a 27" 1440P monitor.
> 
> I'll pick up a 32" monitor when they come in 8K.


http://www.theverge.com/2017/1/5/14098224/dell-up3218k-8k-computer-monitor-price-features-ces-2017


----------



## kikibgd

this is awesome


----------



## chrisnyc75

Quote:


> Originally Posted by *fleetfeather*
> 
> Pretty sure people are seeing the rate of monitor tech progression and just want to try and buy something that will remain relevant for an extended period of time like we're all accustomed to...
> 
> Until a) we all start treating monitors as a yearly upgrade, b) monitor tech plateaus again, or c) the price of high end monitors drops significantly, this sort of super critical mindset will continue


^^This. Recently, the high end of monitors has become rather expensive, and they're not particularly easy to resell. I think people are hesitant to get on board with this 27" display when they feel a 32" display is probably around the corner that will give them buyer's remorse. Trust me, I dropped almost $1k on a 60hz ultrawide about a year ago, and right about now that's not feeling like it was such a great investment.


----------



## jbmayes2000

Quote:


> Originally Posted by *Sem*
> 
> god forbid people want continual advances in technology
> 
> not like this is an enthusiast forum or anything


I would venture to guess he isn't against the advancement of technology. He's more pointing to the fact that people ask for "x", "x" gets delivered and people jump right to wanting "y" instead of enjoying the fact that "x" now exists.

I could be wrong though.


----------



## DADDYDC650

Quote:


> Originally Posted by *st0necold*
> 
> The point of this thread was to announce the first 4K 144Hz display. The reason *your monitor* was $700 was because it only does 60Hz.


We all know the purpose of this thread and I'm free to comment as I wish.

You don't even know what a 49KS8000 is obviously.


----------



## Mand12

Damn.

Guess this is what I'll upgrade to for my next major rig.


----------



## ZealotKi11er

IPS can do HDR? Since when did IPS have high contrast?


----------



## ToTheSun!

Quote:


> Originally Posted by *ZealotKi11er*
> 
> IPS can do HDR? Since when did IPS have high contrast?


384 dimming zones. You know, the thing we said monitors would need for HDR for the past year or two. I suppose they eventually listen.


----------



## DADDYDC650

Quote:


> Originally Posted by *ZealotKi11er*
> 
> IPS can do HDR? Since when did IPS have high contrast?


It doesn't as far as I know. Unless they are using some new IPS tech which is doubtful. I'm hoping for the best though. Just wish this was at least 32 inches.


----------



## wizardbro

https://www.computerbase.de/2017-01/asus-rog-swift-pg27uq-uhd-144-hz-displayport-1.4-g-sync-hdr/
Check the latest update.
Asus wants to sell this for 2k euros, LMAO.

No pricing info on this page anymore. http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/


----------



## ZealotKi11er

Quote:


> Originally Posted by *wizardbro*
> 
> https://www.computerbase.de/2017-01/asus-rog-swift-pg27uq-uhd-144-hz-displayport-1.4-g-sync-hdr/
> Check the latest update.
> Asus wants to sell this for 2k euros, LMAO.
> 
> No pricing info on this page anymore. http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/


Why not? The same people that bought their other G-Sync stuff will buy this. Money is not a problem for them; they want the best.


----------



## Robilar

I'd happily pay $1,500+ for a 1440p version of my current monitor.

I have no interest in going to a 4K monitor; I have had far too many issues over the years with CrossFire and SLI.


----------



## ToTheSun!

Quote:


> Originally Posted by *wizardbro*
> 
> https://www.computerbase.de/2017-01/asus-rog-swift-pg27uq-uhd-144-hz-displayport-1.4-g-sync-hdr/
> Check the latest update.
> Asus wants to sell this for 2k euros, LMAO.
> 
> No pricing info on this page anymore. http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/


Considering 384 zones is not nearly enough to create a display truly capable of high contrast (not without omnipresent blooming, anyway), this at $2000 is a hard pass, especially considering native 120 Hz 4K OLED this year and other such options that will definitely show up in the meantime.


----------



## flopper

Quote:


> Originally Posted by *wizardbro*
> 
> https://www.computerbase.de/2017-01/asus-rog-swift-pg27uq-uhd-144-hz-displayport-1.4-g-sync-hdr/
> Check the latest update.
> Asus wants to sell this for 2k euros, LMAO.
> 
> No pricing info on this page anymore. http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/


Intro price is always high until the market saturates.


----------



## kikibgd

imho 900-1000€ is more than enough for that monitor


----------



## ZealotKi11er

4K 144Hz is also pointless because no GPU can go that high.


----------



## DVLux

Pretty sure there were two distinct, and different, specifications for HDR... It's not all about contrast, though.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DVLux*
> 
> Pretty sure there were two distinct, and different, specifications for HDR... It's not all about contrast, though.


It has to go very black and very bright.


----------



## TheCautiousOne

Quote:


> Originally Posted by *DADDYDC650*
> 
> We all know the purpose of this thread and I'm free to comment as I wish.
> 
> You don't even know what a 49KS8000 is obviously.


TCO


----------



## CallsignVega

I actually thought $1,200 seemed a bit low. $2,000 seems a bit high, I was thinking more around $1700.


----------



## TheCautiousOne

Quote:


> Originally Posted by *CallsignVega*
> 
> I actually thought $1,200 seemed a bit low. $2,000 seems a bit high, I was thinking more around *$1700*.


That's how much I paid for my COLT M4A1 Socom Ed Rifle.

TCO


----------



## QSS-5

Nordic Hardware confirms it will cost $1,199.


----------



## CallsignVega

That is old info. The €1,999 price is more recent.

They removed the $1,199 price from the bottom of their article:

http://edgeup.asus.com/2017/01/04/rog-pg27uq-intro/

So you definitely know there is a price increase.


----------



## KGPrime

I fully expect something like this to be closer to the $2k mark. An Eizo CG costs $2k without 144Hz or G-Sync, etc. So what could one theoretically expect a supposed 125% RGB, quantum-dot, HDR-capable 4K @ 144Hz monitor with G-Sync and local dimming to cost? Over $2k wouldn't even surprise me. Under $1,600 would make me suspect as to its quality, lol. Hell, given that these are Asus/Acer monitors with supposed AUO panels, there's about a 90% chance it turns out to be a pile of crap no matter the cost, and that might even be lenient. So let's not get too excited.


----------



## Silent Scone

Quote:


> Originally Posted by *KGPrime*
> 
> I fully expect something like this to be closer to the $2k mark. An Eizo CG costs $2k without 144Hz or G-Sync, etc. So what could one theoretically expect a supposed 125% RGB, quantum-dot, HDR-capable 4K @ 144Hz monitor with G-Sync and local dimming to cost? Over $2k wouldn't even surprise me. Under $1,600 would make me suspect as to its quality, lol. Hell, given that these are Asus/Acer monitors with supposed AUO panels, there's about a 90% chance it turns out to be a pile of crap no matter the cost, and that might even be lenient. So let's not get too excited.


Bingo on price. As for quality, I don't know what you're basing that assumption on; you probably don't have much experience with high-end consumer products.


----------



## wizardbro

Is this a 10-bit panel?


----------



## KGPrime

Quote:


> Originally Posted by *Silent Scone*
> 
> Bingo on price. As for quality I don't know what you're basing that assumption on, you probably don't have much experience with high end consumer products.


Most of the top-end consumer monitors besides Eizo have been carried by my local Fry's for the last ~15 years, and I've had hands-on time with a lot of top-end monitors, yes. From $5k Dells to countless Asus gaming monitors to LG 21:9s to Dell 30"s to Samsungs to NECs. So I know enough not to have wasted my money on them - well, save for two NECs which I bought at said Fry's over a decade ago and returned/sold within weeks - and stuck with my FW900s.

Sounds more like you are content with them, or aren't that picky. I find most people who have only used LCDs in their life, or a really crappy CRT, find LCDs just peachy. And people a little older, or who have had the fortune to own an FW900 or the like, pretty much despise LCDs. I could go into detail, but there are hundreds/thousands of posts on this very forum that can answer that for you already, with pictures and graphs and video. You know, the same rehashed stuff from the last decade or more.

Anyway, vid of said monitor; if there were any posted before it, I hadn't seen it. Speaking of which, you can hear this guy's same sentiment at the start of the video. I know exactly what he's saying and what he means in detail just by those few words.


----------



## Dhoulmagus

Quote:


> Originally Posted by *KGPrime*
> 
> Most of the top end consumer monitors besides Eizo are carried by my local Frys for the last like 15 years and I've had hands on a lot of top end monitors yes. From 5k Dells to countless Asus gaming monitors to LG 21:9's to Dell 30" , to Samsungs to NECS. So i know enough to not have wasted my money on them, well, save for 2 NECs which i bought at said Frys over a decade ago and returned/sold in weeks and stuck with my fw900's.
> 
> 
> 
> 
> 
> 
> 
> Sounds more like you are content with them, or aren't that picky. I find most people who have only used LCDs in their life, or a really crappy CRT, find LCDs just peachy. *And people a little older, or who have had the fortune to own an FW900 or the like, pretty much despise LCDs*. I could go into detail, but there are hundreds/thousands of posts on this very forum that can answer that for you already, with pictures and graphs and video. You know, the same rehashed stuff from the last decade or more.
> 
> Anyway. Vid of said monitor, if there were any posted before it i hadn't seen it. Speaking of, you can hear this guys same sentiment at the start of the video. I know exactly what he's saying and what he means in detail just by those few words.


I still have an FW900 in the basement that gets light gaming and browsing use (Sony Trinitron 24" 16:10, 2304x1440, 80Hz, instant response). Yes, kids, we had those specs in 2001! Although I didn't own mine until around 2005. It's still the most gorgeous monitor I've ever seen to date. My Asus MG279Q seems close-ish but still not there in responsiveness and color; maybe the glass on the CRT helps give it an edge, or maybe it's the smaller size, I don't know. I say again, though, it is close. This screen has some light bleed, so blacks are better on the Trinitron, and bright whites seem to pop a bit more; besides that, nothing really noticeable.

LCDs have made me vomit for years. The upgrade to 144Hz and IPS really helped, but eh... Maybe OLED or some other future panel tech will give me back that old feeling the Trinitron gives. 4K @ 144Hz is finally pushing beyond what the FW900 could do, so we're getting there.

edit: fun fact, the FW900 is so heavy my first ever back injury came from moving it


----------



## rvectors

I'm not sure I'd pay 1,999 euros for any ASUS product, especially a monitor, but the videos coming out of CES do look pretty sweet... I couldn't see any obvious BLB, but of course these will be the units the CEO personally picked, with any failures followed by death or the pain chamber for those responsible.

This video is in Swedish, I think, but at around 4:01 there is a very bright white flash in the centre; the panel response, with the surrounding blacks not washing out, looked damn good.


----------



## DADDYDC650

Quote:


> Originally Posted by *wizardbro*
> 
> Is this a 10-bit panel?


No mention of it being so. I doubt it can do HDR really well. I believe a display needs to be able to output 1,000 nits, be 10-bit, and have a high contrast ratio.


----------



## Malinkadink

Quote:


> Originally Posted by *DADDYDC650*
> 
> No mention of it being so. Doubt it can do HDR really well. I believe a display needs to be able to output 1000 nits, be 10-bit and have a high contrast ratio.


It has to be a 10-bit panel to actually produce over a billion colors, up from the ~16 million an 8-bit panel does. If it's 8-bit + FRC dithering, then on that principle alone this display isn't worth the exorbitant price it will surely go for.
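The color counts follow directly from the bit depth (three channels per pixel):

```python
def addressable_colors(bits_per_channel, channels=3):
    """Distinct colors an RGB panel can address at a given bit depth."""
    return 2 ** (bits_per_channel * channels)

print(f"{addressable_colors(8):,}")   # 16,777,216 (~16.7 million)
print(f"{addressable_colors(10):,}")  # 1,073,741,824 (~1.07 billion)
```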


----------



## ZealotKi11er

Quote:


> Originally Posted by *DADDYDC650*
> 
> No mention of it being so. Doubt it can do HDR really well. I believe a display needs to be able to output 1000 nits, be 10-bit and have a high contrast ratio.


There are different HDR standards. The moral of the story is that if you want HDR, you do not want to buy anything right now; nothing can fully support it. OLED supports the low end but can't reach 1,000 nits; VA can do 1,000+ nits but can't go low enough. Generally speaking it's a new technology, and in reality a gimmick in its current state.


----------



## Baasha

An excellent replacement for my Asus ROG Swift PG278Q 27" 1440P 144Hz monitor.

The question is, this display uses DP 1.4, which is great - how would running 3x of these be? Only one way to find out!

Time for some 4K 144Hz Surround!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Baasha*
> 
> An excellent replacement for my Asus ROG Swift PG278Q 27" 1440P 144Hz monitor.
> 
> The question is, this display uses DP 1.4 which is great - how would running 3x of these be? Only one way to find out!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time for some 4K 144Hz Surround!


Does it really matter for peripheral vision to be 4K?


----------



## Dagamus NM

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does it really matter for peripheral vision to be 4K?


Yep.


----------



## Z Overlord

Quote:


> Originally Posted by *Leopardi*
> 
> So "native" 1080p is not possible on 4K PC monitors, why have I been hearing otherwise?


Because it's theoretically possible if nearest neighbor was used, but it's not. Basically Nvidia and Microsoft don't care at all. Some people might mistakenly believe that just having a screen with 4x the resolution automatically makes it scale as "native" like you said, but it does not. It's enabled on iOS though, which is how Apple transitioned into their "retina" screens while keeping older apps looking the same as before.
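To illustrate what nearest-neighbor integer scaling would do (a sketch of the general technique, not of any particular driver's implementation):

```python
import numpy as np

# Nearest-neighbor integer scaling: every 1080p source pixel is duplicated
# into a 2x2 block on the 4K grid, so no intermediate color values are
# invented and the image stays exactly as sharp as native 1080p.
def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(integer_upscale(frame_1080p).shape)  # (2160, 3840, 3)
```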


----------



## boredgunner

Too bad it's IPS. Complete waste of HDR and local dimming. Waste of a monitor I'll say.


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> Too bad it's IPS. Complete waste of HDR and local dimming. Waste of a monitor I'll say.


Hold on, you were rooting for X34, but not the PG27?


----------



## boredgunner

Quote:


> Originally Posted by *mtcn77*
> 
> Hold on, you were rooting for X34, but not the PG27?


Uh, where was I ever rooting for the X34?


----------



## ILoveHighDPI

I'm actually kind of a fan of the Multi Resolution support in Shadow Warrior 2. It's definitely noticeable if you turn it up to "high" but on "medium" it's much more subtle. With a wide enough FOV the image is usually distorted pretty badly on the side anyway.

Extrapolate that technology far enough and it would probably be the ideal way to run games in 8K, if they could make the resolution shift dynamically across the screen instead of using four "zones" then it would look pretty natural.


----------



## CallsignVega

Quote:


> Originally Posted by *GoLDii3*
> 
> LOL at IPS and 1000 cd/m2, Enjoy your BLB like you have never enjoyed it before.


This monitor has FALD, so should help tremendously with most IPS problems.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> This monitor has FALD, so should help tremendously with most IPS problems.


Won't help with IPS glow or static contrast, but I am curious how FALD with 384 dimming zones looks on a 27".


----------



## PCM2

Quote:


> Originally Posted by *boredgunner*
> 
> Won't help with IPS glow or static contrast, but I am curious how FALD with 384 dimming zones looks on a 27".


It will help tremendously with both issues, at least for HDR content where the zones will be controlled. Some dimming zones can dim to extremely low levels whilst other zones are extremely bright. Hence the contrast ratio is raised significantly (it is still a static contrast ratio as this occurs at a single moment in time). IPS glow is also significantly reduced at lower luminances, which is exactly what the dimming zones will be doing when dark content is being displayed. It is like having some parts of the monitor set to '0' luminance and some set to '100' (and in this case '100' is several times higher than most LCD monitors will manage).
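A toy model of that effect, with illustrative numbers rather than measurements of this panel:

```python
# Toy model of why per-zone dimming raises effective contrast. All numbers
# here are illustrative assumptions, not specs of the PG27UQ.
NATIVE_CONTRAST = 1000        # typical IPS static contrast ratio
PEAK_NITS = 1000              # a zone driven to full HDR brightness
DIM_FRACTION = 0.02           # a dark zone dimmed to 2% backlight

# Black level of a dimmed zone: full-backlight black (1 nit at 1000:1)
# scaled down by the zone's backlight fraction.
black_nits = (PEAK_NITS / NATIVE_CONTRAST) * DIM_FRACTION      # 0.02 nits
effective_contrast = PEAK_NITS / black_nits                    # 50,000:1
print(f"effective contrast across zones ≈ {effective_contrast:,.0f}:1")
```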


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> Uh, where was I ever rooting for the X34?


The brotherhood cannot afford _ignorant mistakes..._ Sorry.
Quote:


> Originally Posted by *PCM2*
> 
> It will help tremendously with both issues, at least for HDR content where the zones will be controlled. Some dimming zones can dim to extremely low levels whilst other zones are extremely bright. Hence the contrast ratio is raised significantly (it is still a static contrast ratio as this occurs at a single moment in time). IPS glow is also significantly reduced at lower luminances, which is exactly what the dimming zones will be doing when dark content is being displayed. It is like having some parts of the monitor set to '0' luminance and some set to '100' (and in this case '100' is several times higher than most LCD monitors will manage).


Colours pop a great deal with dynamic lighting.


----------



## boredgunner

Quote:


> Originally Posted by *PCM2*
> 
> It will help tremendously with both issues, at least for HDR content where the zones will be controlled. Some dimming zones can dim to extremely low levels whilst other zones are extremely bright. Hence the contrast ratio is raised significantly (it is still a static contrast ratio as this occurs at a single moment in time). IPS glow is also significantly reduced at lower luminances, which is exactly what the dimming zones will be doing when dark content is being displayed. It is like having some parts of the monitor set to '0' luminance and some set to '100' (and in this case '100' is several times higher than most LCD monitors will manage).


The haloing effect of dimming zones, on larger screens at least, often makes it rather pointless (and when turning it off you are left with the same crappy glowy 1000:1 IPS image). But the zones should be smaller on a 27", so I am very curious as to how it looks. As for HDR content, it's pretty much a moot point, since to my knowledge only one PC game supports HDR (and very few console games) and it's catching on rather slowly.


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> The haloing effect of dimming zones on larger screens at least often makes it rather pointless (and then when turning it off you are left with the same crappy glowy 1000:1 IPS image). But the zones should be smaller on a 27" so I am very curious as to how it looks. As for HDR content, pretty much a moot point since to my knowledge only one PC game supports HDR (and very few console games) and it's something that's catching on rather slowly.


The zones are pretty clunky with 147*147 pixel domains.


----------



## KGPrime

Quote:


> Originally Posted by *Malinkadink*
> 
> It has to be a 10-bit panel to actually produce over a billion colors, up from the 16 million an 8-bit panel manages. If it's 8-bit + FRC dithering, then on that principle alone this display isn't worth the exorbitant price it will surely go for.


Well, there have been $2k+ Flanders Scientific field monitors that use 8-bit + FRC. They wrote an article on it a while back, which I cannot find now, about how they would refer to 8-bit + FRC as true 10-bit going forward because the difference was negligible if done right. Basically, they were confident enough in the algorithms by then that it was a non-issue to them. And this was maybe 5 years back.

On paper there's really no reason this thing shouldn't cost $2k. On paper it would be one of the best LCD monitors ever made, and certainly THE best consumer-grade LCD PC monitor ever made. The *real* bummer of it all is that it likely won't have a programmable LUT, which would have put it straight into the pro-grade class and the de facto hall of fame of consumer LCD monitors by far.

But again, the reality is: it's Asus/Acer with AUO panels. So ding some coin just for that.







There should also be a 5-year, lick-my-nuts backlight parts-and-labor warranty by default, just because of that.









Things it at least better have (which I assume it will): an sRGB emulation mode.

Things it should have, that are very doubtful: a programmable LUT; user-updateable firmware.

Things that might be noticeably absent: ULMB.

Though if they could pull it off with FALD and match what Samsung is doing with zoned strobing, it would be sick, because that really is the logical evolution of strobing. Nvidia's implementation is basically dead if no one would ever choose it over G-Sync, which seems to be the general consensus, and they don't seem to be pushing it over G-Sync; it wasn't even mentioned. I would personally choose the better clarity over a smoothness upgrade when the GPU can push the framerate required. I would certainly be running some games in a 1440p window on this thing for a few years. I don't mind that myself.

Besides the ever-looming reality that it's Asus or Acer and an AUO panel, and barring some glaringly terrible issues with it, the only reason I think I could bring myself to NOT buy this monitor is if it has a piss warranty. 2-3 years, considering the cost and the companies involved, would be a joke, more so from Asus. And having to purchase an extended warranty is a slap in the face to the consumer, a sign that the company does not stand behind its very expensive high-end product. Eizo does. If you want to appear to go toe to toe with the best, then you'd better be able to back your products up.


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> Won't help with IPS glow or static contrast, but I am curious how FALD with 384 dimming zones looks on a 27".


FALD helps tremendously with IPS glow, contrast and backlight bleed: all of IPS's largest problems. Edge-lit backlights are IPS's nemesis.

Quote:


> Originally Posted by *mtcn77*
> 
> The zones are pretty clunky with 147*147 pixel domains.


But also considering it's a dense 163 PPI screen, I'm expecting it to look pretty darn good.


----------



## mtcn77

Quote:


> Originally Posted by *CallsignVega*
> 
> FALD helps tremendously with IPS glow, contrast and back-light bleed. All of IPS largest problems. Edge lit back-lights are IPS nemesis.
> But also considering it's a dense 163 PPI screen, I'm expecting it to look pretty darn good.


I still cannot calculate the unit domain area. Let me try it again...
√(3840×2160/384) ≈ 147 px per side, which at 163 PPI works out to roughly 2.3x2.3 cm per zone. Pretty good.


----------



## Egzi

If I use this 4K monitor at a 2K res setting, will it look better than a native 2K monitor running at 2K?

Someone told me yes, but I find it hard to believe.

Does anyone know?


----------



## mtcn77

Quote:


> Originally Posted by *Egzi*
> 
> If I use this 4K monitor at a 2K res setting, will it look better than a native 2K monitor running at 2K?
> 
> Someone told me yes, but I find it hard to believe.
> 
> Does anyone know?


Text will look finer written on a 323ppi screen. You will reach the same fidelity without zooming in so big. You will forget about antialiasing. I literally first saw what Doom was all about looking at still images through this visual lens; the TN just doesn't show as many colour tones.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> FALD helps tremendously with IPS glow, contrast and back-light bleed. All of IPS largest problems. Edge lit back-lights are IPS nemesis.
> But also considering it's a dense 163 PPI screen, I'm expecting it to look pretty darn good.


It helps with dynamic contrast (that is the measurement with FALD on) but when FALD is off then you're back to a regular IPS. Are you going to buy one? I'd love to see what you think of it.


----------



## Egzi

Quote:


> Originally Posted by *mtcn77*
> 
> Text will look finer written on a 323ppi screen. You will reach the same fidelity without zooming in so big. You will forget about antialiasing. I literally first saw what Doom was all about looking at still images through this visual lens; the TN just doesn't show as many colour tones.


At what res? 2K on a 4K monitor?


----------



## mtcn77

Quote:


> Originally Posted by *Egzi*
> 
> At what res? 2K on a 4K monitor?


You would have lost 1 LOD worth of detail (2xAF), but it would be parsed into the vertex-to-pixel subtleties better. Essentially a shortsighted person's dream come true.


----------



## Egzi

Quote:


> Originally Posted by *mtcn77*
> 
> You would have lost 1 LOD worth of detail(2xAF), but it would be parsed in the vertex to pixel subtleties better. Essentially a shortsighted person's dream come true.


Hehe, I did not get that answer. Please help a noob like me.
So would it look better to scale to 2K on a 4K monitor like this, or better on a native 2K monitor, when gaming?

Since I would not play newer games in 4K, I would not want to spend so much money on new hardware.


----------



## mtcn77

Quote:


> Originally Posted by *Egzi*
> 
> Hehe, I did not get that answer. Please help a noob like me.
> So would it look better to scale to 2K on a 4K monitor like this, or better on a native 2K monitor, when gaming?
> 
> Since I would not play newer games in 4K, I would *not want to spend so much money on new hardware.*


Display for free? That is the most expensive component, afaik. You would have lowered your texture fidelity while on the other hand improving the geometric interpolation of the image by the same amount. 2K@4K is 1/2AF (a.k.a. -1 LOD) & 2xAA.


----------



## Egzi

Quote:


> Originally Posted by *mtcn77*
> 
> Display for free? That is the most expensive component, afaik. You would have lowered your texture fidelity while on the other hand improving the geometric interpolation of the image by the same amount. 2K@4K is 1/2AF (a.k.a. -1 LOD) & 2xAA.


So what do you think? Should I go for 2K@4K?


----------



## mtcn77

Quote:


> Originally Posted by *Egzi*
> 
> So what do you think? Should I go for 2K@4K?


I'd go 8K, since 4xAA cancels out helper-pixel undersampling (the pervading artifact of unfiltered 3D, the geometry staircase effect). It is literally a dedicated antialiasing solution free of the performance penalty of 2xAA, if you take lower textures kindly and can tally the cost of the premium. Also, it might have less interface lag.


----------



## mtcn77

Technically, upscaling is difficult. You will fall below the default resolution in texture quality. If you are using AMD at the settings TFQ=P, Surface Format Optimisation=On, 2xAF (not through the driver), I think you can run neutral anisotropic filtering (0xAF) with a constant -0.65 LOD defect when 2K@4K is upscaled. Apart from the LOD bias ("+0.65", as it is actually counted) you'll have pretty good quality for the performance you experience it with.


----------



## ssateneth

give me 30"-32", 1440p or 1600p, 120hz+, non-TN and I'm happy. 27" is too small. I'm used to dell 30" 2560x1600 but also 'need' 120hz+ too.


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> It helps with dynamic contrast (that is the measurement with FALD on) but when FALD is off then you're back to a regular IPS. Are you going to buy one? I'd love to see what you think of it.


As far as I know FALD is on all the time on this display. Unless you read somewhere it's HDR only? I'll definitely be buying one. Gonna be a long wait though. May have to buy that Dell 8K to hold me over!


----------



## PCM2

Quote:


> Originally Posted by *CallsignVega*
> 
> As far as I know FALD is on all the time on this display. Unless you read somewhere it's HDR only? I'll definitely be buying one. Gonna be a long wait though. May have to buy that Dell 8K to hold me over!


I hope the dimming zones work all the time, but the way it was being sold as a low-latency feature and integral part of "G-SYNC HDR" makes that unclear.


----------



## Egzi

Quote:


> Originally Posted by *mtcn77*
> 
> Technically, upscaling is difficult. You will fall below the default resolution in texture quality. If you are using AMD at the settings TFQ=P, Surface Format Optimisation=On, 2xAF (not through the driver), I think you can run neutral anisotropic filtering (0xAF) with a constant -0.65 LOD defect when 2K@4K is upscaled. Apart from the LOD bias ("+0.65", as it is actually counted) you'll have pretty good quality for the performance you experience it with.


Think I will pass on the 4k.
Quote:


> Originally Posted by *CallsignVega*
> 
> As far as I know FALD is on all the time on this display. Unless you read somewhere it's HDR only? I'll definitely be buying one. Gonna be a long wait though. May have to buy that Dell 8K to hold me over!


What do you think the chances are for a Oled Monitor with those specs coming out?


----------



## rvectors

Pish posh to all the naysayers: with this 'amazing' pace of technological development in the PC monitor market, ASUS is warranted in charging us a mountainous premium!

It's ironic that for years we've had to put up with substandard LCDs, and now that one might include what you would expect (refresh rate aside) to make the tech work, we will have to fork out pro-level prices. The $1,199 might've been pulled because Acer had telegraphed a price, or to wait until it does, but let's get this straight: $2,000 is a laughable figure for a standard consumer-level monitor.

To add some perspective: in 2014, when 4K was still rare, Dell announced the 5K UP2715K at $2,499, and on actual release it was $2,000. I suspect what will happen is that Acer and ASUS will count on low competition and mutually (in a legal sense) inflate prices.


----------



## ToTheSun!

Quote:


> Originally Posted by *rvectors*
> 
> To add some perspective: in 2014, when 4K was still rare, Dell announced the 5K UP2715K at $2,499, and on actual release it was $2,000.


And just a year later, it could be bought for $1000 in certain stores.


----------



## Nautilus

Damn. Even Titan X Pascal SLI can't keep up with this.


----------



## st0necold

yes it can.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> I actually thought $1,200 seemed a bit low. $2,000 seems a bit high, I was thinking more around $1700.


lets just say $1,499, throw in a handy from the local tug'n rub and call it a day


----------



## boredgunner

Since it's scheduled for Q3, hopefully competition is announced before then. If no VA counterpart has been announced by then, I might buy one, since it'll still be better than what I currently have.

I hate how ULMB is never mentioned in these press releases (even for monitors that do have it). I wonder if this one has it? Also, 4K @ 144Hz means it is using DisplayPort DSC. I wonder how that looks.
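Rough back-of-envelope arithmetic on why the link can't carry this uncompressed (assuming 10 bpc RGB for HDR, and ignoring blanking overhead, which only makes it worse):

```python
# Back-of-envelope bandwidth check for 4K 144Hz HDR over DP 1.4.
# Assumes 10 bits per channel RGB and ignores blanking intervals, so the
# real requirement is somewhat higher than this.
width, height, hz = 3840, 2160, 144
bits_per_pixel = 10 * 3                                   # 10 bpc x RGB
signal_gbps = width * height * hz * bits_per_pixel / 1e9  # ~35.8 Gbps
dp14_payload_gbps = 32.4 * 8 / 10                         # HBR3 after 8b/10b coding
print(f"uncompressed ≈ {signal_gbps:.1f} Gbps, DP 1.4 carries ≈ {dp14_payload_gbps:.2f} Gbps")
# The stream doesn't fit, hence compression or chroma subsampling.
```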


----------



## i7monkey

It's so sad that it's taken this long for display tech to progress. Even sadder that you need to invest $2k+ in soon-to-be crippled and obsolete Nvidia GPUs just to be able to run it. The saddest is having slim pickings in a monopolistic GPU market where the junk is overpriced and AMD can't compete.

There's a giant barrier-to-entry with this monitor and being able to game on it. It's a headache I want no part of until the prices in this stupid industry become reasonable.


----------



## jezzer

Quote:


> Originally Posted by *i7monkey*
> 
> It's so sad that it's taken this long for display tech to progress. Even sadder that you need to invest $2k+ in soon-to-be crippled and obsolete Nvidia GPUs just to be able to run it. The saddest is having slim pickings in a monopolistic GPU market where the junk is overpriced and AMD can't compete.
> 
> There's a giant barrier-to-entry with this monitor and being able to game on it. It's a headache I want no part of until the prices in this stupid industry become reasonable.


The saddest part is that they are reinventing the wheel and putting a premium price on it. It's not that they had to invent 144Hz, G-Sync, 4K and HDR; they just put it all together, because in the past there was no GPU power to really drive it. Now there is, so they bundle the tech and go: look at this new tech shizzle.

Is it overpriced? Yes. Is it worth it? No. Will people buy it? Yes.


----------



## CallsignVega

Well, remember that to put all that together you need the hub that controls everything: the DP 1.4 G-Sync chip, which will only come out later this year. Some significant cost went into developing that silicon.


----------



## boredgunner

Quote:


> Originally Posted by *i7monkey*
> 
> It's so sad that it's taken this long for display tech to progress. Even sadder that you need to invest $2k+ in soon-to-be crippled and obsolete Nvidia GPUs just to be able to run it. The saddest is having slim pickings in a monopolistic GPU market where the junk is overpriced and AMD can't compete.
> 
> There's a giant barrier-to-entry with this monitor and being able to game on it. It's a headache I want no part of until the prices in this stupid industry become reasonable.


There will be FreeSync equivalents, and there will be Vega before this monitor releases. A GTX 1080 Ti will be able to run 4k fairly well as long as you lower a few not-so-important settings. Then of course there will be next gen GPUs, and hopefully even better 4k HDR monitors by then.


----------



## RobotDevil666

This is exactly what I wanted... a year ago. Now, after using a 34-inch ultrawide, I can't come back to 27" even if it's in glorious 4K 144Hz. Come on ASUS, 32 or 40 inch please!!


----------



## Avant Garde

So this year's CES 2017 star of PC monitors is this Asus model??? No other new monitors? Well, I guess PC monitor market will continue to be depressing...


----------



## boredgunner

Quote:


> Originally Posted by *Avant Garde*
> 
> So this year's CES 2017 star of PC monitors is this Asus model??? No other new monitors? Well, I guess PC monitor market will continue to be depressing...


On the one hand it is 4k high refresh rate, HDR, variable refresh rate, and FALD. On the other hand it is still LCD, and IPS at that. An improvement, but not the biggest improvement we could've realistically gotten.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> On the one hand it is 4k high refresh rate, HDR, variable refresh rate, and FALD. On the other hand it is still LCD, and IPS at that. An improvement, but not the biggest improvement we could've realistically gotten.


They do have an HDR QLED screen, but it's not a gaming monitor.


----------



## wizardbro

Quote:


> Originally Posted by *boredgunner*
> 
> On the one hand it is 4k high refresh rate, HDR, variable refresh rate, and FALD. On the other hand it is still LCD, and IPS at that. An improvement, but not the biggest improvement we could've realistically gotten.


If only this were VA with a glossy screen coating...


----------



## mtcn77

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They do have a HDR *QLED* screen but its not a gaming monitor.


Oh, I suspected I was seeing hallucinations when I read it as 'OLED' and thought: enough internet for one day!


----------



## ZealotKi11er

Quote:


> Originally Posted by *mtcn77*
> 
> Oh, I was suspecting of seeing hallucinations when I read it as 'OLED' and thought enough internet for a day!


That is the idea lol. They rebranded Quantum Dot LED to QLED to make people think it is some kind of OLED technology.


----------



## mtcn77

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That is the idea lol. They rebranded Quantum Dot LED to QLED to make people think it is some kind of OLED technology.


Not actually so! _Though, it sounds serendipitous,_ so I will roll with it!







Quote:


> The company also confirmed that the QD materials on the QLED TV will not be self-emitting light, as some speculated.


[The truth strikes back!]


----------



## DADDYDC650

I want one of those new OLED's that can do 800 nits please!


----------



## mtcn77

Quote:


> Originally Posted by *DADDYDC650*
> 
> I want one of those new *OLED*'s that can do *800 nits* please!


Can you spell: "I-M-A-G-E R-E-T-E-N-T-I-O-N"?


----------



## LunaTiC123

Wow, hurray for Samsung marketing I guess? Wasn't QLED supposed to be the better alternative to OLED that's also self-emissive? Something tells me we're gonna be stuck with LCDs for a loooong time sadly; they always find something new to keep LCDs alive.


----------



## mtcn77

Quote:


> Originally Posted by *LunaTiC123*
> 
> wow, hurray for samsung marketing I guess? wasn't QLED supposed to be the better alternative for OLED that's also self emissive ? something tells me we're gonna be stuck with LCD's for a loooong time sadly, they always find something new to keep LCD's alive


Not as long as the 'salt' retention period.
Quote:


> OLED's main problem has been high costs and long playing time resulting in images burnt in on to the screen.


[Source]


----------



## ahnafakeef

Quote:


> Originally Posted by *Avant Garde*
> 
> So this year's CES 2017 star of PC monitors is this Asus model??? No other new monitors? Well, I guess PC monitor market will continue to be depressing...


I get that there might not have been enough (or any) monitors that appeal to all of us, but I certainly wouldn't call it depressing. ASUS has its 4K/144Hz panel ready, and Dell has its 8K/60Hz panel ready. And although these are the only ones with these specs right now, we can be almost certain that competing manufacturers will deliver their own iterations within the year, hopefully at lower prices, which will force down the prices of the ASUS and the Dell as well.

Personally, I'm very much looking forward to owning an 8K screen. Hopefully we'll get them at much lower prices by the end of the year as there are more models announced/released in the coming months.

So yeah, exciting times ahead. At least for me.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DADDYDC650*
> 
> I want one of those new OLED's that can do 800 nits please!


Nothing can do 800 nits unless it's a smartphone. There is not enough power.


----------



## boredgunner

Quote:


> Originally Posted by *LunaTiC123*
> 
> wow, hurray for samsung marketing I guess? wasn't QLED supposed to be the better alternative to OLED that's also self emissive ? something tells me we're gonna be stuck with LCD's for a loooong time sadly, they always find something new to keep LCD's alive


Yup, and the marketing works. I've had people tell me their TV isn't LCD, it is LED.









What's holding OLED back right now is cost, because LG is the only big company pursuing the tech. I've read that Samsung was set to resume OLED TV production in 2018, not sure how much truth was behind that though. As long as LG is the only one pushing for it then it will sadly remain niche.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> Yup, and the marketing works. I've had people tell me their TV isn't LCD, it is LED.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's holding OLED back right now is cost, because LG is the only big company pursuing the tech. I've read that Samsung was set to resume OLED TV production in 2018, not sure how much truth was behind that though. As long as LG is the only one pushing for it then it will sadly remain niche.


Any reason why Samsung is not pushing OLED? I would not mind a QLED screen, since that is still infinitely better than what we currently have on PC, but I feel like OLED is the future and we are just prolonging it with LCD tech.


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> Yup, and the marketing works. I've had people tell me their TV isn't LCD, it is LED.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's holding OLED back right now is cost, because LG is the only big company pursuing the tech. I've read that Samsung was set to resume OLED TV production in 2018, not sure how much truth was behind that though. As long as LG is the only one pushing for it then it will sadly remain niche.


Wait, Samsung has the power to make OLED mainstream and yet they have been reserving it for themselves for no reason at all?


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any reason why Samsung is not pushing OLED. I would not mind a QLED screen since that is still infinitely better than what we have currently in PC but I feel like OLED is the future and we are just prolonging it with LCD tech.


Quote:


> Originally Posted by *mtcn77*
> 
> Wait, Samsung has the power to make OLED mainstream and yet, they have been hiding it to themselves for no reason at all?


They were developing OLED TVs years ago and released some prototypes, then stopped (and they're not alone in that). We can only guess their reasons, but the obvious ones are cost and yields. I suppose they've decided that it hasn't been worth investing in the R&D required to produce a lineup like LG's. Remember LG's new $9 billion factory for OLED production? It is no small decision.


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> They were developing OLED TVs years ago and released some prototypes, then stopped (and they're not alone in that). We can only guess their reasons but the obvious are cost and yields. I suppose they've decided that it hasn't been worth investing in the R&D required to produce a lineup like LG. Remember LG's new $9 billion factory for OLED production? It is no small decision.


They are playing tag-team "brother-in-law"! They wouldn't be able to sell enough if each didn't hold a distinction. I'm sure they exchange ideas freely over dinner.


----------



## DADDYDC650

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nothing can do 800 nits unless it's a smartphone. There is not enough power.


Not according to reports coming out of CES. Newest OLED panels can do 800 nits.
https://www.channelnews.com.au/ces-2017-panasonic-announces-new-oled-tv-and-4k-blu-ray-players/


----------



## boredgunner

Quote:


> Originally Posted by *DADDYDC650*
> 
> Not according to reports coming out of CES. Newest OLED panels can do 800 nits.
> https://www.channelnews.com.au/ces-2017-panasonic-announces-new-oled-tv-and-4k-blu-ray-players/


Must be peak brightness for a small window. Current LG models (C6 and E6) can do 650 cd/m2. Plenty for HDR use although those brightness levels won't look optimal due to lack of color volume.


----------



## spinFX

Quote:


> Originally Posted by *Seyumi*
> 
> Would have bought this if it wasn't 27". That's just too small for 4k. Those 34" Ultrawides with lower resolutions are more appealing than this. Don't think I can go from 40" to 27" even if I jump from 60hz to 144hz. If this was at least 32" I would have bought on day 1.


Yep. 24" = 1080p perfection, 27" = 1440p perfection, 32+" = 4K perfection








I will move to 32" 4K@144Hz (*drools*) when I can buy *one* for less than what it cost to set up my current *24" 1080p (x3) triple* monitor setup.








Might get a 1440p 144Hz 27" to bridge the gap, otherwise it'll be about 4 years without purchasing a new monitor; can't have that.


----------



## DADDYDC650

Quote:


> Originally Posted by *boredgunner*
> 
> Must be peak brightness for a small window. Current LG models (C6 and E6) can do 650 cd/m2. Plenty for HDR use although those brightness levels won't look optimal due to lack of color volume.


Peak brightness is supposed to be 1000+. http://www.flatpanelshd.com/news.php?subaction=showfull&id=1482983106


----------



## ToTheSun!

Quote:


> Originally Posted by *DADDYDC650*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boredgunner*
> 
> Must be peak brightness for a small window. Current LG models (C6 and E6) can do 650 cd/m2. Plenty for HDR use although those brightness levels won't look optimal due to lack of color volume.
> 
> 
> 
> Peak brightness is supposed to be 1000+. http://www.flatpanelshd.com/news.php?subaction=showfull&id=1482983106
Click to expand...

Open a mostly white browser page and watch that baby tank!


----------



## DunePilot

Quote:


> Originally Posted by *spinFX*
> 
> Yep. 24" = 1080p perfection, 27" = 1440p perfection, 32+" = 4K perfection
> 
> 
> 
> 
> 
> 
> 
> 
> I will move to 32" 4K@144Hz (*drools*) when I can buy *one* for less than what it cost to set up my current *24" 1080p (x3) triple* monitor setup.
> 
> 
> 
> 
> 
> 
> 
> 
> Might get a 1440p 144Hz 27" to bridge the gap, otherwise it'll be about 4 years without purchasing a new monitor; can't have that.


Also, I'm still running a 1-over-3 Asus VG248QE setup. I will eventually swap out the center monitor for 4K 144Hz, but only in the 32-36" range, so I'll probably be waiting for a while. The single 980 Ti and 6800K handle what I have great for now; that monitor comes along with a new build in probably 2-3 years. Awesome to see where monitor tech is headed though, can't wait to see what we have by then.


----------



## oxidized

Quote:


> Originally Posted by *spinFX*
> 
> Yep. 24" = 1080p perfection, 27" = 1440p perfection, 32+" = 4K perfection
> 
> 
> 
> 
> 
> 
> 
> 
> I will move to 32" 4K@144Hz (*drools*) when I can buy *one* for less than what it cost to set up my current *24" 1080p@144Hz (x3) triple* monitor setup.
> 
> 
> 
> 
> 
> 
> 
> 
> Might get a 1440p 144Hz 27" to bridge the gap; otherwise it'll be about 4 years without purchasing a new monitor, and I can't have that


What about PPI? WQHD on a 24" is much better; FHD on a 24" is pretty meh. It's already meh on a 23", so I can't imagine it on a 24". I'd go UHD on a 27" if good ones were cheap.


----------



## DADDYDC650

Quote:


> Originally Posted by *ToTheSun!*
> 
> Open a mostly white browser page and watch that baby tank!


Even my KS8000 can't hold 1000 nits.


----------



## mtcn77

Quote:


> Originally Posted by *DADDYDC650*
> 
> Even my KS8000 can't hold 1000 nits.


OLED is pathetically predictable.


----------



## DADDYDC650

Quote:


> Originally Posted by *mtcn77*
> 
> OLED is pathetically predictable.


You mean OLED is amazingly beautiful.


----------



## mtcn77

Quote:


> Originally Posted by *DADDYDC650*
> 
> You mean OLED is amazingly beautiful.


-_"Impressive!"_


----------



## rvectors

http://4k.com/news/nvidia-and-au-optronics-team-up-for-4k-uhd-display-at-144hz-in-asus-and-acer-monitors-18171/

They've posted the old price of $1200 but they put the Acer at between $950-1100.


----------



## -terabyte-

Quote:


> Originally Posted by *rvectors*
> 
> http://4k.com/news/nvidia-and-au-optronics-team-up-for-4k-uhd-display-at-144hz-in-asus-and-acer-monitors-18171/
> 
> They've posted the old price of $1200 but they put the Acer at between $950-1100.


As usual, Asus charges more because of the "ROG tax". It was the same for the PG348Q and X34.


----------



## CallsignVega

I was just switching back and forth between a LCD and my LG OLED playing BF1 a lot yesterday. Of course the OLED just destroys any LCD for picture quality and immersion. As impressive as the specs are on this new Asus, I simply don't think I could drop all the way down to 27". If this display was 30-32" I would have most likely taken the plunge when it releases.


----------



## mtcn77

Quote:


> Originally Posted by *CallsignVega*
> 
> I was just switching back and forth between a LCD and my LG OLED playing BF1 a lot yesterday. Of course the OLED just destroys any LCD for picture quality and immersion. As impressive as the specs are on this new Asus, I simply don't think I could drop all the way down to 27". If this display was 30-32" I would have most likely taken the plunge when it releases.


Come on, even to a guy like me, that is _'cold'!_ 163 PPI; the Dell 8K even comes in at 280 PPI...
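For reference, both density figures follow from the standard diagonal-PPI formula. A quick sketch (the 31.5-inch size used for the Dell 8K below is an assumption based on the UP3218K's panel):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))    # ~163 PPI for a 27" 4K panel
print(round(ppi(7680, 4320, 31.5)))  # ~280 PPI, assuming a 31.5" 8K panel
```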


----------



## {core2duo}werd

Quote:


> Originally Posted by *boredgunner*
> 
> Yup, and the marketing works. I've had people tell me their TV isn't LCD, it is LED.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's holding OLED back right now is cost, because LG is the only big company pursuing the tech. I've read that Samsung was set to resume OLED TV production in 2018, not sure how much truth was behind that though. As long as LG is the only one pushing for it then it will sadly remain niche.


http://www.trustedreviews.com/opinions/what-is-qled-the-future-of-tv-tech-explained

Electroluminescent quantum dots are what make true QLED, from what that article says. What Samsung is calling its QLED TVs this year at CES aren't really true QLED, because they still use photoluminescent quantum dots in front of an LED backlight rather than electroluminescent ones.

I'm loving all the new tech coming to displays recently.


----------



## Iching

Quote:


> Originally Posted by *Yvese*
> 
> All I gotta say is if you pay anywhere near $1k for this you're crazy..
> 
> I bought my 65KS8000 for $1079. To pay near that price for a 27" monitor is madness. $600 is where I would draw the line, and even then that would be for 32" not 27.


Quote:


> Originally Posted by *-terabyte-*
> 
> As usual Asus charges more because of "ROG tax". It was the same for PG348Q and X34.


Just because you're driving a Fiat it doesn't mean others can't afford a Benz.


----------



## boredgunner

Quote:


> Originally Posted by *Iching*
> 
> Just because you're driving a Fiat it doesn't mean others can't afford a Benz.


terabyte has a point. With ASUS you're usually paying more for no gamma control and no benefits.


----------



## bazh

Quote:


> Originally Posted by *{core2duo}werd*
> 
> http://www.trustedreviews.com/opinions/what-is-qled-the-future-of-tv-tech-explained
> 
> Electroluminescent Quantum Dots are what make true QLED from what that article says. What samsung is calling their QLED tv's this year at CES aren't really true QLED because they still use electroluminescent quantum dots.
> 
> I'm loving all the new tech coming to displays recently.


So the article spends time explaining what "QLED" is (worth noting that it's a term marketed by Samsung), then comes to the conclusion that Samsung's QLED offerings aren't really what they describe above.









Basically they just described a technology that Sony brought to market 4 years ago under the name Triluminos TV, and Samsung somehow still manages to market itself as a pioneer and get a whole new article written about it.

http://www.theverge.com/2013/1/16/3881546/sonys-new-triluminous-tvs-pursue-vibrant-hues-with-quantum-dots


----------



## wizardbro

Wait, acer variants get more OSD options?


----------



## mtcn77

Quote:


> Originally Posted by *bazh*
> 
> So the article spent time to explain what "QLED" are, which worth noticing that it's a term marketed by Samsung, then go to the conclusion that Samsung's QLED offers are not really what they describe above.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Basically they just described a technology that Sony brought to the market 4 years ago under the name Triluminos TV and somehow still manage to market it as a pioneer and write a whole new article about it.
> 
> http://www.theverge.com/2013/1/16/3881546/sonys-new-triluminous-tvs-pursue-vibrant-hues-with-quantum-dots


Microsoft introduced the 'Tablet' in 2000, Oracle had the Network Computer in 1995, and Apple did it _in 2010._ And I'm using the Google & Asus partnership's product from 2013. Your point?


----------



## boredgunner

Quote:


> Originally Posted by *wizardbro*
> 
> Wait, acer variants get more OSD options?


ASUS monitors (yes, the higher end ROG Swifts except maybe the PG279Q) usually lack gamma control in the OSD while Acer always has it.


----------



## rvectors

It seems the price is closer to the $2000 mark

http://www.kitguru.net/peripherals/monitors/matthew-wilson/two-4k144hz-monitors-coming-q2-2017-first-will-cost-2000/

Pretty outrageous; it's a 27-inch monitor and, as far as I'm aware, doesn't come with the same colour-space coverage you'd expect from pro-level monitors. Maybe Nvidia is asking a huge premium for their new controller.

Is it my imagination or my crappy monitor (although the one on the left doesn't exhibit it), but at the 1-minute mark...





 (btw he mentions $1500 price point)

...the new panel is showing a faint but clear white band on each edge where there is high contrast, and actually even around the transition between dark and bright objects. Looks like that halo effect already mentioned here?

This year I couldn't find many videos reporting live from CES, or at least regarding the monitors I was interested in. Seemed like nobody bothered to turn up. There was one Swedish video, the one above and now the official one from ASUS






I may have missed it, but I didn't see an announcement for a 2017 2K version of the Swift. If so, is this to avoid cannibalising sales of the 4K?


----------



## Seyumi

In regard to HDR. Someone correct me if I'm wrong. Here's all the TV marketing hypes I can remember:

1. HDMI
2. HD
3. 3D
4. 1 BILLION hertz
5. 10 BILLION contrast ratio
6. Smart TV
7. Thin Bezel
8. 4K
9. Curved
10. HDR

11. Probably VR next

I think the only 2 things that weren't just marketing hype were HD & 4K, and even that is questionable since many of these types of TVs launched several years before devices, providers, consoles, etc. caught up to those resolutions. HDR is now the next hyped gimmick. I think it's already been proven that there are no true HDR TVs or monitors right now. They all lack something, but they're allowed to slap the "HDR" sticker on anyway since it's just "compatible."


----------



## iARDAs

Yep the price is $2000

tftcentral just tweeted this.

So this is no longer an option for me.

Sorry, but gaming monitors are extremely overpriced no matter how awesome they are. This is getting ridiculous.


----------



## rvectors

Quote:


> Originally Posted by *iARDAs*
> 
> Yep the price is $2000
> 
> tftcentral just tweeted this.
> 
> So this is no longer an option for me.
> 
> Sorry, but gaming monitors are extremely overpriced no matter how awesome they are. This is getting ridiculous.


Ah wait, that's a recommended retail price. These are always hugely inflated; it could be closer to $1600 instead.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Seyumi*
> 
> In regard to HDR. Someone correct me if I'm wrong. Here's all the TV marketing hypes I can remember:
> 
> 1. HDMI
> 2. HD
> 3. 3D
> 4. 1 BILLION hertz
> 5. 10 BILLION contrast ratio
> 6. Smart TV
> 7. Thin Bezel
> 8. 4K
> 9. Curved
> 10. HDR
> 
> 11. Probably VR next
> 
> I think the only 2 things that weren't just marketing hype was HD & 4K and even that is questionable since many of these types of TV's launched several years before devices, providers, consoles, ect. caught up to those resolutions. HDR is now the next hyped gimmick. I think it's already been proven that there are no true HDR TV's or monitors right now. They all lack something but they're allowed to slap the "HDR" sticker on it anyway since it's just "compatible."


On the subject of "useful hype" terminology, hopefully HDMI 2.1 actually pulls through and gets implemented as broadly as possible because of the Variable Refresh Rate support.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

It's hard to say how this actually affects display manufacturers, but if "VRR" eventually means lag-free V-sync on any gaming system hooked up to a display with that sticker, then that's going to be a very important buzzword.
They also seem to support everything up to 10K at 120Hz; it sounds like the HDMI consortium actually decided to do some serious future-proofing for once.


----------



## iARDAs

Quote:


> Originally Posted by *rvectors*
> 
> Ah wait that's a recommended retail price. These are always hugely over inflated, could be closer to $1600 instead


I was honestly hoping for $1200 max where monitors such as the Acer X34 or the Asus PG348Q would drop to the 700-800$ mark


----------



## rvectors

Quote:


> Originally Posted by *iARDAs*
> 
> I was honestly hoping for $1200 max where monitors such as the Acer X34 or the Asus PG348Q would drop to the 700-800$ mark


It could still be nearer that price; I wouldn't be surprised if this is simply playing games with Acer. It could be that ASUS was not aware that Acer had a monitor ready to be launched with essentially the same spec, so close to their own.

Also, if there isn't a 2017 Swift 2K version, they have a gap in their model range, with the leap up to $2000 from the current offerings being very high. In marketing and sales terms, that doesn't square up.


----------



## iARDAs

Quote:


> Originally Posted by *rvectors*
> 
> It could still be nearer that price, I wouldn't be surprised that this is simply playing games with ACER. It could be that ASUS was not aware that ACER had a monitor ready to be launched, with essentially the same spec, so close their one.


Do we know the price of the Acer one?


----------



## rvectors

Quote:


> Originally Posted by *iARDAs*
> 
> Do we know the price of the Acer one?


I've found no details, so that's another reason to move the price around. It's not good to let a competitor easily see where to undercut.


----------



## DADDYDC650

I'll wait until a 32" or bigger is released and going for $1000 or less. No rush. Tech will be even better by then and there will be more competition.


----------



## mmms

I need this monitor for gaming only, not programming or video editing.
Is 27" enough for 4K gaming with good quality on Nvidia GPUs going forward, or should we wait until a 30" or bigger model with these specifications (144Hz, IPS, G-Sync) is released in 2018 or 2019?

thanks in advance


----------



## Dhoulmagus

Quote:


> Originally Posted by *mmms*
> 
> I need this monitor for gaming only, not programming or video editing.
> Is 27" enough for 4K gaming with good quality on Nvidia GPUs going forward, or should we wait until a 30" or bigger model with these specifications (144Hz, IPS, G-Sync) is released in 2018 or 2019?
> 
> thanks in advance


Depends on what you want. If you're sitting up straight in an ergonomic chair at a desk with your monitor only a couple feet away, 27" is great. 32" is getting into the range where you'll start pushing the monitor toward the back of your desk, and 40" is when you'll put your feet up and recline.









Why not make cardboard cutouts of your prospective screen sizes and place them over your current screen to see what size suits you?

The generic response to this question is that 4K is useless at 27" and you want 32" minimum. I disagree; I'd like to play 4K @ 144Hz at 27" when the graphics power is there. There are some subtle differences, but for me it's really the desktop real estate and the immunity to jagged edges on distant objects in games.


----------



## iARDAs




----------



## boredgunner

Quote:


> Originally Posted by *Seyumi*
> 
> In regard to HDR. Someone correct me if I'm wrong. Here's all the TV marketing hypes I can remember:
> 
> 1. HDMI
> 2. HD
> 3. 3D
> 4. 1 BILLION hertz
> 5. 10 BILLION contrast ratio
> 6. Smart TV
> 7. Thin Bezel
> 8. 4K
> 9. Curved
> 10. HDR
> 
> 11. Probably VR next
> 
> I think the only 2 things that weren't just marketing hype was HD & 4K and even that is questionable since many of these types of TV's launched several years before devices, providers, consoles, ect. caught up to those resolutions. HDR is now the next hyped gimmick. I think it's already been proven that there are no true HDR TV's or monitors right now. They all lack something but they're allowed to slap the "HDR" sticker on it anyway since it's just "compatible."


VR? No, those are headsets.

Everything there isn't necessarily marketing hype, but it has all at some point been used as marketing hype. It is true that HDR isn't being used to its full potential yet: there isn't enough content, displays aren't quite good enough (and HDR is too young), and they need newer interfaces like DisplayPort 1.4 and HDMI 2.1 just to deliver 4K 60 Hz with 10-bit color and HDR. Lots of HDR content is also created only in the sRGB color space, which isn't what either HDR-10 or Dolby Vision calls for.
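The interface point above can be sanity-checked with a quick back-of-the-envelope calculation. A sketch only: it ignores blanking overhead, and the ~14.4 and ~25.9 Gbit/s effective-payload figures for HDMI 2.0 and DisplayPort 1.4 are approximations:

```python
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw (uncompressed) pixel data rate in Gbit/s, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K 60 Hz at 10-bit color (30 bits/pixel): already past HDMI 2.0's ~14.4 Gbit/s payload
rate_4k60 = data_rate_gbps(3840, 2160, 60, 30)    # ~14.9 Gbit/s
# 4K 144 Hz at 10-bit color: beyond even DisplayPort 1.4's ~25.9 Gbit/s payload
rate_4k144 = data_rate_gbps(3840, 2160, 144, 30)  # ~35.8 Gbit/s
```

By these numbers, 4K 60 Hz 10-bit already saturates HDMI 2.0, and 4K 144 Hz 10-bit exceeds even DP 1.4, which is presumably why early 4K 144Hz panels fall back to chroma subsampling at the highest refresh rates.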


----------



## Robilar

Quote:


> Originally Posted by *Serious_Don*
> 
> Depends on what you want.. If you're sitting up straight in an ergonomic chair at a desk with your monitor only a couple feet away, 27" is great. 32" is getting into the range where you'll start pushing the monitor towards the back of your desk, 40" is when you'll put your feet up and recline
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why not make cardboard cutouts of your prospective screen sizes and place them over your current screen to see what size suits you.
> 
> The generic response to this question is that 4K is useless at 27" and you want 32" minimum. I disagree; I'd like to play 4K @ 144Hz at 27" when the graphics power is there. There are some subtle differences, but for me it's really the desktop real estate and the immunity to jagged edges on distant objects in games.


The exception to your above comments is a defined curvature in the monitor.







An aggressively curved 35" is a lot more fun that a standard flat 27". I went from the ROG Swift to the G35 and no way am I going back.


----------



## boredgunner

Quote:


> Originally Posted by *Serious_Don*
> 
> The generic response to this question is that 4K is useless at 27" and you want 32" minimum. I disagree.


Yeah, that is a clueless response people love to dish out for some reason.


----------



## mmms

Quote:


> Originally Posted by *Serious_Don*
> 
> Depends on what you want.. If you're sitting up straight in an ergonomic chair at a desk with your monitor only a couple feet away, 27" is great. 32" is getting into the range where you'll start pushing the monitor towards the back of your desk, 40" is when you'll put your feet up and recline
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why not make cardboard cutouts of your prospective screen sizes and place them over your current screen to see what size suits you.
> 
> The generic response to this is question is that 4k is useless at 27" and you want 32 minimum.. I disagree, I'd like to play 4k @ 144hz 27" when the graphics power is there. There is some subtle differences, but for me it's really the desktop real estate and immunity to jagged edges in distant objects in games.


Quote:


> Originally Posted by *Robilar*
> 
> The exception to your above comments is a defined curvature in the monitor.
> 
> 
> 
> 
> 
> 
> 
> An aggressively curved 35" is a lot more fun that a standard flat 27". I went from the ROG Swift to the G35 and no way am I going back.


Quote:


> Originally Posted by *boredgunner*
> 
> Yeah, that is a clueless response people love to dish out for some reason.


Can we see the difference between 4K at 27" and 1440p at 27" for gaming?
I don't care about a big size; I care about good quality and actually seeing the difference. Do I need to upgrade to this monitor for gaming (not programming) down the road?


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Can we see the difference between 4K at 27" and 1440p at 27" for gaming?
> I don't care about a big size; I care about good quality and actually seeing the difference. Do I need to upgrade to this monitor for gaming (not programming) down the road?


As if gaming is some special task that modifies the fundamental behavior of a monitor? But yes, 4K will look noticeably crisper and clearer than 1440p on a 27", and aliasing will be reduced considerably.


----------



## chrisnyc75

Quote:


> Originally Posted by *Seyumi*
> 
> In regard to HDR. Someone correct me if I'm wrong. Here's all the TV marketing hypes I can remember:
> 
> 1. HDMI
> 2. HD
> 3. 3D
> 4. 1 BILLION hertz
> 5. 10 BILLION contrast ratio
> 6. Smart TV
> 7. Thin Bezel
> 8. 4K
> 9. Curved
> 10. HDR
> 
> 11. Probably VR next
> 
> I think the only 2 things that weren't just marketing hype was HD & 4K and even that is questionable since many of these types of TV's launched several years before devices, providers, consoles, ect. caught up to those resolutions. HDR is now the next hyped gimmick. I think it's already been proven that there are no true HDR TV's or monitors right now. They all lack something but they're allowed to slap the "HDR" sticker on it anyway since it's just "compatible."


3D wasn't just marketing hype; the technology was/is there, and it is a definite upgrade. The public just never got on board. And in this age where Netflix is about as popular as any TV network, the "Smart TV" Netflix app comes in awfully handy.

HDMI, curved, dynamic contrast, etc I'll give you.

I'm holding off on judging HDR/Dolby Vision until I see it for myself. I'm hoping it's as big as they say it is, but they are prone to exaggeration.


----------



## EniGma1987

Quote:


> Originally Posted by *chrisnyc75*
> 
> I'm holding off on judging HDR/Dolby Vision until I see it for myself. I'm hoping it's as big as they say it is, but they are prone to exaggeration.


If you go look at them at a Costco, most implementations don't look that impressive. Samsung HDR TVs are majorly oversaturated and look like crap. The LG ones look really nice and definitely have an impressive look compared to other models, but only in the top-end HDR sets. The basic HDR is "meh".


----------



## boredgunner

Quote:


> Originally Posted by *chrisnyc75*
> 
> 3D wasn't just marketing hype, the technology was/is there and it is a definite upgrade. The public just never got onboard. And in this age where Netflix is about as popular as any TV network, the "Smart TV" Netflix app comes in awfully handy.
> 
> HDMI, curved, dynamic contrast, etc I'll give you.
> 
> I'm holding off on judging HDR/Dolby Vision until I see it for myself. I'm hoping it's as big as they say it is, but they are prone to exaggeration.


To take advantage of HDR on an LCD screen, you really need FALD and lots of dimming zones, which this monitor and a few TVs have. Or OLED, although their lack of color volume should make really bright content look somewhat washed out.

Curved for a TV is rather worthless, but it is helpful for VA monitors. Dynamic contrast is a worthless claimed specification every time, although it could come in handy if given by a good reviewer (e.g., measuring the zone contrast ratio on this monitor).


----------



## CallsignVega

Quote:


> Originally Posted by *mmms*
> 
> can we see the diffrence between 4k on 27" and 1440p on 27" for gaming ?
> i don't care about big size , i care about good quality and see the diffrence and Do I need to upgrade to this monitor or not especially for gaming not programming in the future ?


My experience with a 27" high-PPI screen was the Dell 5K. Fonts and Windows use did look glorious, but I realized all that PPI was more or less wasted in games. 27" isn't a complete waste with this monitor, but I would have much preferred 32". That would have shown off the 4K better in games, plus more immersion.


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> My experience with a 27" high PPI screen was the Dell 5K. Font's and Windows use it did look glorious, but I realized all that PPI was more or less wasted in games. 27" isn't a complete waste with this monitor, but I would have much preferred 32". That would have shown off the 4K better in games + more immersion.


I think 27" is the perfect size for 4K gaming going forward, and I don't see a noticeable difference between 4K at 27" vs. 32", especially for gaming.
The important thing in this case is 144Hz + G-Sync. That's my opinion.

What's so bad about that? Smaller monitor + higher resolution = finer dot pitch = sharper image. The only reason you'd want a 32" monitor is if you want a really goddamn huge monitor; 27" is pretty big already.
I think at this point the demand for a sharper image is bigger than the demand for an increase in monitor size. It's also the more sensible demand, because, as stated, 27" is pretty big.


----------



## shhek0

Is a 27" 4K screen really small? I mean, in the very near future I am looking to upgrade to 4K (monitor, TV, new PC config), and simply anything above 27" would be huge for me in a monitor. However, is the real issue simply that you have to scale everything? Are apps not scaling well?


----------



## DADDYDC650

Quote:


> Originally Posted by *shhek0*
> 
> Is 27" 4k screen really small? I mean in the very near future I am looking to upgrade to 4k ( monitor, tv, new pc config) and simply anything above 27 would be huge for me(monitor). However is the real issue simply that you have to scale everything? Are app not scaling good?


27" is small for 4K. I'm not 100 percent sure how Windows 10 handles scaling in this situation; I'm sure you'll have to enlarge windows/screens.


----------



## Oubadah

..


----------



## DADDYDC650

Quote:


> Originally Posted by *Oubadah*
> 
> High PPI won't be "wasted in games" until it's high enough to overcome vernier acuity and eliminate aliasing, and that's not going to happen for a very, _very_ long time. That is, of course, in the absence of some miracle full scene antialiasing that has absolutely no tradeoffs (not going to happen).
> 
> At this point in time there is no such thing as "too small for X resolution" in PC monitors, assuming perfect scaling. Windows' horrible scaling is an important consideration, but I hear they're working on that (again).


4K on a 27" monitor isn't a big upgrade from 1440p. That's why most folks want at least 32+ inches when running 4K.


----------



## Dhoulmagus

Quote:


> Originally Posted by *shhek0*
> 
> Is 27" 4k screen really small? I mean in the very near future I am looking to upgrade to 4k ( monitor, tv, new pc config) and simply anything above 27 would be huge for me(monitor). However is the real issue simply that you have to scale everything? Are app not scaling good?


Quote:


> Originally Posted by *DADDYDC650*
> 
> 27" is small for 4K. Not 100 percent sure how WIndows 10 handles scaling in this situation. I'm sure you'll have to enlarge windows/screens.


With Windows 10 you'll probably want to bump it up to 200-250% DPI scaling to have all of your graphical objects (start menus, tabs, etc.) look the way you're used to at 1080p. But it all looks fine. The only gripe I really have is that the stock fonts in Windows are still crap. Ubuntu looks quite nice scaled up for a 4K monitor too.

My experience was on a 28" ASUS 4K monitor. The only "issue" I ever really had was certain older applications' UI not wanting to scale, and obviously older games' UI is too small at 4K, so you'll want to run them at 1920x1080. Those same apps and games looked horrible on 40" 4K screens as well because you sit further back, so honestly the size isn't too relevant in real use unless you're the type that sits with his nose to a 40" panel.

4K does look better at 27". Not really on the desktop, but in games like GTA 5, when you're looking off into the distance, it's totally apparent.


----------



## Oubadah

..


----------



## DADDYDC650

Quote:


> Originally Posted by *Oubadah*
> 
> I personally compared a U2711 and an XB280HK, and yes, it was a significant upgrade. I guess it depends on the quality of one's eyesight.


Is 20/20 vision not good enough? 1440p to 4k isn't that big of a deal on a 27" monitor. A lot of folks would agree.


----------



## Oubadah

..


----------



## Dhoulmagus

Quote:


> Originally Posted by *DADDYDC650*
> 
> Is 20/20 vision not good enough? 1440p to 4k isn't that big of a deal on a 27" monitor. A lot of folks would agree.


I have horrendous vision, require full-frame glasses, and can't wear contacts. I can put a 27" 1440p and a 4K monitor side by side and see the difference; see my prior post though, it's about the use case. I've done the test in a home environment, actually straight across the board with a 27" 1080p 144Hz monitor in the mix. 4K is glorious, 1440p is beautiful, 1080p looks like CRAP in 2017.









In the past the 1440p IPS monitors always looked vastly superior, but with this monitor on the horizon, it's an upgrade hands down.


----------



## DADDYDC650

Quote:


> Originally Posted by *Oubadah*
> 
> 20/20 vision isn't an upper limit. It's *normal* acuity.


LoL! I have great vision. Like I said, the difference isn't night and day. It's there but it's not huge.


----------



## hollowtek

I'm going to wait on a GPU under $300 that can take on this beast. Hopefully the price will be 75-85% of $1,200 by then.


----------



## Dhoulmagus

Quote:


> Originally Posted by *hollowtek*
> 
> I'm going to wait on a GPU under $300 that can take on this beast. Hopefully the price will be 75-85% of $1,200 by then.


I feel the same but at this rate I'll be retired before that happens. Mid life crisis purchases incoming!


----------



## DADDYDC650

Quote:


> Originally Posted by *hollowtek*
> 
> I'm going to wait on a GPU under $300 that can take on this beast. Hopefully the price will be 75-85% of $1,200 by then.


Wise decision. It's great that we're getting all the bells and whistles, but at 27" I'd rather have 1440p for perfect scaling, and current GPUs would have no issue pushing 100Hz+. Hopefully we'll have more choices soon enough.


----------



## boredgunner

Quote:


> Originally Posted by *Oubadah*
> 
> 20/20 vision isn't an upper limit. It's *normal* acuity.


At this point I'd say it boils down to how much one cares about resolution and anti-aliasing. Most people don't give a damn about aliasing and haven't had that veil lifted for them yet (and good for them).
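Since the thread keeps circling back to 20/20 vision: the 1-arcminute figure usually quoted for normal acuity can be turned into a rough PPI threshold for a given viewing distance. A back-of-the-envelope sketch; the 24-inch distance below is just an assumed typical desktop distance, and vernier acuity (what lets you spot aliasing) is considerably finer than this:

```python
import math

def acuity_limit_ppi(viewing_distance_in, arcminutes=1.0):
    """PPI at which one pixel subtends `arcminutes` minutes of arc at the given distance."""
    # Pixel pitch whose angular size equals the acuity limit:
    # half-angle in degrees is (arcminutes / 2) / 60 = arcminutes / 120.
    pixel_pitch_in = 2 * viewing_distance_in * math.tan(math.radians(arcminutes / 120))
    return 1 / pixel_pitch_in

print(acuity_limit_ppi(24))  # ~143 PPI at a 24" viewing distance
```

By this yardstick a 27" 4K panel (~163 PPI) already exceeds the 1-arcminute threshold at about two feet, which is consistent with both positions here: individual pixels stop being resolvable, yet aliasing shimmer can still be visible.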


----------



## Oubadah

..


----------



## ILoveHighDPI

Quote:


> Originally Posted by *boredgunner*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Oubadah*
> 
> 20/20 vision isn't an upper limit. It's *normal* acuity.
> 
> 
> 
> At this point I'd say it boils down to how much one cares about resolution and anti-aliasing. Most people don't give a damn about aliasing and haven't had that veil lifted for them yet (and good for them).
Click to expand...

Funny thing is it's kind of a chicken and egg situation.
The average person doesn't care about something they don't have or aren't aware of, but the opposite is equally true: if someone grows up with a higher standard, they're most likely going to be upset if that standard drops.
I've had friends in the past who were willfully ignorant of the concept of screen resolution; they stopped me from explaining it to them. Their idea of an awesome gaming experience was a big-screen TV, and a 50" 720p display fit that description just fine.

People who start gaming on 4K are going to be very sensitive to the issue. As time goes on, the culture is going to shift more and more toward a focus on resolution, because the benefits are real and people will feel something is wrong without them.
Hopefully the same thing happens with 120Hz gaming in the console space sooner rather than later.


----------



## Robilar

120Hz gaming with consoles will require true 120Hz TVs with a connector capable of the extra bandwidth first.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Robilar*
> 
> 120hz gaming with consoles will require true 120hz tv's with a connector capable of the extra bandwidth first.


That is why I am worried we might never see 40" 120Hz screens: those are TV panels designed for 60Hz.


----------



## mtcn77

You know what, guys, Vega is making a great point. At 323 PPI there are no aliasing, trailing, or resolution worries left to speak of.


----------



## boredgunner

Quote:


> Originally Posted by *Robilar*
> 
> 120hz gaming with consoles will require true 120hz tv's with a connector capable of the extra bandwidth first.


Good thing we have HDMI 2.1 to look forward to, since the TV industry is intent on not using DisplayPort for whatever reason.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Robilar*
> 
> 120hz gaming with consoles will require true 120hz tv's with a connector capable of the extra bandwidth first.
> 
> 
> 
> That is why I am worried we might never see 40" 120Hz screens, because those are TV panels designed for 60Hz.

The interesting thing is that even the PS3 was driving TVs at 120Hz, if only in 3D at the time. That should mean panels with this capability are all over the place; I'm guessing it's mostly up to software developers to implement it.

HDMI 2.1 went out of its way to emphasize 120Hz at every resolution, plus Variable Refresh Rate support. That sounds to me like the 120Hz craze has sunk in enough that the home theater manufacturers are taking notice.
In the article on the HDMI website they say they want the connector to be more universal (meaning their intent was to eliminate competition from PC standards). But with this kind of thing built in, if console games start running with unlocked framerates and displays are capable of 120Hz, then at least sometimes you might coincidentally find both in effect.


----------



## Oubadah

..


----------



## mtcn77

Quote:


> Originally Posted by *Oubadah*
> 
> With my 400PPI (LCD) phone at monitor viewing distance, the shimmer of aliasing is still readily apparent. A lot of people will need significantly more density than 320PPI to eliminate aliasing even at monitor viewing distances.


The staircase effect on geometric edges, or shimmery textures? The latter isn't an AA issue, it's AF quality. Run the 3DCenter Filter Tester and play with the variables to see just how approximated your texture sampling is. Mine (and I suppose all the rest of the recent series) has just enough samples for 2x AF with all performance optimisations turned to the max. You've got to admit that 'zero' performance loss between the AF settings has to come from somewhere...


----------



## Oubadah

..


----------



## mtcn77

Quote:


> Originally Posted by *Oubadah*
> 
> Yes, I'm talking about regular edge aliasing. Obviously at this density you're not counting individual "stairs" (so to speak) but you can still see a disturbance.


You know, Android has a compatibility renderer that upscales the image after requesting the layout at 1/4 scale. I bet you can do the same with 4xMSAA + SMAA rendered at 1/4 scale (960x540).


----------



## Oubadah

..


----------



## mtcn77

Quote:


> Originally Posted by *Oubadah*
> 
> Why exactly would I do that? I'm not concerned about aliasing in Android because I don't do mobile gaming. We were talking about gaming on PC monitors (specifically how much pixel density it takes to eliminate aliasing with _no trade-offs_), and I was just using the phone as an example because it was a 400PPI display I had on hand.


If I may assume the controls of that discussion, you need 4xMSAA at any resolution to bring its 2*2 helper pixel grid derivative bias to its native vertex sampler resolution of the screen.


----------



## DADDYDC650

Quote:


> Originally Posted by *mtcn77*
> 
> If I may assume the controls of that discussion, you need 4xMSAA at any resolution to bring its 2*2 helper pixel grid derivative bias to its native vertex sampler resolution of the screen.


Pretty much this. I think 4K on a 27" is a bit too much. It's awesome, but perhaps a 1440p version, a couple hundred dollars cheaper, would be more viable.


----------



## juano

Quote:


> Originally Posted by *DADDYDC650*
> 
> Pretty much this. I think 4K on a 27" is a bit too much. It's awesome, but perhaps a 1440p version, a couple hundred dollars cheaper, would be more viable.


Sure, but then you'd lose the option of 4K Blu-ray or 4K Netflix playback. You could also save a few hundred dollars by leaving out HDR, but then you'd just have a PG279Q.


----------



## DADDYDC650

Quote:


> Originally Posted by *juano*
> 
> Sure, but then you'd lose the option of 4K Blu-ray or 4K Netflix playback. You could also save a few hundred dollars by leaving out HDR, but then you'd just have a PG279Q.


Meh, movies aren't so immersive on a 27". The main selling points for me are HDR, 144Hz and the 384 zones. 4K is just a bonus.


----------



## pez

4K for games at 27" is pretty great, but for anything else in a Windows environment it's useless. I agree with the others that I would love to see this in a 32 inch variant.


----------



## EniGma1987

Quote:


> Originally Posted by *boredgunner*
> 
> Good thing we have HDMI 2.1 to look forward to, since the TV industry is intent on not using DisplayPort for whatever reason.


I asked a couple of people who make A/V receivers (Onkyo and Emotiva) and got the same answer from both: it has to do with DRM and content protection that doesn't work through DisplayPort. That doesn't make a lot of sense to me, because DisplayPort supports HDCP, and 1.4 even supports HDCP 2.2, but there must be something else alongside HDCP that isn't supported, for both manufacturers to give the same answer on the DRM issue and for not one manufacturer to ever use a DP connector. I would love to see 1-2 DP connectors on a receiver, but I doubt it will ever happen.


----------



## mmms

Quote:


> Originally Posted by *pez*
> 
> 4K for games at 27" is pretty great, but for anything else in a Windows environment it's useless. I agree with the others that I would love to see this in a 32 inch variant.


Which is the better option:
this monitor (Asus PG27UQ), 27" with 144Hz, IPS and G-Sync/HDR,
or
the Acer XB321HK, 32 inch with 60Hz, IPS and G-Sync?
I need the best option for gaming for a long time to come.

Also, do you think Asus will release this monitor at 32 inch in the future and we should wait until it's released, or is this 27 inch model enough for heavy gaming?


----------



## pez

Quote:


> Originally Posted by *mmms*
> 
> Which is the better option:
> this monitor (Asus PG27UQ), 27" with 144Hz, IPS and G-Sync/HDR,
> or
> the Acer XB321HK, 32 inch with 60Hz, IPS and G-Sync?
> I need the best option for gaming for a long time to come.
> 
> Also, do you think Asus will release this monitor at 32 inch in the future and we should wait until it's released, or is this 27 inch model enough for heavy gaming?


I would say it depends on system specs. Honestly, 4K looks good on 27" when you play a game, but if you need to do anything else with the PC functionality wise, it's just a bit of a pain. Windows is still the worst OS for text and GUI scaling when it comes to large resolutions. If you can afford it and you're doing gaming 99% of the time, I'd say it's worth a shot to try this monitor out. Otherwise, you might want to stick with the 32 inch panel. The downside is that it's only 60Hz, but you do get G-sync, which can be a huge benefit for 4K at the moment considering you're going to dip below 60FPS on newer AAA titles unless you're currently running SLI Titan X Ps.


----------



## starliner

Quote:


> Originally Posted by *pez*
> 
> 4K for games at 27" is pretty great, but for anything else in a Windows environment it's useless.


I don't think so. I've been using a Surface Book at work for 2 months or so, and it looks awesome. Granted, it is 3:2 (3000x2000), but that isn't that far away from 3840x2160. And it's Win10. I doubt Win7 can scale very well, but if that is your issue, then it's time to get with the times.


----------



## pez

Quote:


> Originally Posted by *starliner*
> 
> I don't think so. I've been using a Surface Book at work for 2 months or so, and it looks awesome. Granted, it is 3:2 (3000x2000), but that isn't that far away from 3840x2160. And it's Win10. I doubt Win7 can scale very well, but if that is your issue, then it's time to get with the times.


This is all with Win 10 Pro. Windows does OK with 4K, but if you spend enough time with it in a desktop environment, you will see it start to fall flat on its face. This is considering I've used Mac OS and RHEL/Fedora at 4K with a more consistent experience.

Also, 4K is around 38% more pixels than the resolution you are using, so it is quite the difference. I will reiterate: using 4K at 27 inches is about as (terrible) an experience as using 1440p at 24 inches or less.
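For what it's worth, the density numbers being debated in this thread are easy to sanity-check. A quick script (sizes and resolutions are the ones posters mention; this is purely illustrative):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from resolution and diagonal size in inches."""
    return math.hypot(w_px, h_px) / diag_in

uhd_27 = ppi(3840, 2160, 27.0)   # the PG27UQ
sb_135 = ppi(3000, 2000, 13.5)   # the Surface Book mentioned above
qhd_24 = ppi(2560, 1440, 24.0)   # a 24" 1440p monitor

print(f'27" 4K:          {uhd_27:.0f} PPI')   # ~163
print(f'13.5" 3000x2000: {sb_135:.0f} PPI')   # ~267
print(f'24" 1440p:       {qhd_24:.0f} PPI')   # ~122

# 3840x2160 has ~38% more pixels than the Surface Book's 3000x2000
extra = (3840 * 2160) / (3000 * 2000) - 1
print(f"4K vs 3000x2000 pixel count: +{extra:.0%}")
```

So the Surface Book is considerably denser per inch than this monitor, even though its total pixel count is lower.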


----------



## Oubadah

..


----------



## pez

Yeah. I would have loved to keep my 4K 27 inch monitor from before. It was a gorgeous display, but ultimately those are the reasons why I got rid of it. I game 90% of the time when I am on my system, but the other 10% was enough to drive me mad.


----------



## profundido

Quote:


> Originally Posted by *pez*
> 
> I would say it depends on system specs. Honestly, 4K looks good on 27" when you play a game, but if you need to do anything else with the PC functionality wise, it's just a bit of a pain. Windows is still the worst OS for text and GUI scaling when it comes to large resolutions. If you can afford it and you're doing gaming 99% of the time, I'd say it's worth a shot to try this monitor out. Otherwise, you might want to stick with the 32 inch panel. The downside is that it's only 60Hz, but you do get G-sync, which can be a huge benefit for 4K at the moment considering you're going to dip below 60FPS on newer AAA titles unless you're currently running SLI Titan X Ps.


Spot on. I would like to say the exact same thing. Just like Pez, I have been using a 4K 27" (PG27AQ) with 150% scaling on Win10, and although gaming was perfect with a crisp, clear image, I too went crazy with all sorts of mini issues in Windows. Just not enough room to comfortably split my screen vertically in half, and 125% scaling was just too small. Not enough real estate for video editing. Lots of apps that are formatted wrong or have bloated images. Citrix sessions looked like crap with scrambled text. Same for Hyper-V machines you want to take over at console level. In short, all the sharpness you are used to is gone, etc...

When I bought the PG27AQ I needed G-Sync because my system couldn't handle 4K well at all back then, but the new one I built last summer kinda laughs at anything 4K and can easily drive it at 100-120fps, or hold a steady 60Hz at 60% max GPU load with e.g. The Witcher graphically maxed out.

So since I no longer need G-Sync until the next generation of 120Hz screens, and since it's gonna take ANOTHER year for even a 4K 27" version to hit the market, which I now know is not sufficient for people who do more than gaming, I bit the bullet and just bought a PA328Q.

That should keep me sweet until a 32" 4K 120Hz G-Sync version hits the market in 1.5-2 years.


----------



## juano

Quote:


> Originally Posted by *pez*
> 
> This is all with Win 10 Pro. Windows does OK with 4K, but if you spend enough time with it in a desktop environment, you will see it start to fall flat on its face. This is considering I've used Mac OS and RHEL/Fedora at 4K with a more consistent experience.
> 
> Also, 4K is around 38% more pixels than the resolution you are using, so it is quite the difference. I will reiterate: using 4K at 27 inches is about as (terrible) an experience as using 1440p at 24 inches or less.


The Surface Book is 3000x2000 at 13.5", or 267 PPI, compared to 163 PPI for 4K at 27". So while you can disagree with the opinion he came to based on his experience, I don't think you can discount it as being unrelated or too different.


----------



## Malinkadink

Quote:


> Originally Posted by *pez*
> 
> This is all with Win 10 Pro. Windows does OK with 4K, but if you spend enough time with it in a desktop environment, you will see it start to fall flat on its face. This is considering I've used Mac OS and RHEL/Fedora at 4K with a more consistent experience.
> 
> Also, 4K is around 38% more pixels than the resolution you are using, so it is quite the difference. I will reiterate: using 4K at 27 inches is about *as (terrible) an experience as using 1440p at 24 inches or less*.


Using 24" 1440p here, great experience. 100% scaling.


----------



## mmms

Quote:


> Originally Posted by *pez*
> 
> I would say it depends on system specs. Honestly, 4K looks good on 27" when you play a game, but if you need to do anything else with the PC functionality wise, it's just a bit of a pain. Windows is still the worst OS for text and GUI scaling when it comes to large resolutions. If you can afford it and you're doing gaming 99% of the time, I'd say it's worth a shot to try this monitor out. Otherwise, you might want to stick with the 32 inch panel. The downside is that it's only 60Hz, but you do get G-sync, which can be a huge benefit for 4K at the moment considering you're going to dip below 60FPS on newer AAA titles unless you're currently running SLI Titan X Ps.


Do you think we'll see these exact features (144Hz or 120Hz, IPS and G-Sync/HDR) at 32 inches or larger from Asus or Acer by 2018 or 2019?


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Do you think we'll see these exact features (144Hz or 120Hz, IPS and G-Sync/HDR) at 32 inches or larger from Asus or Acer by 2018 or 2019?


I firmly believe so. And I fully expect there will be counterparts with roughly the same feature set and without IPS panels!


----------



## loader963

Quote:


> Originally Posted by *boredgunner*
> 
> I firmly believe so. And I fully expect there will be counterparts with roughly the same feature set and without IPS panels!


Could a TN panel even do HDR?


----------



## EniGma1987

Quote:


> Originally Posted by *loader963*
> 
> Could a TN panel even do HDR?


So far no LCD panel has been able to do the real, proper-spec HDR, and I doubt this monitor will either. They simply cannot go bright enough and do not have the static contrast ratio required. With 384 dimming zones they can almost reach the required bottom end, but we really need about 500 zones to get there completely. And then there is the matter of the top-end requirements, which still aren't met.
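For a sense of scale, assuming the 384 zones form a 24x16 grid (a plausible layout for a 16:9 panel, not something Asus has confirmed here):

```python
# Hypothetical layout: 384 dimming zones as a 24 x 16 grid on the 4K panel
cols, rows = 24, 16
assert cols * rows == 384

zone_w = 3840 // cols  # pixels of width covered by each zone
zone_h = 2160 // rows  # pixels of height covered by each zone

print(f"each zone backlights a {zone_w} x {zone_h} px region "
      f"({zone_w * zone_h:,} pixels)")
# A single bright pixel still lights up a whole ~160x135 px region of
# backlight, which is why the deep blacks next to highlights are hard
# to hit exactly, and why more zones would help.
```

More zones shrink that region, which is the point being made about needing ~500 of them.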


----------



## boredgunner

Quote:


> Originally Posted by *loader963*
> 
> Could a TN panel even do HDR?


TN is out of the question when talking about things like HDR and large monitors. We should have VA counterparts to this by next year.


----------



## mtcn77

Quote:


> Originally Posted by *boredgunner*
> 
> TN is out of the question when talking about things like HDR and large monitors. We should have VA counterparts to this by next year.


TN may still be in for Samsung's new 32:9 monitors. TN only shows a short strip of the image with correct gamma, but they can still be beautiful panels; they just aren't suited to a flat widescreen display. No fault on their part; it's what tinkering scientists have done to their industry.


----------



## CallsignVega

I'm curious to know what is going to happen to cable length with DP 1.3/1.4's higher bandwidth. Right now a 4K screen at 60 Hz with DP 1.2 is good for about 3 meters (10 feet).

With a huge bandwidth spike for 4K @ 144 Hz, are we talking 3 to 5 foot cable limits?
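Rough numbers behind that bandwidth spike (pixel payload only, ignoring blanking overhead, so treat these as ballpark figures; the DP payload rates are the commonly quoted effective rates after 8b/10b coding):

```python
def data_rate_gbps(w, h, hz, bpc=8, channels=3):
    """Approximate uncompressed video data rate in Gbps, pixel payload
    only; real signals add blanking overhead on top of this."""
    return w * h * hz * bpc * channels / 1e9

DP12_PAYLOAD = 17.28  # DP 1.2 HBR2 effective payload, Gbps
DP13_PAYLOAD = 25.92  # DP 1.3/1.4 HBR3 effective payload, Gbps

print(f"4K @  60Hz,  8 bpc: {data_rate_gbps(3840, 2160, 60):5.1f} Gbps")
print(f"4K @ 144Hz,  8 bpc: {data_rate_gbps(3840, 2160, 144):5.1f} Gbps")
print(f"4K @ 144Hz, 10 bpc: {data_rate_gbps(3840, 2160, 144, 10):5.1f} Gbps")
```

Even before blanking overhead, 4K @ 144Hz at 10 bpc comes out around 35.8 Gbps, past HBR3's ~25.9 Gbps payload, so something has to give (chroma subsampling, reduced refresh, or a future link rate); tighter cable limits at the top speed wouldn't be surprising either.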


----------



## loader963

Since TVs that accept DP are rare finds, that probably will not be a common problem. Most people's towers (and there are always exceptions) are closer than 3m to their monitors.


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm curious to know what is going to happen to cable length with DP 1.3/1.4 higher bandwidth. Right now a 4K screen at 60 Hz with DP 1.2 is good for about 3 meters (10 feet).
> 
> With a huge bandwidth spike for 4K @ 144 Hz, are we talking 3 to 5 foot cable limits?


No, we shouldn't. They will find ways to increase bandwidth while maintaining lengths, same as Ethernet really. We are still on copper twisted pair, and with Cat8 Class 1 and 2 we are able to get 40Gbps over 10 meters and 10Gbps over 100 meters.
I feel like we should just be using 8P8C Ethernet cables for our monitors. Keep them under 10 feet and we could have more bandwidth than DP1.4 on fewer wires. lol


----------



## pez

Quote:


> Originally Posted by *profundido*
> 
> Spot on. I would like to say the exact same thing. Just like Pez, I have been using a 4K 27" (PG27AQ) with 150% scaling on Win10, and although gaming was perfect with a crisp, clear image, I too went crazy with all sorts of mini issues in Windows. Just not enough room to comfortably split my screen vertically in half, and 125% scaling was just too small. Not enough real estate for video editing. Lots of apps that are formatted wrong or have bloated images. Citrix sessions looked like crap with scrambled text. Same for Hyper-V machines you want to take over at console level. In short, all the sharpness you are used to is gone, etc...
> 
> When I bought the PG27AQ I needed G-Sync because my system couldn't handle 4K well at all back then, but the new one I built last summer kinda laughs at anything 4K and can easily drive it at 100-120fps, or hold a steady 60Hz at 60% max GPU load with e.g. The Witcher graphically maxed out.
> 
> So since I no longer need G-Sync until the next generation of 120Hz screens, and since it's gonna take ANOTHER year for even a 4K 27" version to hit the market, which I now know is not sufficient for people who do more than gaming, I bit the bullet and just bought a PA328Q.
> 
> That should keep me sweet until a 32" 4K 120Hz G-Sync version hits the market in 1.5-2 years.


Yeah, this is exactly what I would like to see, and I have considered a 60Hz 4K panel again myself. I do like the high refresh for competitive CS:GO or OW, but for everything else I just really loved how 4K looked.








Quote:


> Originally Posted by *juano*
> 
> Surface book is 3kx2k at 13.5" or 267ppi compared to 4k at 27" 163 ppi. So while you can disagree with the opinion he came to based on his experience, I don't think you can discount it as being unrelated or too different.


Quote:


> Originally Posted by *Malinkadink*
> 
> Using 24" 1440p here, great experience. 100% scaling.


I'm not trying to start an e-peen vision contest here. The man asked my opinion, and I answered him with my experience. I didn't discount his opinion, either.







Quote:


> Originally Posted by *mmms*
> 
> Do you think we'll see these exact features (144Hz or 120Hz, IPS and G-Sync/HDR) at 32 inches or larger from Asus or Acer by 2018 or 2019?


Quote:


> Originally Posted by *boredgunner*
> 
> I firmly believe so. And I fully expect there will be counterparts with roughly the same feature set and without IPS panels!


Yeah, I agree with that timeline. I just personally want a 32" panel for 4K at the bare minimum for my next 'upgrade'.


----------



## sblantipodi

Is there any news on the price of this monitor?
Some people say it will cost from 1,500 to 2,000€.

Personally, I find anything greater than 1,200€ a deal breaker.
This monitor could be amazing on paper, but it surely isn't worth 220% more than the previous PG27AQ monitor.


----------



## mtcn77

Quote:


> Originally Posted by *sblantipodi*
> 
> Is there any news on the price of this monitor?
> Some people say it will cost from 1,500 to 2,000€.
> 
> Personally, I find anything greater than 1,200€ a deal breaker.
> This monitor could be amazing on paper, but it surely isn't worth 220% more than the previous PG27AQ monitor.


Every 4K@60Hz monitor had a 900€ MSRP the last time I checked, so you are getting what, 2.5x the refresh rate, for a 1.5x surcharge? Plus, _HDR_?


----------



## CallsignVega

Quote:


> Originally Posted by *sblantipodi*
> 
> Is there any news on the price of this monitor?
> Some people say it will cost from 1,500 to 2,000€.
> 
> Personally, I find anything greater than 1,200€ a deal breaker.
> This monitor could be amazing on paper, but it surely isn't worth 220% more than the previous PG27AQ monitor.


Asus has stated a few times now that it's in the $2,000/€2,000 range. It's not surprising actually, this monitor packs a whole lot of new stuff in it.


----------



## sblantipodi

Quote:


> Originally Posted by *mtcn77*
> 
> Every 4K@60Hz monitor had a 900€ MSRP the last time I checked, so you are getting what, 2.5x the refresh rate, for a 1.5x surcharge? Plus, _HDR_?


Yes, 900€ MSRP is the price of the current 4K 60Hz monitor. 2,000€ for a 4K 144Hz HDR monitor is simply too much.
Surely HDR and 144Hz are not worth 1,100€ more; that is more than double the price of a current 4K 60Hz monitor.

Take into consideration that 144Hz is pretty useless at 4K, since there is no hardware that can push such a high framerate at such a high resolution, unless you are playing Minecraft at least.


----------



## loader963

Yeah, TBF, while most hate the term, this monitor is as close to futureproof as it gets for the next few years if the quality is there, so I wouldn't be surprised to see 1500 on it. I just hate the size of it.


----------



## mtcn77

Quote:


> Originally Posted by *sblantipodi*
> 
> Yes, 900€ MSRP is the price of the current 4K 60Hz monitor. 2,000€ for a 4K 144Hz HDR monitor *is simply too much.*
> Surely HDR and 144Hz are not worth 1,100€ more; that is more than double the price of a current 4K 60Hz monitor.
> 
> Take into consideration that 144Hz is pretty useless at 4K, since there is no hardware that can push such a high framerate at such a high resolution, unless you are playing Minecraft at least.


Oh, yeah? When was the last time we had a standard of reference to compare this to? This is a 'prototype'-grade monitor at the moment.
You cannot buy this? *Houston, we have touchdown!* You don't buy Eizo, Sharp, and now the latest inaugurated exclusive club member _"Asus"_ panels like you are out on a grocery spree...
Compared to 1440p 100Hz 21:9s, I say it is a good upgrade path.


----------



## sblantipodi

Quote:


> Originally Posted by *loader963*
> 
> Yeah TBF, while most hate the term, this monitor is as close to futureproof for the next few years as it gets if the quality is there so I wouldn't be surprised to see 1500 on it. Just hate the size of it.


Futureproof in hardware means nothing.
This monitor is only barely sufficient in terms of maximum brightness for an HDR monitor.

HDR standards will soon hit 4,000 nits; there are TVs that are at 4,000 nits already, and 4,000 nits will become the next standard for HDR.
So this monitor is not futureproof at all.
2,000€ is simply too much, and 1,500€ is simply too much.

IMHO its price should be 1,200-1,300€, if the color accuracy is OK and it doesn't have a lot of backlight bleed problems like its predecessor.


----------



## ToTheSun!

Quote:


> Originally Posted by *sblantipodi*
> 
> HDR standards will soon hit 4,000 nits; there are TVs that are at 4,000 nits already, and 4,000 nits will become the next standard for HDR.


Citation needed.

Quote:


> Originally Posted by *sblantipodi*
> 
> IMHO its price should be 1,200-1,300€


So, you've made a better market analysis than Asus' experts have? Impressive.


----------



## mtcn77

Quote:


> Originally Posted by *ToTheSun!*
> 
> Citation needed.
> So, you've made a better market analysis than Asus' experts have? Impressive.


-_"Most impressive!"_


----------



## loader963

Sblantipodi, I will concede that you have more info on this, but I still disagree. Besides OLED, they've thrown everything but the kitchen sink into this monitor and it sounds amazing. I don't like how expensive it's gonna be, but that's just new tech in general.


----------



## sblantipodi

Quote:


> Originally Posted by *ToTheSun!*
> 
> Citation needed.
> So, you've made a better market analysis than Asus' experts have? Impressive.


It's not a market analysis, it's just what I feel this monitor is worth.
TVs are moving to Dolby Vision and hitting 4,000 nits; who's to say PC monitors won't switch to Dolby Vision rather than stay with HDR10?

I have been buying hardware since the first PC was introduced, and sincerely I don't remember many things that doubled in price over the previous model just because they added some cool feature.
In recent years, no component has doubled its price for "a new feature".


----------



## mtcn77

Quote:


> Originally Posted by *sblantipodi*
> 
> It's not a market analysis, it's just what I feel this monitor is worth.
> TVs are moving to Dolby Vision and hitting 4,000 nits; who's to say PC monitors won't switch to Dolby Vision rather than stay with HDR10?
> 
> I have been buying hardware since the first PC was introduced, and sincerely I don't remember many things that doubled in price over the previous model just because they added some cool feature.
> In recent years, no component has doubled its price for "a new feature".


My good friend: are there any Dolby Vision compliant monitors, or TVs for that matter, yet that would hold the PG27UQ to keeping up with that standard?


----------



## sblantipodi

Quote:


> Originally Posted by *mtcn77*
> 
> My good friend: are there any Dolby Vision compliant monitors, or TVs for that matter, yet that would hold the PG27UQ to keeping up with that standard?


Yes, there are TVs with Dolby Vision, so are we sure that HDR10 is the standard for PC?
Sincerely, I'm not sure. Do you remember HD DVD vs Blu-ray?

Who knows if HDR10 will be the next HD DVD?

One thing is sure: Dolby Vision is a far better standard than HDR10.


----------



## loader963

Quote:


> Originally Posted by *sblantipodi*
> 
> In recent years, no component has doubled its price for "a new feature".

Several things have. Plasma and OLED TVs come right to mind. A 50" RPTV was a couple thousand while the same-size plasma was $20k. After a few years it came down to the price realm of mere mortals, where the average Joe could afford it.


----------



## mtcn77

Quote:


> Originally Posted by *sblantipodi*
> 
> Yes, there are TVs with Dolby Vision, so are we sure that HDR10 is the standard for PC?
> Sincerely, I'm not sure. Do you remember HD DVD vs Blu-ray?
> 
> Who knows if HDR10 will be the next HD DVD?


Can you please read the first message I posted in this thread? Thank you.
The joke is not on this monitor; it fulfills the HDR10 specification to the letter. It is actually a ploy on Nvidia users who thought they would get the full Dolby Vision (12-bit, 10,000-nit) standard with their would-be superior hardware. Standards are established when OEMs establish them, not the moment your virtual support level says so.


----------



## sblantipodi

Quote:


> Originally Posted by *mtcn77*
> 
> Can you please read the first message I posted in this thread? Thank you.
> The joke is not on this monitor; it fulfills the HDR10 specification to the letter. It is actually a ploy on Nvidia users who thought they would get the full Dolby Vision (12-bit, 10,000-nit) standard with their would-be superior hardware. Standards are established when OEMs establish them, not the moment your virtual support level says so.


I agree. But are you sure that in 2018 PC monitors will use HDR10 instead of Dolby Vision?


----------



## mtcn77

Quote:


> Originally Posted by *sblantipodi*
> 
> I agree, are you sure that in *2018 PC monitors* will use HDR10 instead of Dolby Vision?


Well, we could look into those. *What is the time?*
By the way, some pointers: this is as fake as the DRM-key-encrypted content playback support of some very vocally advocated GPUs in recent memory. You seem to be enjoying this vicious cycle, to my amusement.


----------



## Ferreal

If you think this monitor costs too much, you probably don't have the hardware to run it anyway.

This is a huge upgrade to the best gaming monitor, my favorite Asus PG278Q. I will pick this up day one.


----------



## mtcn77

Correct me if I'm wrong - this is the first HDR monitor, am I right, or am I right?


----------



## zanardi

LG 32UD99
BenQ SW320


----------



## outofmyheadyo

Quote:


> Originally Posted by *Ferreal*
> 
> If you think this monitor costs too much, you probably don't have the hardware to run it anyway.
> 
> This is a huge upgrade to the best gaming monitor, my favorite Asus PG278Q. I will pick this up day one.


Why do you say the PG278Q is the best? It's not IPS; there are the 165Hz IPS 1440p G-Sync monitors, and if you want to go bananas there are the 34 inch 100Hz G-Sync ones. The PG278Q is just the middle of the pack, like all the high-refresh TN ones out there. Nice nonetheless, but not the best; I had it a while back and loved it.


----------



## ToTheSun!

Quote:


> Originally Posted by *zanardi*
> 
> LG 32UD99
> BenQ SW320


We don't know if those will even have 1,000 nit max luminance and FALD with dimming zones. My guess is they won't.


----------



## Ferreal

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Why do you say the PG278Q is the best? It's not IPS; there are the 165Hz IPS 1440p G-Sync monitors, and if you want to go bananas there are the 34 inch 100Hz G-Sync ones. The PG278Q is just the middle of the pack, like all the high-refresh TN ones out there. Nice nonetheless, but not the best; I had it a while back and loved it.


I've tried all the monitors that came out after the PG278Q, with the exception of the ultrawides, because they're only 100Hz and 4ms.

144Hz with 1ms pixel response has better motion than 165Hz with 4ms, IMO. TN has always been the better gaming panel.

At a $2,000 premium, I'm confident this monitor will not disappoint.

I might try the 240Hz monitor coming out in March to compare 144Hz vs 240Hz, but I think there is no going back to 1080p for me.


----------



## sblantipodi

Quote:


> Originally Posted by *ToTheSun!*
> 
> We don't know if those will even have 1,000 nit max luminance and FALD with dimming zones. My guess is they won't.


If they are HDR10 certified, as they say, then they have 1,000 nits.
And if they have 1,000 nits, it is highly improbable that they are edge-lit.


----------



## OwnedINC

Quote:


> Originally Posted by *pez*
> 
> Yeah, I agree with that timeline. I just personally want a 32" panel for 4K at the bare minimum for my next 'upgrade'.


Am I the only one who wants the better pixel density?

It seems like everyone wants a 32" for some reason.


----------



## MuscleBound

27" 4k? Way too small.


----------



## ToTheSun!

Quote:


> Originally Posted by *sblantipodi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> We don't know if those will even have 1,000 nit max luminance and FALD with dimming zones. My guess is they won't.
> 
> 
> 
> If they are HDR10 certified, as they say, then they have 1,000 nits.

Yup, it's that simple!


----------



## finalheaven

Quote:


> Originally Posted by *sblantipodi*
> 
> Yes, there are TVs with Dolby Vision, so are we sure that HDR10 is the standard for PC?
> Sincerely, I'm not sure. Do you remember HD DVD vs Blu-ray?
> 
> Who knows if HDR10 will be the next HD DVD?
> 
> One thing is sure: Dolby Vision is a far better standard than HDR10.


I agree Dolby Vision is better in terms of tech, but I believe Dolby Vision is proprietary and more expensive to implement due to licensing costs. If anything, like all prior format wars, it seems more likely the cheaper, more open, and more widely supported and available format will win out, whether it's VHS/Betamax, FireWire/USB, or HD DVD/Blu-ray.

Edit: Dolby Vision not only requires a licensing fee, but also an actual Dolby chip in the TV, further increasing costs. [However, Nvidia G-Sync does the same and I believe G-Sync is winning? I may be wrong on that]

See below: Source



I also understand and realize that despite the above, Dolby Vision may still win out. However, if I had to bet at this moment, I would wager that HDR10 will beat Dolby Vision. And if and when people want higher specs, I believe HDR20 will be released and without the need to pay licensing fees to Dolby.


----------



## loader963

Don't think G-Sync is winning per se, when more monitors have FreeSync and manufacturers say they'd rather use it. But since Nvidia has 70+% market share, we hear about G-Sync more, especially at the high end from people willing to pay more for higher-end monitors.


----------



## CallsignVega

Well G-Sync is going to stick around because:

A: It's actually better than Freesync
B: The fastest GPU's by far are NVIDIA

My largest concern with this monitor is that HDR/FALD is tied directly into G-Sync only mode and the screen still looks like poo IPS on the desktop and videos etc.


----------



## KGPrime

Quote:


> Originally Posted by *CallsignVega*
> 
> Well G-Sync is going to stick around because:
> 
> A: It's actually better than Freesync
> B: The fastest GPU's by far are NVIDIA
> 
> My largest concern with this monitor is that HDR/FALD is tied directly into G-Sync only mode and the screen still looks like poo IPS on the desktop and videos etc.


Well, since the 32" ProArt has the same zoned backlighting and doesn't have G-Sync, it would seem this shouldn't be the case. About the HDR part, I have the same concern.


----------



## Leopardi

Quote:


> Originally Posted by *OwnedINC*
> 
> Am I the only one who wants the better pixel density¿
> 
> Seems like everyone wants a 32'' for some reason


I want the density as well, and I don't see the appeal of over 27" in monitor usage. A 32" is so huge that it would have to just be pushed back on the desk.


----------



## Serephucus

What people don't seem to be getting is that 4K is a requirement of the HDR spec.

ASUS wanted to make a 27" HDR monitor, which meant it had to be 4K. Probably the yields weren't as good for 32" versions.


----------



## Silent Scone

Quote:


> Originally Posted by *Serephucus*
> 
> What people don't seem to be getting is that 4K is a requirement of the HDR spec.


Please elaborate.


----------



## fleetfeather

Quote:


> Originally Posted by *Serephucus*
> 
> What people don't seem to be getting is that 4K is a requirement of the HDR spec.
> 
> ASUS wanted to make a 27" HDR monitor, which meant it had to be 4K. Probably the yields weren't as good for 32" versions.


Interesting, since Samsung wants to release a FreeSync HDR 3440x1440 ultrawide


----------



## ToTheSun!

Quote:


> Originally Posted by *fleetfeather*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Serephucus*
> 
> What people don't seem to be getting is that 4K is a requirement of the HDR spec.
> 
> ASUS wanted to make a 27" HDR monitor, which meant it had to be 4K. Probably the yields weren't as good for 32" versions.
> 
> 
> 
> Interesting since Samsung wants to release a Freesync HDR 3440x1440 ultrawide
Click to expand...

That's because 4K isn't a requirement for "HDR". Ultra HD, as a resolution, is a requirement for the Ultra HD Premium certification, which includes the HDR10 standard. "HDR", as it stands, is not a standard; simply a reference to higher contrast and luminance than your typical "SDR" display.

A monitor being "HDR" will be the new initialism display makers will use to attach a premium to a monitor that can't bear the Ultra HD Premium certification.

This information is all neatly explained here: http://www.tftcentral.co.uk/articles/hdr.htm

As we've come to expect, TFTC is a wealth of information.


----------



## fleetfeather

Quote:


> Originally Posted by *ToTheSun!*
> 
> That's because 4K isn't a requirement for "HDR". Ultra HD, as a resolution, is a requirement for the Ultra HD Premium certification, which includes the HDR10 standard. "HDR", as it stands, is not a standard; simply a reference to higher contrast and luminance than your typical "SDR" display.
> 
> A monitor being "HDR" will be the new initialism display makers will use to attach a premium to a monitor that can't bear the Ultra HD Premium certification.
> 
> This information is all neatly explained here: http://www.tftcentral.co.uk/articles/hdr.htm
> 
> As we've been accustomed, TFTC is a wealthy source of information.


This sounds, on the surface, like simply a matter of semantics. I think people simply want the experience offered by HDR10 (or the Dolby technology), but perhaps without the 4K resolution that is being offered here.


----------



## DVLux

B&H is listing the SW320 for $1,399. So this certainly has quite a mark-up if the ~$2,000 tag holds, considering smaller panels are usually cheaper.

So does that mean a $400 mark-up for G-Sync?


----------



## ToTheSun!

Quote:


> Originally Posted by *fleetfeather*
> 
> I think people simply want the experience offered by HDR10 (or the Dolby technology), but perhaps without the 4K resolution that is being offered here.


No opposition there. I was simply explaining why "4K" isn't a requirement for "HDR".


----------



## fleetfeather

Quote:


> Originally Posted by *ToTheSun!*
> 
> No opposition there. I was simply explaining why "4K" isn't a requirement for "HDR".


Thank you for the explanation and the link


----------



## EniGma1987

Quote:


> Originally Posted by *sblantipodi*
> 
> If they are HDR10 certified as they say they have 1000nit.
> If they have 1000nit is highly unprobable that they are edge lit


Quote:


> Brightness Peak 550 nits / Typical 350 nits


Barely better than a typical monitor.


----------



## sblantipodi

Quote:


> Originally Posted by *finalheaven*
> 
> I agree Dolby Vision is better in terms of tech, but I believe Dolby Vision is proprietary and more expensive to implement due to licensing costs. If anything like all prior format wars, it seems more likely the cheaper, open-source, and more widely supported and available format will win out. Whether its VHS/Betamax, firewire/usb and HD-DVD/Bluray.
> 
> Edit: Dolby Vision not only requires licensing fee, but an actual Dolby chip to be on the TV, further increasing costs. [However, Nvidia G-Sync does the same and I believe G-Sync is winning? - I may be wrong on that]
> 
> See below: Source
> 
> 
> 
> I also understand and realize that despite the above, Dolby Vision may still win out. However, if I had to bet at this moment, I would wager that HDR10 will beat Dolby Vision. And if and when people want higher specs, I believe HDR20 will be released and without the need to pay licensing fees to Dolby.


Really interesting post, I really appreciate it, thanks and rep+.
Quote:


> Originally Posted by *CallsignVega*
> 
> Well G-Sync is going to stick around because:
> 
> A: It's actually better than Freesync
> B: The fastest GPU's by far are NVIDIA
> 
> My largest concern with this monitor is that HDR/FALD is tied directly into G-Sync only mode and the screen still looks like poo IPS on the desktop and videos etc.


No reason for that; HDR is a standalone technology that can live without G-Sync.
Nvidia calls the new module "G-Sync HDR" simply because the previous module can't handle sync while an HDR stream is active.

I don't think the new module is all that new; I'd bet it's the same module with newer firmware and a bit more horsepower.
So there's no way I'll give those bad guys €2,000 for a cheap gaming monitor without hardware calibration and a >12-bit LUT.
These gaming monitors don't have gamma settings in the OSD either, and no brightness compensation. Really, €2,000 is not the right segment.


----------



## Baasha

The Pascal GPU architecture won't allow 4K @ 144Hz even with two GPUs.

If Volta (or whatever) is out by the time this panel comes out, it would be really interesting to see if 2x of those GPUs could do 4K @ 144Hz - that would mean Volta is vastly more powerful than Pascal.

Either way, it's good that a high refresh rate 4K monitor is coming out and moving away from the peasant (1440P) resolution.


----------



## sblantipodi

Quote:


> Originally Posted by *Baasha*
> 
> Pascal architecture of GPUs won't allow 4K @ 144Hz with just two GPUs.
> 
> If Volta (or whatever) is out by the time this panel comes out, it would be really interesting to see if 2x of those GPUs could do 4K @ 144Hz - this would mean the Volta GPU would be incredibly more powerful than the Pascal.
> 
> Either way, it's good that a high refresh rate 4K monitor is coming out and moving away from the peasant (1440P) resolution.


Can you tell me where you read that Pascal isn't capable of 4K at 144Hz, please?
As far as I know, the Infiltrator demo shown at CES was running HDR at 144Hz.


----------



## BoredErica

Quote:


> Originally Posted by *juano*
> 
> Sure but then you'd lose the option to do 4k Blu-ray or 4k Netflix playback. You could also save a few hundred dollars by leaving out HDR but then you'd just have a PG279Q


I'd much rather ditch 4k and not have to deal with powering games or scaling than lose HDR.

I can't even find a 1440p/144hz/Gsync/IPS monitor that checks all the boxes for me and we're already talking about 4k/144hz/IPS/Gsync/HDR. Lol ok.


----------



## ToTheSun!

Quote:


> Originally Posted by *sblantipodi*
> 
> can you tell me where you read that Pascal isn't able of 4K at 144Hz please?
> as far as I know the iniltration demo used at CES was runninig HDR at 144Hz.


He didn't mean that in the strictest sense. He meant that modern games at 4K require more horsepower to run at 144 FPS or just below.
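Beyond GPU horsepower, there's also a link-bandwidth question: DisplayPort 1.4 tops out at roughly 25.92 Gbps of effective data rate. A back-of-the-envelope check (ignoring blanking overhead, so these figures are lower bounds on the required rate):

```python
# DisplayPort 1.4 (HBR3): 32.4 Gbps raw, ~25.92 Gbps after 8b/10b coding.
DP14_EFFECTIVE_GBPS = 25.92

def required_gbps(w, h, hz, bits_per_pixel):
    """Minimum pixel-data rate for a given mode, ignoring blanking."""
    return w * h * hz * bits_per_pixel / 1e9

rgb_10bit = required_gbps(3840, 2160, 144, 30)     # 10-bit RGB 4:4:4
yuv422_10bit = required_gbps(3840, 2160, 144, 20)  # 10-bit YCbCr 4:2:2

print(f"10-bit RGB 4:4:4:   {rgb_10bit:.1f} Gbps")     # exceeds 25.92
print(f"10-bit YCbCr 4:2:2: {yuv422_10bit:.1f} Gbps")  # fits
```

So even with a GPU that could render it, full 10-bit RGB at 4K 144Hz exceeds what a single DP 1.4 link carries; some form of chroma subsampling or compression would be needed at the top refresh rate.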


----------



## wreckless

noob question here..

so of the two current beasts (Asus 4K 144Hz vs Dell 8K 60Hz), which would offer the better gaming experience, assuming they were both 32"? Better yet, how would they differ?


----------



## boredgunner

Quote:


> Originally Posted by *wreckless*
> 
> noob question here..
> 
> so of the two current beasts (Asus 4k 144hz vs Dell 8k 60hz), which would offer the better gaming experience assuming they were both 32"? better yet, how would the differ?


Well, the 4k 144 Hz is confirmed to be 27".

8k is pointless for gaming, we can't run it at decent frame rates in anything modern.


----------



## wreckless

Quote:


> Originally Posted by *boredgunner*
> 
> Well, the 4k 144 Hz is confirmed to be 27".
> 
> 8k is pointless for gaming, we can't run it at decent frame rates in anything modern.


hypothetically speaking.. I'm just trying to figure out the reasons/science behind them

assuming two 32" monitors.. how would a 4k 144hz panel compare to a 8k 60hz panel in terms of gaming experience?


----------



## Seyumi

Quote:


> Originally Posted by *sblantipodi*
> 
> can you tell me where you read that Pascal isn't able of 4K at 144Hz please?
> as far as I know the iniltration demo used at CES was runninig HDR at 144Hz.


I think he means games. Here's the current list of games in my unplayed Steam library that I can 100% confirm need 2x SLI overclocked Pascal Titan X on AIO liquid cooling just to maintain a 60 FPS minimum at 4K with max settings and no ridiculous AA settings such as MSAA x8:

2014 - Assassin's Creed Unity & Far Cry 4
2015 - Assassin's Creed Syndicate, Dying Light, Fallout 4, Grand Theft Auto V, Witcher 3
2016 - Far Cry Primal, HITMAN, Homefront: The Revolution, Rise of the Tomb Raider

+ a lot more from 2016 & 2017 I haven't purchased yet

Safe to say, at 4K 144Hz the games I listed above would be lucky to get any higher than 100 FPS. We need next-gen Volta Titans for any hope of 144Hz 4K at max settings.


----------



## GoLDii3

Quote:


> Originally Posted by *fleetfeather*
> 
> This sounds, on the surface, to simply be a matter of semantics. I think people simply want the experience offered by HDR10 (or the Dolby technology), but perhaps without the 4K resolution that is being offered here.


That would only work for games. For movies, HDR is only available on the 4K version.


----------



## clerick

Hope they release a similar screen that's 1440p.


----------



## -terabyte-

Quote:


> Originally Posted by *DVLux*
> 
> B&H is listing the SW320 for 1,399$. So this certainly has quite a mark-up, considering smaller panels are usually cheaper, if the 2000~ $ tag holds.
> 
> So does that mean a 400$ mark-up for G-Sync?


Consider that with Asus there is a ~$100 ROG tax minimum (just look at the Asus PG348Q vs Acer X34 MSRP, same monitors mostly with the Asus costing $100 more).


----------



## Dhoulmagus

Hehe ROG tax. I like it.

I got my MG279Q (I guess not ROG but very top end at the time) for a competitive price at the time but the only alternative 144hz IPS 1440 display was the Acer XB270HU... With a much larger Gsync tax.

I can't afford to run this monitor remotely close to its potential, so no point in the 4k versions until I can put that money into graphics


----------



## loader963

Quote:


> Originally Posted by *-terabyte-*
> 
> Consider that with Asus there is a ~$100 ROG tax minimum (just look at the Asus PG348Q vs Acer X34 MSRP, same monitors mostly with the Asus costing $100 more).


Hey, on Amazon before Xmas the ROG was $100 cheaper than the X34.

Edit: I know cuz I bought one lol.


----------



## Baasha

Quote:


> Originally Posted by *boredgunner*
> 
> Well, the 4k 144 Hz is confirmed to be 27".
> 
> *8k is pointless for gaming, we can't run it at decent frame rates in anything modern*.


speak for yourself


----------



## Baasha

Quote:


> Originally Posted by *sblantipodi*
> 
> can you tell me where you read that Pascal isn't able of 4K at 144Hz please?
> as far as I know the iniltration demo used at CES was runninig HDR at 144Hz.


I meant in terms of being able to play games with all (most) settings turned up and getting good framerates - near 144fps. Two Pascal GPUs won't cut it - not even for 4K @ 120Hz.
Quote:


> Originally Posted by *ToTheSun!*
> 
> He didn't mean that in the strictest sense. He meant that modern games at 4K require more horsepower to run at 144 FPS or just below.


----------



## aberrero

Subd. I'm so antsy for a new monitor.


----------



## pez

Quote:


> Originally Posted by *OwnedINC*
> 
> Am I the only one who wants the better pixel density¿
> 
> Seems like everyone wants a 32'' for some reason


Don't get me wrong; that was my point. The pixel density is great and gaming at 4K at 27" is quite amazing. However, my issue with it is desktop use. Windows doesn't scale like Android or iOS, so the user experience outside of gaming can be quite subpar.


----------



## Firann

Quote:


> Originally Posted by *ToTheSun!*
> 
> That's because 4K isn't a requirement for "HDR". Ultra HD, as a resolution, is a requirement for the Ultra HD Premium certification, which includes the HDR10 standard. "HDR", as it stands, is not a standard; simply a reference to higher contrast and luminance than your typical "SDR" display.
> 
> A monitor being "HDR" will be the new initialism display makers will use to attach a premium to a monitor that can't bear the Ultra HD Premium certification.
> 
> This information is all neatly explained here: http://www.tftcentral.co.uk/articles/hdr.htm
> 
> As we've been accustomed, TFTC is a wealthy source of information.


That was a really insightful article! Thanks for the link.

Based on that, this monitor is the first and only (atm) monitor that adheres to the Ultra HD Premium certification. Also, based on the article's quote of the Nvidia whitepaper, "G-Sync HDR" is Nvidia's "certification" of Ultra HD Premium + high refresh rate. Basically they are going one step further and saying that monitor refresh rate is also important.

It also specifically states that the new G-Sync module is what's responsible for meeting the Ultra HD Premium certification + refresh rate, so it seems it is not just a "simple" chip that regulates refresh rate, but helps the panel meet the other UHDP criteria. In any case, it looks like they are trying to make their products UHDP certified and not just HDR ready.


----------



## Creator

Quote:


> Originally Posted by *Serious_Don*
> 
> I can't afford to run this monitor remotely close to its potential, so no point in the 4k versions until I can put that money into graphics


That's a good thing, imo. I prefer to purchase monitors I cannot fully drive right away; that way my experience continues to improve each time I upgrade the GPU. I couldn't really max out my Swift until the TXP came along. And I've had this Swift for almost 2.5 years. Now I'm ready for something better.


----------



## sblantipodi

Quote:


> Originally Posted by *Baasha*
> 
> I meant in terms of being able to play games with all (most) settings turned up and getting good framerates - near 144fps. Two Pascal GPUs won't cut it - not even for 4K @ 120Hz.


Ok, I didn't understand it before, I'm sorry


----------



## sblantipodi

Quote:


> Originally Posted by *Firann*
> 
> That was a really insightful article! Thanks for the link.
> 
> Based on that this monitor is the first and only (atm) monitor that adheres to the Ultra HD Premium certification. Also based on the article's quote on the Nvidia whitepaper, "G-Sync HDR" is Nvidia's "certificaiton" of an Ultra HD Premium + High refresh rate. Basicly they are going one step further and saying that monitor refresh rate is also important.
> 
> It also specifically states that the new G-Sync module is whats responsible for meeting the Ultra HD Premium certification + refresh rate, so it seems it is not just a "simple" chip that regulates refresh rate but helps the panel meet the other UHDP criteria. In anycase it looks like they are trying to make their products UHDP certified and not just HDR ready.


In that article it's written:
Quote:


> Asus ROG Swift PG27UQ which has an expected retail price of around £2000 GBP at the moment.


lol. £2,000 is nearly $2,500; the Asus guys are completely crazy


----------



## Mini0510

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Right. That is why I game with a 4790k, GTX 1080 and Dell S2716DG monitor. And I HATE this Dell as it is because of the size. I listened to everyone oh 27" is awesome blah blah. But using a 24" for so long. I am use to that and much prefer it. So not about price or able to "afford" it. I cant game with a monitor that big in front of my face. I am actually thinking of getting rid of this 27" and moving back down to a 24". The smaller version of the same monitor.


There's the Dell S2417DG. Don't know if you tried it.


----------



## juano

Quote:


> Originally Posted by *CallsignVega*
> 
> Well G-Sync is going to stick around because:
> 
> A: It's actually better than Freesync
> B: The fastest GPU's by far are NVIDIA
> 
> My largest concern with this monitor is that HDR/FALD is tied directly into G-Sync only mode and the screen still looks like poo IPS on the desktop and videos etc.


I'd be really surprised if, come release, this monitor didn't support HDR in Netflix and UHD Blu-rays (when those both eventually come to PC). Whether or not it does any local dimming on the desktop or in YouTube videos I wouldn't be sure of, but it'd be a huge mistake to have this thing not support HDR in HDCP 2.2-compliant video sources like UHD Blu-ray and Netflix.


----------



## rvectors

Quote:


> Originally Posted by *sblantipodi*
> 
> in that article is written:
> lol. 2000GBP is near to 2500USD, Asus guys are completely crazy


Putting aside UK prices, which are always crazy, I don't think that price will see the light of day. It's recommended retail pricing; there are titbits (internal sources) here and there that it'll be closer to $1,600... still freaking high.

Another point: won't their new pro-level 32-inch version, based on the same tech minus G-SYNC, be priced higher? If the $2,500 mark is real for the 27-inch, market-wise, wouldn't they be pushing into price ranges where professionals look elsewhere to known pro monitor brands?


----------



## CallsignVega

Quote:


> Originally Posted by *juano*
> 
> I'd be really surprised if come release this monitor didn't support HDR in Netflix and UHD Blu-rays (when those both eventually come to PC). Whether or not it does any local dimming on the desktop or in youtube videos I wouldn't be sure of, but it'd be a huge mistake to have this thing not support HDR in HDCP2.2 compliant video sources like UHD Blu-ray and Netlfix.


What bothers me is the display industry has a real track record of doing just ridiculously stupid things. I'm not 100% convinced the above you posted will happen until I see it.


----------



## profundido

Quote:


> Originally Posted by *wreckless*
> 
> hypothetically speaking.. I'm just trying to figure out the reasons/science behind them
> 
> assuming two 32" monitors.. how would a 4k 144hz panel compare to a 8k 60hz panel in terms of gaming experience?


The 4K 144Hz screen would give you a much, much nicer gaming experience in most games, because it will feel smoother when you move or look around in 3D games, which is typically what games do. The basic science summary: the 60-120Hz increase is an improvement most humans are able to perceive and thus 'feel', whereas at any reasonable viewing distance for 32" the increase from 4K to 8K no longer is, unless you hang your nose straight in front of the screen (which is unrealistic).
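The angular-resolution argument can be made concrete with a rough sketch (the 28-inch viewing distance and the ~60 px/deg acuity figure are assumptions, the latter roughly matching 20/20 vision):

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixels per inch for a given diagonal size and resolution."""
    return math.hypot(w_px, h_px) / diag_in

def pixels_per_degree(diag_in, w_px, h_px, dist_in):
    """How many pixels fit in one degree of visual angle at distance dist_in."""
    pitch = 1.0 / ppi(diag_in, w_px, h_px)  # width of one pixel, in inches
    deg_per_px = math.degrees(2 * math.atan(pitch / (2 * dist_in)))
    return 1.0 / deg_per_px

# 32" panels viewed from ~28 inches (a typical desk distance)
for name, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f'32" {name}: {ppi(32, w, h):.0f} PPI, '
          f'{pixels_per_degree(32, w, h, 28):.0f} px/deg')
```

At a normal desk distance, a 32" 4K panel already sits above the ~60 px/deg acuity threshold, so the jump to 8K buys little the eye can resolve, while 60Hz to 144Hz is plainly perceptible.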


----------



## profundido

Quote:


> Originally Posted by *pez*
> 
> Don't get me wrong; that was my point. The pixel density is great and gaming at 4K at 27" is quite amazing. However, my issue with it is desktop use. Windows doesn't scale like Android or iOS, so the user experience outside of gaming can be quite subpar.


There is new hope. I just finished testing preview build 15007 of Redstone 2 (the next Windows 10 release that we expect somewhere in March), and in this build a lot of the scaling issues are fixed or vastly improved. In particular, all Windows fonts in MMC consoles etc. now scale properly. There are also brand new options to override DPI scaling per application that work surprisingly well.

At first look, this will be a game-changer for me. The Windows 10 desktop @ 4K/27" suddenly becomes a lot more attractive again (about time...)


----------



## un1b4ll

Quote:


> Originally Posted by *profundido*
> 
> For me at first looks this will be a gamechanger. Window 10 desktop @ 4K/27" suddenly becomes alot more attractive again (about time...)


That's awesome to hear, thanks!

Can't wait for this display to come out, I agree with the poster earlier who said they prefer to get a display and let their system grow into it. Plus, Gsync's heaviest value is ~40-50fps anyways, so it makes perfect sense.


----------



## DADDYDC650

Quote:


> Originally Posted by *un1b4ll*
> 
> That's awesome to hear, thanks!
> 
> Can't wait for this display to come out, I agree with the poster earlier who said they *prefer to get a display and let their system grow into it*. Plus, Gsync's heaviest value is ~40-50fps anyways, so it makes perfect sense.


Or you can wait until GPU's are able to render most new games at 4K + 100 frames or higher. By the time most games feature HDR and GPU's are that powerful, bigger and better monitors will be out.


----------



## sblantipodi

Quote:


> Originally Posted by *rvectors*
> 
> Putting UK prices aside, that are always crazy, I don't think that price will see the light of day. It's recommended retail pricing, there are titbits (internal sources) here and there that it'll be closer to $1600.... still freaking high.
> 
> Another point, wont their new pro level 32 inch version, based on the same tech minus G-SYNCH, be priced higher? If the $2500 mark is real for the 27 inch, marketwise, wouldn't they be pushing into price ranges that professionals look elsewhere for known pro monitor brands?


If LG can sell this for $950:
https://www.overclock3d.net/news/gpu_displays/lg_s_4k_hdr10_32ud99_monitor_will_release_in_april_for_950/1

I really don't understand why Asus wants $2,000, plus or minus, for the PG27UQ


----------



## outofmyheadyo

60Hz is about as useful as a potato


----------



## darealist

Quote:


> Originally Posted by *sblantipodi*
> 
> if LG could sell this for 950 USD
> https://www.overclock3d.net/news/gpu_displays/lg_s_4k_hdr10_32ud99_monitor_will_release_in_april_for_950/1
> 
> I really don't understand why Asus want 2000 USD plus or minus for the PG27UQ


Because FALD >>> edge-lit that can't display HDR properly.


----------



## sblantipodi

Quote:


> Originally Posted by *darealist*
> 
> P
> Because FALD >>> edge-lit that can't display HDR properly.


Where did you read that the LG is edge-lit?
In any case, FALD surely isn't worth $1,000 more


----------



## DADDYDC650

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 60hz s about as useful as a potato


I've got no problems with 60Hz. I destroy newbs without issue.


----------



## CallsignVega

Quote:


> Originally Posted by *sblantipodi*
> 
> if LG could sell this for 950 USD
> https://www.overclock3d.net/news/gpu_displays/lg_s_4k_hdr10_32ud99_monitor_will_release_in_april_for_950/1
> 
> I really don't understand why Asus want 2000 USD plus or minus for the PG27UQ


I don't think you can really compare a basic 60 Hz DP 1.2 monitor with edge-lit HDR vs:

4K G-Sync DP 1.4 HDR with FALD backlight running at 144 Hz that will also get much brighter. Completely different league.

The _new DP 1.4_ G-Sync chip alone costs ~$300 and the FALD addition to the LCD panel is +$500.


----------



## juano

Quote:


> Originally Posted by *CallsignVega*
> 
> The _new DP 1.4_ G-Sync chip alone costs ~$300 and the FALD addition to the LCD panel is +$500.


The G-Sync chip costs who $300? It certainly doesn't cost Nvidia $300, and I'd be willing to bet it doesn't cost monitor manufacturers anywhere near that either, but I'd love to see a source proving me wrong. If you're saying it must cost $300 because that's how much more a G-Sync version of this would be compared to a hypothetical, otherwise identical FreeSync display, that's a pretty wild assumption.

Again, I hope you're trying to justify the price difference that we as consumers see between the two monitors, rather than saying Asus is spending $500+ per FALD compared to a typical backlight. I'd love to see any source for a typical BoM for monitor parts (I searched and couldn't find any), but until then I have a hard time imagining that a monitor Asus sells for $800 costs them $700 in parts.


----------



## CallsignVega

G-Sync chip costs the consumer $300. These are relatively low volume chips that take significant resources and monies to design and manufacture. NVIDIA has to sell them to OEM monitor manufactures, and sell them at a profit. OEM manufacturers have to put them in a monitor and sell them at a profit. If that wasn't the case, why would either party go along with it? I know everyone wants everything for free these days, but that's not how economics works.

These monitors will cost OEM's easily around $1000 to manufacture, especially when R&D costs are factored in. Now add in packaging, marketing, shipping, warranty, repair, returns, tariffs, re-seller profit margins and OEM profit margins. Welcome to $1,999. I'll be buying one on day one as I feel the cost is justified.


----------



## DADDYDC650

$2000 for a 27" monitor? LoL!


----------



## shhek0

Quote:


> Originally Posted by *profundido*
> 
> There is new hope. I just finished testing preview build 15007 of Redstone 2 (the next windows 10 release that we expect somewhere in march) and in this build ALOT of the scaling issues are fixed or vastly improved apparently. Especially all windows fonts in mmc consoles etc are now scaling properly. Also brand new extra options to override DPI scaling per application that work surprisingly well.
> 
> For me at first looks this will be a gamechanger. Window 10 desktop @ 4K/27" suddenly becomes alot more attractive again (about time...)


Good news. I'll definitely keep an eye out for the build when it comes out, as I am planning on a 27" 4K monitor in the near future. (I just have problems with big screens - my TV is 40" and I watch it from ~2.5m (~8 feet).)


----------



## pez

People seem to forget how much those 27 and 30 inch IPS Dell Ultrasharps used to be.
Quote:


> Originally Posted by *profundido*
> 
> There is new hope. I just finished testing preview build 15007 of Redstone 2 (the next windows 10 release that we expect somewhere in march) and in this build ALOT of the scaling issues are fixed or vastly improved apparently. Especially all windows fonts in mmc consoles etc are now scaling properly. Also brand new extra options to override DPI scaling per application that work surprisingly well.
> 
> For me at first looks this will be a gamechanger. Window 10 desktop @ 4K/27" suddenly becomes alot more attractive again (about time...)


That's actually really good news. I guess I'll have to pay closer attention and look into 4K again once it's a bit more stomachable.


----------



## x3sphere

Quote:


> Originally Posted by *pez*
> 
> People seem to forget how much those 27 and 30 inch IPS Dell Ultrasharps used to be.
> That's actually really good news. I guess I'll have to pay closer attention and look into 4K again once it's a bit more stomachable.


I would say people forget because the Dell 30" tanked in price quickly. I remember it launched at over $2K, and I got one the following year for just $1,200. Also, Dell frequently ran discounts.

The same doesn't seem to happen with gaming monitors. The X34 has been out for over a year now and it's still around $1,100-1,200 everywhere, only $100-200 off MSRP.


----------



## LunaTiC123

but... they're gaming monitors... GAMING for GAMERS


----------



## profundido

Quote:


> Originally Posted by *pez*
> 
> People seem to forget how much those 27 and 30 inch IPS Dell Ultrasharps used to be.
> That's actually really good news. I guess I'll have to pay closer attention and look into 4K again once it's a bit more stomachable.


After testing successfully on the test machine at work with a 24" screen, I was pleased enough with the results to take a risk and upgrade my home Windows 10 install (Redstone 1, build 1607), with the 27" 4K screen, to this new build (Redstone 2 Insider Preview build 15007). After doing so successfully in about 10-15 minutes, I can confirm I'm quite happy with it. I don't use multiple screens, so my risk profile is quite low anyway.

To help other people here who might like to experience this for themselves, I'll post a link to this build right here. Microsoft only made the ESD files for this build public a week ago, but another internet user was crazy enough to go through the work of compiling the ISO files out of them and uploading them here:

http://pastebin.com/6Nqpt6Xk

Feel free to download the appropriate ISO - AT YOUR OWN RISK - and see for yourself how you like these changes.

When you upgrade your existing build to this beta build, it will automatically put you on Microsoft's slow Insider Preview ring for Windows updates instead of the default production release ring you are probably on now.

If you're worried the build might be too buggy: I can confirm that I played my favorite games all last night and tested all my most-used programs for video editing etc. without any problem or noticeable difference so far.

If unsure, you can make an Acronis backup of your entire system drive before upgrading and revert if you don't like it.

Be bold and enjoy these new scaling options right now!


----------



## DADDYDC650

You can't assure folks that the latest beta build is safe and warn them to try at their own risk at the same time, lol!


----------



## profundido

Quote:


> Originally Posted by *DADDYDC650*
> 
> You can't assure folks that the latest beta build is safe and warn them to try at their own risk at the same time, lol!


Indeed, and that's why I did not do such a thing. I merely shared my personal experience to give others at least some frame of reference, so they know roughly what state this build is in. It makes clear that this is not, e.g., an early alpha build that crashes every 5 minutes... but let it be clear that I neither can nor will assure anything.


----------



## DADDYDC650

Quote:


> Originally Posted by *profundido*
> 
> Indeed, and that's why I did not do such a thing. I merely related my personal experience to give others at least some frame of reference, so they know roughly what sort of state this build is in. It makes clear that this is not, e.g., an early alpha build that crashes every 5 minutes... but let it be clear that I can not and will not assure anything.


Seems like you gave out some assurance by stating how all your programs were stable for hours. I wouldn't install a beta build on my main rig, that's for sure. Perhaps if I was dual-booting or on a backup rig.


----------



## CoD511

Quote:


> Originally Posted by *CallsignVega*
> 
> G-Sync chip costs the consumer $300. These are relatively low volume chips that take significant resources and monies to design and manufacture. NVIDIA has to sell them to OEM monitor manufactures, and sell them at a profit. OEM manufacturers have to put them in a monitor and sell them at a profit. If that wasn't the case, why would either party go along with it? I know everyone wants everything for free these days, but that's not how economics works.
> 
> These monitors will cost OEM's easily around $1000 to manufacture, especially when R&D costs are factored in. Now add in packaging, marketing, shipping, warranty, repair, returns, tariffs, re-seller profit margins and OEM profit margins. Welcome to $1,999. I'll be buying one on day one as I feel the cost is justified.


I might be stuck in the past, but when it became a reality, didn't it cost $200 for early access, with the later intent to only add a premium of $100 to the total monitor cost? Or did something change? I'm not aware of any change from the existing FPGA, which has simply had new logic created as needed (not a small task, it seemed), while the actual board just gets modified with extra inputs and so on and the module remains the same.

Nvidia has offered free work to developers in many ways before, simply to further the long-term expansion of the GeForce market. GPU sales are still their primary revenue, and a few monitors they provide calibration for with G-Sync, while at least breaking even on the FPGA, mean customers are locked to GeForce; no one upgrades their monitor as frequently as their GPU. They're quite happy to play the long game, and establishing themselves as a platform and not just a chip maker has been a highlight of their approach.

G-Sync also has the advantage of being the only available solution that can seemingly drive the logic for this resolution at such a huge refresh rate whilst maintaining no evident ghosting, negligible added latency, and a notable lack of overshoot. Development of the same solution by a monitor maker would be a significant investment for the display controller, and compatibility would be questionable. It's clear no other monitor controller can match the capacity of G-Sync; TFTCentral conceded G-Sync was a superb display controller for image quality and motion.

But there's no R&D cost to just implement a panel with a G-Sync FPGA calibrated per monitor by Nvidia. Nothing is needed in R&D cost from ASUS here besides making a simple PCB to host G-Sync. Maybe a fan and heatsink too; I don't think it'd run cool with such a heavy processing load 144 times per second whilst keeping every pixel to perfect timing, as I believe it dynamically controls overdrive for all of them.


----------



## CallsignVega

Yes, I find G-Sync superior to FreeSync too, for the reasons you mentioned. But don't underestimate the challenges of designing and producing a DP 1.4 TCON, which is what this new G-Sync chip is.

Just as a reference: from the HDMI 2.0 spec release to the first full 18 Gbps chips appearing in consumer electronics took ~27 months. From the DP 1.3 spec release to the first computer monitor scheduled to have that TCON (the Dell 8K, on March 23rd) will be over 30 months. That's some serious development and production time for a "simple" TCON.
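To put numbers on that, here is a rough back-of-envelope sketch of why 4K 144 Hz needs a new controller generation at all. The effective payload figures below are the commonly cited four-lane rates after 8b/10b coding (~17.28 Gbps for HBR2/DP 1.2, ~25.92 Gbps for HBR3/DP 1.3-1.4); treat them as approximations, and note the calculation ignores blanking overhead:

```python
# Raw pixel data rate for a given mode, ignoring blanking overhead.
def data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

HBR2_EFFECTIVE = 17.28  # Gbps, 4 lanes, DP 1.2 (approximate effective payload)
HBR3_EFFECTIVE = 25.92  # Gbps, 4 lanes, DP 1.3/1.4 (approximate effective payload)

for bpc in (8, 10):
    rate = data_rate_gbps(3840, 2160, 144, bpc)
    print(f"4K 144 Hz {bpc} bpc RGB: {rate:.1f} Gbps "
          f"(fits HBR2: {rate < HBR2_EFFECTIVE}, fits HBR3: {rate < HBR3_EFFECTIVE})")
```

Even 8 bpc RGB comes out around 28.7 Gbps, slightly over HBR3's effective payload before blanking is even counted, which is consistent with reports that panels like this fall back to chroma subsampling at the top refresh rates.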


----------



## Sedolf

If the 32'' PA32U is really supposed to cost around $1899, then most of the extra dough you're paying is for the FALD.
It won't have G-Sync and high refresh like its smaller gaming brother, but it's still very expensive.


----------



## DVLux

Quote:


> Originally Posted by *Sedolf*
> 
> If the 32'' PA32U is really supposed to cost around $1899, then most of the extra dough you're paying is for the FALD.
> It won't have G-Sync and high refresh like its smaller gaming brother, but it's still very expensive.


That's still $500 more than the SW320 and $800 more than the LG 32UD99. X_X What is with these mark-ups? Quantum dots?


----------



## DADDYDC650

Was honestly thinking of trying out this monitor upon release, so I hooked up my 27" Dell U2713HM as well as an Acer 34" Ultrawide 1440p monitor. Both are just too damn small for me. The 27" is small in general, and the Ultrawide needs more vertical space. Colors and contrast don't match up with my KS8000 either. Now, this new Asus has FALD and Quantum Dot, so I'm guessing the colors and contrast will be much improved. Too damn bad about the size though. I need at least a 32"-40" 16:9 or a 38" UltraWide.

Just for giggles, I ran Rocket League at 3440x1440 and 3840x1600. I need a 38" 3840x1600 with G-Sync, 144Hz, HDR + FALD....

[screenshot] 34" 3440x1440

[screenshot] 38" 3840x1600 (latest LG Ultrawide panel)


----------



## pez

I'm currently trying out 16:9 again myself with a Gsync panel, though I'm still pretty torn.


----------



## Dragonsyph

If you have the GPU(s) power to run 4K 144Hz, then you're not going to care about the price.


----------



## chrisnyc75

Quote:


> Originally Posted by *Dragonsyph*
> 
> If you have the GPU(s) power to run 4K 144Hz, then you're not going to care about the price.


Yep. And this is exactly why this sort of bleeding edge hardware can be sold at such high prices. If you put several thousand dollars (and who knows how many days) into building a cutting edge high end machine there's nothing more frustrating than being bottlenecked by the output device (i.e. monitor). Anybody who has already invested in multiple 1080s or Titans and a shiny new 10-core i7 isn't going to think twice about dropping $1500 on this monitor. And if that's not you, you don't need this monitor yet anyway.


----------



## TheCautiousOne

Quote:


> Originally Posted by *chrisnyc75*
> 
> Yep. And this is exactly why this sort of bleeding edge hardware can be sold at such high prices. If you put several thousand dollars (and who knows how many days) into building a cutting edge high end machine there's nothing more frustrating than being bottlenecked by the output device (i.e. monitor). Anybody who has already invested in multiple 1080s or Titans and a shiny new 10-core i7 isn't going to think twice about dropping $1500 on this monitor. And if that's not you, you don't need this monitor yet anyway.


Could be another way to look at this honestly.

I have dual 980 SCs in SLI. I can push most games to about 45 FPS on a 32" 2160p monitor right now.

Adding G-Sync to the mix would, I think, improve my gaming experience exponentially, but paying out $1,500 on a monitor of a certain size, etc., for something that might not be exactly what I need, or feel I warrant spending my money on, is the hindrance.

TCO


----------



## iDShaDoW

I'm waiting to see them announce the successor to the PG348Q so I can compare it and the pricing to this myself.

Does anyone know if there's anything out there in regards to that? I'll work my way backwards, but 43 pages of this thread is a lot to go through, lol...


----------



## CallsignVega

A 3440x1440 144 Hz display is also expected in Q3/Q4. Although, almost assuredly it won't have the lofty specs that this monitor has with HDR and FALD.

There is also a rumored 144 Hz version of the LG 3840x1600 38" out there for late this year. That would be real interesting.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> A 3440x1440 144 Hz display is also expected in Q3/Q4. Although, almost assuredly it won't have the lofty specs that this monitor has with HDR and FALD.
> 
> There is also a rumored 144 Hz version of the LG 3840x1600 38" out there for late this year. That would be real interesting.


38" 144Hz sounds interesting. Too bad it won't feature HDR.


----------



## BoredErica

Quote:


> Originally Posted by *CallsignVega*
> 
> A 3440x1440 144 Hz display is also expected in Q3/Q4. Although, almost assuredly it won't have the lofty specs that this monitor has with HDR and FALD.
> 
> There is also a rumored 144 Hz version of the LG 3840x1600 38" out there for late this year. That would be real interesting.


I just want a pimped out 2560 x 1440p display (but with glossy panel).

Methinks I won't get it before heat death of the universe.


----------



## DADDYDC650

Does anyone know if this monitor will have a 10 bit panel that's capable of at least 1000 nits?


----------



## ToTheSun!

Quote:


> Originally Posted by *Darkwizzie*
> 
> I just want a pimped out 2560 x 1440p display (but with glossy panel).
> 
> Methinks I won't get it before heat death of the universe.


At this rate, we'll probably get OLED monitors before we get decent glossy monitors.


----------



## Vipu

Quote:


> Originally Posted by *Dragonsyph*
> 
> If you have the GPU(s) power to run 4K 144Hz, then you're not going to care about the price.


Pretty sure a single 1080 can run that; you just have to lower settings from ultra.


----------



## CallsignVega

Quote:


> Originally Posted by *Darkwizzie*
> 
> I just want a pimped out 2560 x 1440p display (but with glossy panel).
> 
> Methinks I won't get it before heat death of the universe.


You could always use the paper towel AR film removal method on this new Asus! Haha even I would never attempt that on a $2K display.


----------



## ToTheSun!

Quote:


> Originally Posted by *CallsignVega*
> 
> You could always use the paper towel AR film removal method on this new Asus! Haha even I would never attempt that on a $2K display.


Wasn't AUO "hard" gluing the coating to the panel recently for AHVA models? I remember reading something of that sort. If that were the case, even the paper towel method wouldn't do it.


----------



## iDShaDoW

Quote:


> Originally Posted by *CallsignVega*
> 
> A 3440x1440 144 Hz display is also expected in Q3/Q4. Although, almost assuredly it won't have the lofty specs that this monitor has with HDR and FALD.


Ah, that sucks that it probably won't have HDR or FALD.

I had read articles a while back saying they were expecting monitors at 3440x1440 to be capable of native refresh of around 185Hz with the new standards, and that 1080p would be able to do 240Hz.

Hopefully they leak some details soon.


----------



## CeeeJaaay

Quote:


> Originally Posted by *iDShaDoW*
> 
> Ah, that sucks that it probably won't have HDR or FALD.
> 
> I had read articles awhile back that they were expecting monitors at 3440x1440 to be capable of native refresh of like 185Hz or so with the new standards. And that 1080p would be able to do 240Hz.
> 
> Hopefully they leak some details soon.


There's already a 1080p 240 Hz panel for sale; multiple monitors use it. As for the 3440x1440 144 Hz monitor, I don't see it coming before this time next year...


----------



## hanzy

Ah.
For one, I wish this was larger. It could be really nice next to an ultrawide for certain things.
Secondly, I don't know if I can ever go back to widescreen from ultrawide for a main PC monitor.
Good to see some HDR displays making their way to the PC market.
Enjoying the fairly limited content on my TV right now. Man in the High Castle looks really awesome in 4K HDR.

I think if an HDR, VRR (preferably G-Sync, because I'm with NV now), 100Hz+, 38" 3840x1600 released, I would be set for a while.
Of course that's what I said about my X34.


----------



## pez

Quote:


> Originally Posted by *Vipu*
> 
> Pretty sure 1x 1080 can run that, you just have to lower settings from ultra


Single 1080? Just no.


----------



## Dhoulmagus

Quote:


> Originally Posted by *pez*
> 
> Single 1080? Just no.


I get your point, but...
What about those of us who don't care to play the latest and greatest? My 280X still does just fine for me at 1440p/144Hz in the games I play, like Morrowind, Oblivion, Dark Souls (modded), Borderlands (somewhat), Half-Life, silly games like Hotline Miami, Burnout Paradise, COD 1 & 2, and so on, and those older games look way better on these newer screens. A high-end CPU was more important for my needs based on the games I prefer. On the rare occasion that I care to play something like GTA V, I just change it to 1080p and play on a large screen from farther away anyway. Big deal.

Many of us would be totally fine with a 1080 and this screen.


----------



## pez

Quote:


> Originally Posted by *Serious_Don*
> 
> I get your point, but...
> What about those of us who don't care to play the latest and greatest? My 280X still does just fine for me at 1440p/144Hz in the games I play, like Morrowind, Oblivion, Dark Souls (modded), Borderlands (somewhat), Half-Life, silly games like Hotline Miami, Burnout Paradise, COD 1 & 2, and so on, and those older games look way better on these newer screens. A high-end CPU was more important for my needs based on the games I prefer. On the rare occasion that I care to play something like GTA V, I just change it to 1080p and play on a large screen from farther away anyway. Big deal.
> 
> Many of us would be totally fine with a 1080 and this screen.


It is doable for older titles, sure, but that stipulation needs to be made clear first. Some people can and will compromise, but I'm not sure people who bought a GTX 1080 are running out to buy an $1,800 panel for the 144Hz aspect when they can only realistically hit it in AAA titles from ~5-10 years ago.


----------



## Vipu

Quote:


> Originally Posted by *pez*
> 
> It is doable for older titles, sure, but that stipulation needs to be made clear first. Some people can and will compromise, but I'm not sure people that bought a GTX 1080 are running out to buy a $1800 panel for the 144hz aspect when they can possibly run AAA titles from ~5-10 years ago.


Well, we could throw out BF1 and Doom 2016 as examples that could run 144Hz 4K with a single 1080 if you lower some settings.


----------



## pez

Quote:


> Originally Posted by *Vipu*
> 
> Well, we could throw out BF1 and Doom 2016 as examples that could run 144Hz 4K with a single 1080 if you lower some settings.


I'm happy to concede if proof is given. 144+ FPS would be impressive for a single 1080 at 4K.


----------



## Kinaesthetic

Quote:


> Originally Posted by *pez*
> 
> It is doable for older titles, sure, but that stipulation needs to be made clear first. Some people can and will compromise, but I'm not sure people that bought a GTX 1080 are running out to buy a $1800 panel for the 144hz aspect when they can possibly run AAA titles from ~5-10 years ago.


If it helps to prove that we exist: I was one of the first buyers of the PG278Q when it landed in the USA ($850 monitor after tax) and currently run a GTX 1080 in my system. The game I most often play is osu!, where with an unlocked frame rate I'm getting about 4300 fps. I rarely play current-gen AAA titles because they're almost always crap and not fun for me to play.

And yes, I'm highly debating on getting this monitor. HIGHLY debating on it.


----------



## pez

Quote:


> Originally Posted by *Kinaesthetic*
> 
> If it helps to prove that we exist, I was one of the first buyers of the PG278Q when it landed in the USA ($850 monitor after tax) and currently run a GTX 1080 in my system. The game I most often play is osu! where with unlocked frame rate, I'm getting about 4300fps. Rarely play current gen AAA titles because they are almost always generally crap and not fun for me to play.
> 
> And yes, I'm highly debating on getting this monitor. HIGHLY debating on it.


Sure, I'm not saying you and people like you don't exist, but neither statement (mine nor his) was necessarily correct.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vipu*
> 
> Well, we could throw out BF1 and Doom 2016 as examples that could run 144Hz 4K with a single 1080 if you lower some settings.
> 
> 
> 
> I'm happy to concede if proof is given. 144+ FPS would be impressive for a single 1080 at 4K.

I don't have "proof" but I just got off my computer with a 980Ti playing DOOM on a 4K monitor. I try to keep the FPS above 100 as much as possible since it drastically affects the fluidity of that game regardless of whether your monitor is 120hz or not.

Turning all the graphics settings down as much as possible except basic shadow quality, texture detail, and particles (which don't seem to affect framerate), at 4K native I usually get 70-80 fps. Of course the extremes go further but that seems to be the general range. To keep the framerate above 100fps consistently I turn the dynamic resolution scaling down to 75%, which is about 3K resolution, at that point the range fluctuates pretty wildly from 100fps to 130fps, and again extremes go above and below.

Looking at the disparity between the 1080 and 980Ti at 4K (http://www.pcgamer.com/doom-benchmarks-return-vulkan-vs-opengl/2/), according to my average 75fps with a 980Ti at 4K, it looks like a 1080 would get you to an average of about 100fps.
I'm betting someone would be comfortable using 10% downscaling with a 1080 instead of the 25% that I currently use, but not quite native 4K.
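The scaling arithmetic above can be sketched quickly. This is a naive model (the assumption is that performance is purely fill-rate bound, so fps scales inversely with rendered pixel count); real gains are smaller, as the observed 100-130 fps range shows:

```python
# DOOM-style per-axis render scale: 75% of 4K is roughly "3K" wide.
def scaled_resolution(width, height, scale):
    return round(width * scale), round(height * scale)

# Naive estimate: rendered pixels shrink with scale^2, so fps rises by ~1/scale^2.
# This overestimates real-world gains, since games are never purely fill-rate bound.
def estimated_fps(native_fps, scale):
    return native_fps / (scale * scale)

print(scaled_resolution(3840, 2160, 0.75))  # (2880, 1620)
print(round(estimated_fps(75, 0.75)))       # 133 in this model; ~100-130 observed above
```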


----------



## pez

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> I don't have "proof" but I just got off my computer with a 980Ti playing DOOM on a 4K monitor. I try to keep the FPS above 100 as much as possible since it drastically affects the fluidity of that game regardless of whether your monitor is 120hz or not.
> 
> Turning all the graphics settings down as much as possible except basic shadow quality, texture detail, and particles (which don't seem to affect framerate), at 4K native I usually get 70-80 fps. Of course the extremes go further but that seems to be the general range. To keep the framerate above 100fps consistently I turn the dynamic resolution scaling down to 75%, which is about 3K resolution, at that point the range fluctuates pretty wildly from 100fps to 130fps, and again extremes go above and below.
> 
> Looking at the disparity between the 1080 and 980Ti at 4K (http://www.pcgamer.com/doom-benchmarks-return-vulkan-vs-opengl/2/), according to my average 75fps with a 980Ti at 4K, it looks like a 1080 would get you to an average of about 100fps.
> I'm betting someone would be comfortable using 10% downscaling with a 1080 instead of the 25% that I currently use, but not quite native 4K.


Playability is definitely achievable if the person is willing to make compromises, for sure.


----------



## Dragonsyph

Well, I get about 60-70 FPS at max settings in 4K in BF1; you might just be able to get 144 on the lowest settings. I'm too lazy to load the game up and check, rofl.

But if you're spending thousands on GPUs and this monitor, you're not the kind of person who spent all that just to lower settings.

I like everything maxed out; that's why I bought a high-end GPU.


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> Well, I get about 60-70 FPS at max settings in 4K in BF1; you might just be able to get 144 on the lowest settings. I'm too lazy to load the game up and check, rofl.
> 
> But if you're spending thousands on GPUs and this monitor, you're not the kind of person who spent all that just to lower settings.
> 
> I like everything maxed out; that's why I bought a high-end GPU.


I see you're running an x34 (also?). How are you liking the 1080 for that res?


----------



## Dragonsyph

Quote:


> Originally Posted by *pez*
> 
> I see you're running a x34 (also?)? How are you liking the 1080 for that res?


I'll let you know in a few days, it should be here today. Excited to try ultra wide out.


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> I'll let you know in a few days, it should be here today. Excited to try ultra wide out.


Cool! I look forward to seeing your thoughts. I picked up a 1080 (albeit only the SC/ACX 3.0) to see what the performance is like (the Titan's noisiness was getting to me), and it seems to be a pretty solid card for the resolution. I think Crysis 3 is the only big 'compromise' title I've come across with the combo so far.


----------



## Dragonsyph

Quote:


> Originally Posted by *pez*
> 
> Cool! I look forward to seeing your thoughts. I picked up a 1080 (albeit only the SC/ACX 3.0) to see what the performance is like (Titan noisiness was getting to me) and it seems to be a pretty solid card for the resolution. I think Crysis 3 is the only big 'compromise' title I've come across so far with the combo of what I've played so far.


Nice, yeah, I love that I can't hear my card at all. Yeah, Crysis 3 is brutal. I'll be happy with 75fps+, but a solid 100 would be better for most games. I'm guessing Crysis is something around 50? I think the only game I play that will have trouble is The Witcher 3, but who needs HairWorks.


----------



## boredgunner

Quote:


> Originally Posted by *Serious_Don*
> 
> I get your point, but...
> What about those of us who don't care to play the latest and greatest? My 280X still does just fine for me at 1440p/144Hz in the games I play, like Morrowind, Oblivion, Dark Souls (modded), Borderlands (somewhat), Half-Life, silly games like Hotline Miami, Burnout Paradise, COD 1 & 2, and so on, and those older games look way better on these newer screens. A high-end CPU was more important for my needs based on the games I prefer. On the rare occasion that I care to play something like GTA V, I just change it to 1080p and play on a large screen from farther away anyway. Big deal.
> 
> Many of us would be totally fine with a 1080 and this screen.


Latest and greatest? Today's counterparts to the games you listed are typically awful in comparison (and in general).

But others have commented on "growing into" this monitor, it being future-proof; G-SYNC really allows for this. So I'd actually have no issue getting this monitor with only my GTX 1080. Ultra settings are usually overrated and not much different visually than High, I'm okay with dealing with ~60 FPS for a short time, and I'll concentrate on older and greater games (which I generally find myself playing anyway). I hope this monitor has ULMB, and an improved version of ULMB at that, using Samsung's strobing method.


----------



## CallsignVega

That's the beauty of G-Sync: it allows the monitor to flow with technology and demand. Even if you run this monitor at 80-100 FPS/Hz, it's going to be amazing versus 60 Hz displays. All with 4K clarity, HDR brightness, and FALD contrast/black depth.


----------



## caenlen

I can't wait for it personally. I will be doing my ultimate PC build in summer 2018, so I'm sure something will be even better by then; if not, this is the one I'm getting.

i7-8800k + GTX 1180 ti = my ultimate rig

just have to finish school and she will be mine... and the power she will wield!!! will make mortals tremble!!!!


----------



## DVLux

Buying and using this now isn't really "future-proof", insofar as the backlighting will dim long before you buy that "killer" GPU in the "future".


----------



## boredgunner

Quote:


> Originally Posted by *DVLux*
> 
> Buying and using this now isn't really "future-proof", insofar as the backlighting will dim long before you buy that "killer" GPU in the "future".


Yeah, high refresh rate is just one of many features of this monitor. FALD with 384 dimming zones is the only reason I'm even considering this monitor.


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> Nice, yeah, I love that I can't hear my card at all. Yeah, Crysis 3 is brutal. I'll be happy with 75fps+, but a solid 100 would be better for most games. I'm guessing Crysis is something around 50? I think the only game I play that will have trouble is The Witcher 3, but who needs HairWorks.


Yeah, I'm about 3-4 hours into the game, so the portion where you first go outside (grass-mania) is where I've seen the biggest hit so far. The game essentially runs anywhere from 60-130 with the Titan X P and the x34. Satisfyingly enough, the 1080 performs the same or +/-5% at normal 1440p. I'm currently determining whether or not I want to keep the x34 and Titan or finally have my PC quiet again like I desire with the 1440p/144hz display and the 1080. All I can say is that 144hz is super addicting in the games that you can run it at (i.e. Doom, Overwatch, etc.).


----------



## KGPrime

Quote:


> Originally Posted by *boredgunner*
> 
> FALD with 384 dimming zones is the only reason I'm even considering this monitor.


+1. It's the one choice for the first LCD monitor I will ever buy, keep, and use in my life. I'm not going from FW900s to some crappy, glowy IPS POS, nor some smeary VA POS, if I don't have to. And I won't even be gaming on it much; I'll be reading, watching YouTube, Netflix, working in Photoshop, modding games, etc. I'll happily pay 2k to whoever brings me a non-glowy, decent-contrast, high-color IPS with low input lag and high Hz, and hopefully less blur even on the desktop, that is a normal-sized PC monitor and not a 40" TV.

The ProArt is also an option, but I really do want the highest Hz for the lowest amount of blur, and more options for setting custom Hz if possible, even just for desktop usage. If the price is within a few hundred bucks, I'll take the Hz. 32" is kind of big anyway.

I generally will not waste money on things I do not need or cannot make the best use of, but this is an extreme case. For the last 15-or-whatever years that LCDs have been sold, they have always been crap, and I have been biding my time avoiding them, hoping to either outlast them until the next better thing or until they improved enough that I could tolerate them. Now my CRTs have finally all died; now is the time. Please let it be the time.

No matter how excited (yet deeply cynical) I am for it, though, there's no way I'm buying it sight unseen. Probably not even before at least one revision, as it's likely going to have some issues at launch, realistically.


----------



## Dragonsyph

Quote:


> Originally Posted by *pez*
> 
> Yeah, I'm about 3-4 hours into the game, so the portion where you first go outside (grass-mania) is where I've seen the biggest hit so far. The game essentially runs anywhere from 60-130 with the Titan X P and the x34. Satisfyingly enough, the 1080 performs the same or +/-5% at normal 1440p. I'm currently determining whether or not I want to keep the x34 and Titan or finally have my PC quiet again like I desire with the 1440p/144hz display and the 1080. All I can say is that 144hz is super addicting in the games that you can run it at (i.e. Doom, Overwatch, etc.).


Oh man, the first thing I noticed is the higher refresh rate; going from 60 to 100 I could instantly tell the difference. Overclocked to 100 and it passed the test, and it has very little backlight bleed. Getting 70-100 fps in BF1 at max settings; I think I'm CPU-limited in BF1 because stock vs. OC on the 1080 gets about the same fps.

You know of any guides for the best settings on the x34? Haven't messed with anything yet besides the OC.


----------



## mmms

Can I turn the refresh rate down to 60Hz for RPG games like The Witcher 3, and up to 144Hz for shooters like Doom and BF1, on this monitor?
Or is it locked at 144Hz all the time so I can't go down to 60Hz?


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Can I turn the refresh rate down to 60Hz for RPG games like The Witcher 3, and up to 144Hz for shooters like Doom and BF1, on this monitor?
> Or is it locked at 144Hz all the time so I can't go down to 60Hz?


You can change the refresh rate at will, but if you have a modern NVIDIA graphics card, you might as well just keep it at 144 Hz and use G-SYNC (which syncs the refresh rate to your game's frame rate as long as frame rate is a lower value than refresh rate).
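As a toy model of that behavior (the 30-144 Hz window below is an assumption for illustration; actual G-SYNC ranges vary by monitor):

```python
import math

# Inside the VRR window the panel's refresh tracks the game's frame rate;
# above it the display caps at max refresh; below it each frame is redrawn
# (low-framerate compensation) so the panel stays inside its valid range.
def effective_refresh_hz(frame_rate, vrr_min=30, vrr_max=144):
    if frame_rate >= vrr_max:
        return vrr_max
    if frame_rate >= vrr_min:
        return frame_rate
    repeats = math.ceil(vrr_min / frame_rate)  # times each frame is redrawn
    return frame_rate * repeats

print(effective_refresh_hz(90))   # 90: refresh follows frame rate
print(effective_refresh_hz(200))  # 144: capped at max refresh
print(effective_refresh_hz(20))   # 40: each frame drawn twice
```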


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> You can change the refresh rate at will, but if you have a modern NVIDIA graphics card, you might as well just keep it at 144 Hz and use G-SYNC (which syncs the refresh rate to your game's frame rate as long as frame rate is a lower value than refresh rate).


Thanks bro.
What do you mean by this:
"but if you have a modern NVIDIA graphics card, you might as well just keep it at 144 Hz and use G-SYNC (which syncs the refresh rate to your game's frame rate as long as frame rate is a lower value than refresh rate)."
I'm sorry, I'm an Arabic gamer from Egypt and I don't know English perfectly.


----------



## DVLux

Quote:


> Originally Posted by *KGPrime*
> 
> some crap glowy Ips pos


Pretty sure this is an even glowier IPS PoS. Now with individual backlights that will die at different times!


----------



## Vipu

Quote:


> Originally Posted by *mmms*
> 
> thanks bro
> what do u mean by this :
> but if you have a modern NVIDIA graphics card, you might as well just keep it at 144 Hz and use G-SYNC (which syncs the refresh rate to your game's frame rate as long as frame rate is a lower value than refresh rate).
> i'm sorry , i'm arabic gamer from egypt and i don't know english perfectly .


You don't ever have to lower the refresh rate; more is always better.


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> You can change the refresh rate at will, but if you have a modern NVIDIA graphics card, you might as well just keep it at 144 Hz and use G-SYNC (which syncs the refresh rate to your game's frame rate as long as frame rate is a lower value than refresh rate).


Thanks bro.
Can you explain?
Quote:


> Originally Posted by *Vipu*
> 
> You don't ever have to lower the refresh rate; more is always better.


I won't lower the refresh rate all the time, I mean in RPG games only. It's great with 60Hz + IPS + G-Sync.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Thanks bro.
> Can you explain?
> I won't lower the refresh rate all the time, I mean in RPG games only. It's great with 60Hz + IPS + G-Sync.


Just keep it at 144 Hz, and I suggest doing research on G-SYNC.
Quote:


> Originally Posted by *DVLux*
> 
> Pretty sure this is an even glowier IPS PoS. Now with individual backlights that will die at different times!


I don't expect this monitor to have significantly less glow than other IPS panels, but it won't have more glow than 144 Hz AHVA, that's for sure. For those saying FALD will significantly lower glow, I'm not so sure; lowering the brightness all the way on my XB270HU does not have a profound impact on IPS glow.


----------



## Vipu

Quote:


> Originally Posted by *mmms*
> 
> I won't lower the refresh rate all the time, I mean in RPG games only. It's great with 60Hz + IPS + G-Sync.


But why do you want to lower it to 60Hz?


----------



## mmms

Quote:


> Originally Posted by *Vipu*
> 
> But why do you want to lower to 60hz?


I've seen many reviews that consider 60Hz enough for RPG games like The Witcher 3, and 144Hz enough for shooter games.
At this point, I think the Acer XB321HK is enough for both shooters and RPGs, with 4K 32" IPS + G-Sync.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> I've seen many reviews that consider 60Hz enough for RPG games like The Witcher 3, and 144Hz enough for shooter games.
> At this point, I think the Acer XB321HK is enough for both shooters and RPGs, with 4K 32" IPS + G-Sync.


PG27UQ will offer much better image quality than the XB321HK.

Also, refresh rate/frame rate isn't game dependent. 60 FPS/60 Hz simply isn't enough to accurately replicate smooth motion. Higher refresh rate/frame rate is always superior.
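To put rough numbers on why higher is always better, here's a quick back-of-the-envelope sketch (Python; the 960 px/s pan speed is just an arbitrary example, not anything measured):

```python
# Per-frame interval, and how far a moving object jumps between updates.
def frame_time_ms(hz):
    """Milliseconds between refreshes at a given refresh rate."""
    return 1000.0 / hz

def step_px(hz, speed_px_per_s):
    """Pixels a moving object skips between consecutive frames."""
    return speed_px_per_s / hz

for hz in (60, 144):
    print(f"{hz:>3} Hz: {frame_time_ms(hz):5.2f} ms/frame, "
          f"{step_px(hz, 960):5.2f} px jump at 960 px/s")
```

At 60 Hz each frame persists about 16.7 ms and the object jumps 16 px at a time; at 144 Hz it's about 6.9 ms and roughly 6.7 px per step, which is a large part of why the motion reads as smoother.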


----------



## KGPrime

Quote:


> Originally Posted by *DVLux*
> 
> Pretty sure this is an even glowier IPS PoS. Now with individual backlights that will die at different times!


Well, from the videos at least, there appears to be way less glow than on the monitor next to it. I'd say the difference seems fairly significant.




As for backlights dying, it is what it is; it can also happen on an edge-lit LED monitor. And I'm trying to hold in any negativity. I don't want to jinx it before it comes out.


----------



## boredgunner

^^^ Wow! Looks like a truly negligible amount of IPS glow. Now I want. If it's as good as it looks in those pictures (or better) then screw VA. But too bad it's not glossy...


----------



## DVLux

Looks more like they jacked up the brightness on the other monitor... Regardless, you do know they are going to pick the best looking one to show the monitor off, right?


----------



## boredgunner

Quote:


> Originally Posted by *DVLux*
> 
> Looks more like they jacked up the brightness on the other monitor... Regardless, you do know they are going to pick the best looking one to show the monitor off, right?


Yeah, that's why I included the second-to-last sentence in my previous post.

Also, my XB270HU's glow at 18 brightness is similarly bad at such a steep angle: a black screen (pretty much the whole screen, too) turns bright grey.


----------



## KGPrime

Quote:


> Originally Posted by *DVLux*
> 
> Looks more like they jacked up the brightness on the other monitor... Regardless, you do know they are going to pick the best looking one to show the monitor off, right?


You would think they would, but no, it's not a given at all. They don't always send cherry-picked samples to reviewers either; depending on the situation, they likely send out the same beat-up sample that's been to a handful of other reviewers and bounced across the continent a half dozen times. For trade shows, it's either a prototype (which could be inherently half-baked) or a sample from assembly that had to be shipped to sales, because sales and promo departments generally aren't connected to the production line, or to the warehouse where boxed products are stored, to be opening boxes and cherry-picking valuable product.

It also, perhaps, lets them say: this is what it is. If anyone "expected" anything different, they can say they never claimed or showed anything but what it actually is, and perhaps that's smarter than claiming perfection. The fact that manufacturers call some bleed, some glow, or a few dead pixels "acceptable" and part of the technology of your $800-1,000 panel kind of alludes to that.

I'm sure a behemoth company like LG, when promoting a new flagship OLED 8K curved 100" whatever, will certainly demo the cream of the crop, because it's a huge reveal with a much larger audience: a much bigger piece of news and a much bigger picture on CNET or wherever. Your gaming PC monitor or business-class display gets a blurb and a promo shot. They don't sell in the numbers the newest TVs do, profit margins are actually pretty thin, and business-class monitors are what businesses buy in bulk; no one gives a crap whether those have glow, because they're used for work under fluorescent lights all day with cup-o-noodles splatter all over them.

Also, if you've worked any number of years in your life, especially around sales or promo departments and the types of people who work in them, you'll find most of them don't have a clue, or don't give the slightest crap, about the kind of stuff actual users care about and the things discussed on this forum. They're like used-car salesmen. You could point an issue out to them and they wouldn't see it, or wouldn't care; they're so self-absorbed they think everyone else must think the way they do. You'd be an annoyance, if anything, to these types of people. They aren't the engineer (artist) who probably started out with a love of his craft and was then browbeaten by the bean counters into sucking it up, considering the bottom line, and building something half-baked compared to what he envisioned, especially in a larger company where the right hand doesn't know what the left hand is doing and it's really all about the bottom line and the shareholders.

I work for a small company where we send product to events and showrooms at least twice a year, and there are times the actual president of the company is on the floor hovering over everyone, barking about the slightest little detail, because he's going to be there himself standing next to it, so everyone has the mentality that it has to be perfect. Other times he comes down while people are killing themselves trying to get it right and he's like: hurry up, it's fine, box it up, and it goes out with shoddy spots all over it, lol. It really comes down to who the client is, what the event or product is, and how much he stands to gain from it.

Think about any company you've worked for: the chain of command, the ****** supervisors with negative attitudes, pissed off at ex-wives or at how much they lost at the casino, getting yelled at constantly by their boss about the bottom line and then taking it out on the pissant beneath them. You'd be putting far too much faith in them to think they actually care enough to bother cherry-picking anything. And trade shows, and the weeks before them, are generally chaos for most companies; it's a feat if everything even gets there in one piece.

Edit: Some of this is opinion, some of it life experience, and some of it reasoning based on that experience. If anyone knows otherwise first-hand, or works for Samesung (30 Rock reference) or Acer or Asoos, please elaborate and/or correct.


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> OH man first thing i can notice is the higher refresh rate, going from 60 to 100 i could instantly notice the difference. OC to 100 and past the test, has very little back light bleed. Getting 70-100 fps in bf1 max settings, think im cpu limited in bf1 because stock vs OC 1080 gets about the same fps.
> 
> You know of any guides for best settings on the x34? Haven't messed with anything yet besides OC.


Unfortunately I don't. I was running the x34 with the Titan for the longest time, so the most I was doing was cutting AA or just using FXAA here and there as needed. All things considered, I wasn't terribly happy with the performance of the 1080 + x34. This is more personal preference, though, as I didn't wish to cut much more than AA in most titles and still wanted to take advantage of the higher refresh rate (i.e. running things at or above 100 fps). In that respect, the Titan spoiled me. Keeping AA lower and texture detail and lighting options in the medium-to-high range, you'll see a nice bump in FPS in most games with a very negligible difference in image quality.

At this time, the GTX 1080 + 16:9 1440p performs extremely similarly to the Titan + 21:9 1440p. Because the noise is low on the ACX 3.0 and I get the performance I want out of the 16:9 panel, I've gone this route based on my personal preferences. Plus I get 144 Hz; while not as drastic as the move from 60 to 100, it's still noticeable and sometimes even addicting depending on the game.


----------



## The Robot

Quote:


> Originally Posted by *KGPrime*
> 
> You would think they would, but no, it's not a given at all. They don't always send cherry-picked samples to reviewers either; depending on the situation, they likely send out the same beat-up sample that's been to a handful of other reviewers and bounced across the continent a half dozen times. For trade shows, it's either a prototype (which could be inherently half-baked) or a sample from assembly that had to be shipped to sales, because sales and promo departments generally aren't connected to the production line, or to the warehouse where boxed products are stored, to be opening boxes and cherry-picking valuable product.
> 
> ...


Yeah, this is all pretty sad. I think we need some small independent monitor maker, like Overlord Computers, that just gives people what they want without all the bull. Today everyone just copies what the other guy is doing: first Asus, then Acer, then ViewSonic, same thing.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *mmms*
> 
> I've seen many reviews that consider 60 Hz enough for RPG games such as The Witcher 3, and 144 Hz enough for shooters.
> At this point, I think the Acer XB321HK is enough for both shooters and RPGs with its 4K 32" IPS panel + G-Sync.

Artificial frame-rate limits are used to avoid tearing when your monitor runs at a single fixed refresh rate, but with G-Sync the monitor inherently stays synchronized with the game's frame rate, so there is never a need to artificially limit the frame rate.
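A toy model of the difference, if it helps (Python; the render times are made-up numbers, and this is a deliberately simplified picture of scan-out, not how the G-Sync module actually works):

```python
import random

random.seed(1)

# Made-up render times for 200 frames, jittering around ~12 ms (~83 fps).
render_ms = [random.uniform(9.0, 15.0) for _ in range(200)]

def fixed_refresh_tears(times_ms, refresh_hz=60.0):
    """Count buffer flips that land mid-scan-out on a fixed-rate monitor.

    Without v-sync, a flip tears unless it happens to coincide with the
    start of a scan-out; with a free-running render clock, that is almost
    every single frame.
    """
    period = 1000.0 / refresh_hz
    t, tears = 0.0, 0
    for r in times_ms:
        t += r
        phase = t % period
        if min(phase, period - phase) > 0.1:  # not on a refresh boundary
            tears += 1
    return tears

def vrr_tears(times_ms):
    """With VRR, scan-out starts whenever the frame is ready: nothing tears."""
    return 0

print(fixed_refresh_tears(render_ms), vrr_tears(render_ms))
```

On a fixed 60 Hz display nearly every flip in this simulation lands mid-scan-out, which is exactly the tearing a frame cap or v-sync tries to paper over; within the VRR range the problem simply doesn't exist.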


----------



## Dragonsyph

Which of these is what? Is backlight bleed the white or the yellow coming through, and is IPS glow the yellow or the white light coming through?


----------



## pez

Quote:


> Originally Posted by *Dragonsyph*
> 
> Which of these is what? Is backlight bleed the white or the yellow coming through, and is IPS glow the yellow or the white light coming through?


BLB is usually what you see coming from the edges of the panel. IPS glow is usually a broader brightness you can see across a larger area of the screen.

IPS Glow:
http://i.imgur.com/nTeewPdl.jpg

BLB:
http://d2rormqr1qwzpz.cloudfront.net/uploads/0/1761/8262-150013174_062eba9de9_z.jpg


----------



## ToTheSun!

Quote:


> Originally Posted by *boredgunner*
> 
> ^^^ Wow! Looks like a truly negligible amount of IPS glow. Now I want. If it's as good as it looks in those pictures (or better) then screw VA. But too bad it's not glossy...


To be fair, VA will never have that amount of blooming.

We need to convince TPVision to mass produce some semi-glossy 100+ Hz VA displays!


----------



## DVLux

Quote:


> Originally Posted by *KGPrime*
> 
> You would think they would, but no, it's not a given at all. They don't always send out cherry picked samples to reviewers either, in fact depending, they likely send out the same beat up sample that's been to a handful of other reviewers and bounced across the continent a half dozen times. For trade shows, it's either a prototype, ( which could be inherently half assed ) or it's a sample from assembly that they had to have shipped to sales, because sales and promo aren't generally connected to the production line or the warehouse in which boxed products are stored to be opening boxes cherry picking valuable product.
> 
> .


It's 3-4 months~ till product release... If this is some god-awful prototype that barely works... Well... I've got a country to sell you.


----------



## KGPrime

Quote:


> Originally Posted by *DVLux*
> 
> It's 3-4 months~ till product release... If this is some god-awful prototype that barely works... Well... I've got a country to sell you.


I think I made it pretty clear no one is going to sell me anything. I'm the original LCD hater.

Let's hope we're wrong.


----------



## sl4ppy

Quote:


> Originally Posted by *DVLux*
> 
> It's 3-4 months~ till product release... If this is some god-awful prototype that barely works... Well... I've got a country to sell you.


"Q3" in Asus-speak means 2018. It was at least 6 months between the stated release date of the PG348Q and people actually being able to buy it.


----------



## CallsignVega

Quote:


> Originally Posted by *sl4ppy*
> 
> "Q3" in Asus-speak means 2018. It was at least 6 months between the stated release date of the PG348Q and people actually being able to buy it.


Yup, Asus is very slow on displays. I expect the Acer version around Q3/Q4, the Asus in 2018.


----------



## Clukos

The video from Linus is very promising: very limited BLB, if any at all (a night-and-day difference from the IPS panel right beside it).

If that ends up being the case for 90%+ of these monitors I might end up getting one. Do we know if the HDMI port is 2.0 or 2.1? I assume they'll support HDR over HDMI as well, but I haven't read anything about it. Also, do we know if the Acer equivalent will be using FALD as well, or is it exclusive to this monitor?


----------



## boredgunner

We need a glossy version.


----------



## juano

Quote:


> Originally Posted by *Clukos*
> 
> The video from Linus is very promising: very limited BLB, if any at all (a night-and-day difference from the IPS panel right beside it).
> 
> If that ends up being the case for 90%+ of these monitors I might end up getting one. Do we know if the HDMI port is 2.0 or 2.1? I assume they'll support HDR over HDMI as well, but I haven't read anything about it. Also, do we know if the Acer equivalent will be using FALD as well, or is it exclusive to this monitor?


I don't have a source right in front of me, but Acer appears to be using the same panel; they've confirmed at least the same specs: 384 FALD zones, 1,000-nit brightness, etc.


----------



## Clukos

Quote:


> Originally Posted by *juano*
> 
> I don't have a source right in front of me, but Acer appears to be using the same panel; they've confirmed at least the same specs: 384 FALD zones, 1,000-nit brightness, etc.


Nice! It's always better to have more options


----------



## subtec

If only the 32" ProArt version supported higher refresh rates and FreeSync/G-Sync, that would be a winner. As is, 4K at 27" and the "gamer" looks of this thing put me off.


----------



## un1b4ll

Quote:


> Originally Posted by *DVLux*
> 
> It's 3-4 months~ till product release... If this is some god-awful prototype that barely works... Well... I've got a country to sell you.


Did Asus burn your village and murder your family or something?


----------



## DVLux

Quote:


> Originally Posted by *un1b4ll*
> 
> Did Asus burn your village and murder your family or something?


M$py told NGreedia that an AMD Freesync Rebellion was starting in my village... So Lord Huang Di Leather Jacket torched the whole village.

...But that's beside the point.

The overarching point is that just because it looked uber-fantastic at CES doesn't mean the one you order will be as immaculate. I mean, if you want to set yourself up for that kind of letdown, like everyone in the 144 Hz IPS clubs did when those models were announced, then be my guest?


----------



## boredgunner

Quote:


> Originally Posted by *DVLux*
> 
> M$py told NGreedia that an AMD Freesync Rebellion was starting in my village... So Lord Huang Di Leather Jacket torched the whole village.
> 
> ...But that's beside the point.
> 
> The overarching point is that just because it looked uber-fantastic at CES doesn't mean the one you order will be as immaculate. I mean, if you want to set yourself up for that kind of letdown, like everyone in the 144 Hz IPS clubs did when those models were announced, then be my guest?


Backlight bleed shouldn't be much of a problem at least, due to the full array design. But I do expect a panel lottery nonetheless. Dead pixels galore, dust specks and what not. I wonder if IPS glow is truly eliminated by the FALD though. Based on those photos it is, but like you said it might be a cherry picked sample.


----------



## BoredErica

I don't want to spend $500+ on a monitor and not have it tick all the boxes.

...Boxes like a glossy panel or a non-idiotic QC process. And now with 4K I'm given either a resolution that's challenging on 27 inches or a monitor larger than I want, and in both cases a graphics card that can't drive the games I want. gr8 m8


----------



## un1b4ll

Quote:


> Originally Posted by *DVLux*
> 
> M$py told NGreedia that an AMD Freesync Rebellion was starting in my village... So Lord Huang Di Leather Jacket torched the whole village.
> 
> ...But that's beside the point.
> 
> The overarching point is that just because it looked uber-fantastic at CES doesn't mean the one you order will be as immaculate. I mean, if you want to set yourself up for that kind of letdown, like everyone in the 144 Hz IPS clubs did when those models were announced, then be my guest?


We need Spielberg to make that movie. I'd totally watch it.

But what about those of us that have 144hz IPS and are totally happy with it?


----------



## sblantipodi

I am planning to buy this monitor, but what connection is needed to support 4K at 144 Hz with G-SYNC?

The new GTX 1080 Ti has "only" HDMI 2.0b; is that HDMI version sufficient for such a huge resolution/refresh-rate combination?
If not, should we use DisplayPort 1.4?

Any news on this monitor?


----------



## Benny89

If only it were 32" or 40"... I'd grab it and be settled for 2018.

27" is a joke for 4K. We need bigger screens for 4K!


----------



## jezzer

Hopefully there will be a 1440p version too, now that there is a true 1440p 100+ Hz GPU like the 1080 Ti. HDR will be a bigger jump than 4K.


----------



## CallsignVega

No, only DP 1.4 can handle such bandwidth until HDMI 2.1 comes out.


----------



## PostalTwinkie

Quote:


> Originally Posted by *subtec*
> 
> If only the 32" ProArt version supported higher refresh rates and FreeSync/G-Sync, that would be a winner. As is, 4K at 27" and the "gamer" looks of this thing put me off.


Acer will likely have you covered with the less "gamer"-looking of the displays. Even though they're named Predator displays, they're still pretty understated by comparison; mine isn't too offensive.


----------



## tygeezy

This monitor will look phenomenal at 27 inches. My word, the pixel density. 1080p won't look too shabby either, since it divides evenly into 4K (each 1080p pixel maps to a 2x2 block of 4K pixels), so it should look much like the 27-inch 1080p monitors sold in the past. For games where you need ultra-high frame rates, dialing down to 1080p wouldn't be a bad option, since there's no way in hell you're getting 144 frames per second at 4K in a lot of shooters. G-Sync will also be so nice for 4K: you can cap the frame rate at something like 40 FPS and have a very smooth gaming experience with consistent frame times and gorgeous graphics.
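For anyone who wants the actual density numbers behind this, a quick calculation (Python):

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(3840, 2160, 27), 1))  # 4K at 27": ~163 PPI
print(round(ppi(1920, 1080, 27), 1))  # old 27" 1080p: ~82 PPI

# 1080p divides evenly into 4K: each 1080p pixel is a 2x2 block of 4K pixels.
assert 3840 // 1920 == 2 and 2160 // 1080 == 2
```

That exact 2x2 mapping is why 1080p on this panel should scale cleanly, assuming the scaling is done as integer scaling rather than a blurry interpolation.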


----------



## tygeezy

Quote:


> Originally Posted by *jezzer*
> 
> Hopefully there will be a 1440p version too, now that there is a true 1440p 100+ Hz GPU like the 1080 Ti. HDR will be a bigger jump than 4K.


While HDR is outstanding, I disagree that it's a bigger leap than 4K. Also, HDR tends to add input lag, judging by what I've seen from HDR-enabled televisions. Gaming monitors tend to have better overall input lag than general-purpose televisions, so it shouldn't be as bad, but we're also more sensitive to input lag on PC, with mice being so much more precise than gamepads.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> No, only DP 1.4 can handle such bandwidth until HDMI 2.1 comes out.


This is the answer. This monitor is DisplayPort 1.4.
Quote:


> Originally Posted by *jezzer*
> 
> Hopefully there will be a 1440p version too, now that there is a true 1440p 100+ Hz GPU like the 1080 Ti. HDR will be a bigger jump than 4K.


1440p is so 2015. It was always just a stepping stone until flagship GPUs could handle 4k. Good riddance, 1440p.

HDR is only supported by maybe four PC games, and even in those it won't be a bigger leap in visual fidelity.
Quote:


> Originally Posted by *tygeezy*
> 
> This monitor will look phenomenal at 27 inches. My word, the pixel density. 1080p won't look too shabby either, since it divides evenly into 4K, so it should look much like the 27-inch 1080p monitors sold in the past. For games where you need ultra-high frame rates, dialing down to 1080p wouldn't be a bad option, since there's no way in hell you're getting 144 frames per second at 4K in a lot of shooters. G-Sync will also be so nice for 4K: you can cap the frame rate at something like 40 FPS and have a very smooth gaming experience with consistent frame times and gorgeous graphics.


Well, current G-SYNC monitors don't support resolution scaling in the monitor at all; the GPU has to do it. If that's the case here too, wouldn't 1080p still look like crap?

The pixel density is nice, but I would rather have 32". Oh well, can't get it all. I'd rather have glossy too.


----------



## Malinkadink

It's almost strange that they went with 4K 144 Hz HDR before even doing 1440p 144 Hz HDR; in the same bandwidth, 1440p could do 240 Hz SDR and 170 Hz HDR. They're most likely going to use two DP 1.4 ports just to run the monitor at 120 Hz, then Display Stream Compression to get the rest of the way to 144 Hz with HDR enabled. Apparently VESA's DSC is visually lossless, but I still don't like the word "compression".

EDIT:

From the wiki page:

DisplayPort version 1.4 was published March 1, 2016.[21] No new transmission modes are defined, so HBR3 (32.4 Gbit/s) as introduced in version 1.3 still remains as the highest available mode. DisplayPort 1.4 adds support for Display Stream Compression 1.2 (DSC), Forward Error Correction, HDR10 extension defined in CTA-861.3, the Rec. 2020 color space, and extends the maximum number of inline audio channels to 32.[22]

DSC is a "visually lossless" encoding technique with up to 3:1 compression ratio.[21] Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680×4320) at 60 Hz with 10-bit color and HDR, or 4K UHD (3840×2160) at 120 Hz with 10-bit color and HDR. 4K at 60 Hz with 10-bit color and HDR can be achieved without the need for DSC. On displays which do not support DSC, the maximum limits are unchanged from DisplayPort 1.3 (4K 120 Hz, 5K 60 Hz, 8K 30 Hz).[23]

So DP 1.3 and DP 1.4 have the same bandwidth limits; 1.4 just adds DSC 1.2. A single DP 1.4 link using DSC would be capable of running 4K at 144 Hz SDR, but you would need that second DP connection to get 144 Hz with HDR.
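Here's the arithmetic behind that conclusion as a rough sketch (Python). It uses the commonly cited ~25.92 Gbit/s usable HBR3 payload (32.4 Gbit/s raw minus 8b/10b coding overhead) and ignores blanking intervals, so treat the numbers as approximate:

```python
HBR3_PAYLOAD_GBPS = 25.92  # usable DP 1.3/1.4 payload after 8b/10b coding

def raw_gbps(w, h, hz, bits_per_channel, channels=3):
    """Uncompressed video data rate in Gbit/s (blanking ignored)."""
    return w * h * hz * bits_per_channel * channels / 1e9

sdr_8bit = raw_gbps(3840, 2160, 144, 8)    # ~28.7 Gbit/s
hdr_10bit = raw_gbps(3840, 2160, 144, 10)  # ~35.8 Gbit/s

# Both exceed a single HBR3 link, hence DSC, chroma subsampling,
# or a second cable to reach 144 Hz.
print(sdr_8bit > HBR3_PAYLOAD_GBPS, hdr_10bit > HBR3_PAYLOAD_GBPS)
```

Even 8-bit 4K 144 Hz overshoots a single HBR3 link before blanking overhead is counted, which is why every proposed solution involves compression, subsampling, or more lanes.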


----------



## CallsignVega

It will only be a single cable, so I believe the panel will run via the G-Sync module over DP 1.4 at 4K 144 Hz HDR with an 8-bit + FRC color space.


----------



## Malinkadink

Quote:


> Originally Posted by *CallsignVega*
> 
> It will only be a single cable, so I believe the panel will run via the G-Sync chip DP 1.4 at 4K 144 Hz *HDR at 8-Bit+FRC color space*.


How lame, what a crutch.


----------



## KGPrime

Flanders Scientific said this years ago about FRC on their multi-thousand-dollar field and broadcast monitors.

*"FSI Broadcast Monitor General FAQs: I see that the LM-2461W has a 10 bit (FRC) LCD panel, what does this mean?*

The LM-2461W is capable of a 10 bit color depth by utilizing an 8 bit panel driver along with the latest built in FRC algorithm. This allows you to accurately reproduce approximately 1.073 Billion Colors. This advanced FRC is NOT spatial dithering and practically speaking an 8 bit panel with FRC will be mostly indiscernible from a native 10 bit LCD monitor with respect to bit depth. With respect to a discernible bit depth advantage a native 10 bit panel may have slightly less noise in extreme lowlights, but this is a very marginal difference that is typically not noticeable outside of very specific test patterns.

There is a misconception that 8bit with FRC may generate unacceptable artifacts for professional video monitors. Modern day FRC algorithms provide very accurate color reproduction and will actually produce significantly less artifacts when viewing a 10 bit video signal than an 8 bit panel without FRC, which will show some degree of banding and exponentially fewer colors. Native 10 bit panels are of course the best solution, but the price premium is quite significant for a very marginal improvement in performance. 8 bit + FRC is becoming the new norm in many high-end consumer and professional displays. The difference between 8 bit monitors with advanced FRC and native 10 bit monitors is so negligible that many manufacturers now simply refer to both technologies as 10 bit. In the interest of complete clarity and full disclosure FSI will continue to list native 10 bit panels simply as 10 bit and panels with FRC as 10 bit (FRC)."
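For the curious, the core idea behind FRC is temporal dithering: flicker each pixel between the two nearest 8-bit levels so its time-average lands on the requested 10-bit level. A toy sketch of the principle (Python; this is not any vendor's actual algorithm):

```python
def frc_sequence(level_10bit, n_frames):
    """Approximate a 10-bit level on an 8-bit panel via temporal dithering.

    There are four 10-bit steps per 8-bit step, so we show the next-higher
    8-bit level on frac/4 of the frames and the base level on the rest,
    spreading the brighter frames evenly with an error accumulator.
    (The top code 1023 would overflow to 256; real hardware clamps,
    which this toy ignores.)
    """
    base, frac = divmod(level_10bit, 4)
    frames, err = [], 0
    for _ in range(n_frames):
        err += frac
        if err >= 4:
            frames.append(base + 1)
            err -= 4
        else:
            frames.append(base)
    return frames

# Request 10-bit level 513 (between 8-bit levels 128 and 129):
seq = frc_sequence(513, 400)
avg_10bit = sum(seq) / len(seq) * 4
print(sorted(set(seq)), avg_10bit)  # the eye averages the flicker to ~513
```

At 144 Hz the alternation is far above the flicker-fusion threshold, which is why, as the FSI FAQ says, a good FRC implementation is mostly indistinguishable from native 10-bit.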


----------



## DesmoLocke

These are the quality posts I come to OCN for.


----------



## Sempre

Quote:


> Originally Posted by *KGPrime*
> 
> Flanders Scientific said this years ago about FRC on their multi-thousand-dollar field and broadcast monitors.
> 
> ...


Thank you for this


----------



## MonarchX

How exactly are they planning to make an IPS screen deliver such an awesome contrast ratio and HDR support? I assume they'll use local-dimming zones, like high-end VA TVs?

EDIT: BOO! It will use only 384 zones to provide the uber contrast ratio, which means a halo effect. You need well over 3,000 zones to produce an image anywhere near as good as OLED.

I would not buy this; I'd just wait for OLED, which is ramping up quite nicely. I'd also rather get a 4K TV than a 27" 4K monitor...
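For a sense of scale: assuming the commonly reported 24 x 16 grid for the 384 zones (the grid shape is an assumption here, not something ASUS has confirmed), the rough zone geometry on a 27" 16:9 panel works out as follows (Python):

```python
import math

DIAG_IN, AR_W, AR_H = 27.0, 16, 9
ZONES_W, ZONES_H = 24, 16  # assumed 24 x 16 = 384-zone grid

# Physical width/height from the diagonal and the aspect ratio
scale = DIAG_IN / math.hypot(AR_W, AR_H)
width_in, height_in = AR_W * scale, AR_H * scale

print(f"panel: {width_in:.2f} in x {height_in:.2f} in")
print(f"zone:  {width_in / ZONES_W:.2f} in x {height_in / ZONES_H:.2f} in")
print(f"pixels per zone: {3840 // ZONES_W} x {2160 // ZONES_H}")
```

Each zone ends up covering roughly a one-inch-wide patch (160 x 135 pixels), so a small bright object on a black background lights up a region far larger than itself; that's the halo effect being described.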


----------



## boredgunner

Quote:


> Originally Posted by *MonarchX*
> 
> How exactly are they planning to make an IPS screen deliver such an awesome contrast ratio and HDR support? I assume they'll use local-dimming zones, like high-end VA TVs?
> 
> EDIT: BOO! It will use only 384 zones to provide the uber contrast ratio, which means a halo effect. You need well over 3,000 zones to produce an image anywhere near as good as OLED.
> 
> I would not buy this; I'd just wait for OLED, which is ramping up quite nicely. I'd also rather get a 4K TV than a 27" 4K monitor...


384 zones across 27" should be less offensive than 384 zones across 50", at least. Once a monitor like this (or a VA equivalent) drops to $1,500 or less, I'll get one and keep it until there's an OLED monitor with similar features.


----------



## rvectors

Quote:


> Originally Posted by *MonarchX*
> 
> How exactly are they planning to make an IPS screen deliver such an awesome contrast ratio and HDR support?
> 
> ...


Yes, the TV market has OLED as a bit more mainstream, but the PC monitor market is basically fubar, so the comparison comes down to preference. I'm comparing it to what we've been stuck with for 10-15 years, and the ASUS (and the missing-in-action Acer) HDR monitors look really great* compared to what's available now. Of course, we need more than a few CES videos to get a true picture, no pun intended.

* I thought I detected some halo effects, but I'd accept that for no BLB and barely perceptible glow.


----------



## MonarchX

Too bad even a single 1080 Ti cannot do 4K at 60fps in most games. Hell, a GTX 1080 with a good OC cannot do a consistent 60fps at 1080p in GTA 5 all maxed out with ENB MSAA enabled. I still have to lower grass quality to High in that game to get consistently smooth, fluid motion. I will actually upgrade to a 1080 Ti for 1080p gaming at higher FPS. I fell in love with Titanfall 2.

I also find resolution overrated. Look at how much better 4x OGSSAA makes 1080p games look, which means even 1080p has not been fully milked yet, let alone 1440p! IMHO, reduced aliasing is the greatest benefit of higher resolution, but more and more games now use TAA + post-process sharpening and/or 4K downscaling/4x SSAA to achieve aliasing-free picture. You really need at least 5K downscaling + TAA/FXAA, if not 8K (maybe even 16K!), to fully maximize 1080p to a point where no single (or visible) pixel stands out as a jaggy. With that said, true 4K/5K/8K/16K will always look better than 1080p with whichever level of downscaling.
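The ordered-grid supersampling idea above boils down to a render-high, box-filter-down loop, sketched here. The `edge()` function is a hypothetical stand-in scene, not any real engine's API:

```python
# Minimal ordered-grid supersampling (OGSSAA) sketch: shade each pixel at
# a factor x factor grid of subsample positions and box-filter the result.

def edge(x, y):
    # hypothetical scene: a hard diagonal edge, 1.0 above it, 0.0 below
    return 1.0 if y > x else 0.0

def render(width, height, factor=2):
    img = []
    for py in range(height):
        row = []
        for px in range(width):
            # average factor x factor subsamples inside this pixel
            total = sum(
                edge((px + (i + 0.5) / factor) / width,
                     (py + (j + 0.5) / factor) / height)
                for i in range(factor) for j in range(factor))
            row.append(total / factor ** 2)
        img.append(row)
    return img

img = render(4, 4)
# pixels straddling the edge get intermediate shades instead of hard 0/1 steps
print(img[1][1])
```

With `factor=1` the same code degenerates to plain aliased sampling, which makes the before/after comparison easy to see.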

G-Sync will help with low FPS at 4K, which is why I wonder why they decided to go with 144Hz... No rig can do that, and it would make more sense to make a 60Hz panel with more zones than a 144Hz panel with fewer zones.

Halo effect from zones is extremely annoying, more annoying than low contrast ratio. It also makes it difficult to calibrate the screen properly. Besides, most zoned TV's are VA TV's that these days have native static ~4500:1 contrast ratio without the zones. The halo effect is already annoying when you see pitch black levels transition to light black levels of a VA display, but IPS displays have very low native static contrast ratio, which means way more pronounced halo effect as pitch black levels transition to light gray levels w/ addition of IPS glow, making halo effect very obvious. Now, a while back, Panasonic introduced new IPS technology prototype TV with fewer or same number of zones and it did pull off an excellent image without much halo effect (although it depends on the content viewed). If this monitor uses this new type of IPS technology that has relatively good native static contrast ratio, then there is hope.

Personally, I would rather add another grand or two and buy a good OLED HDR 4K TV for $2500-3000 than spend more than $750 on any monitor, simply because TVs use higher quality panels and hardware. I don't think I will upgrade any of my displays until OLED, 4K, HDR, and Rec. 2020 become mainstream standards. Until then this early adoption feels like alpha/beta-testing hardware. People who buy 4K TVs now will have to get newer 4K TVs later when Rec. 2020 becomes possible and mainstream, although I think some Samsung model has already achieved it. It will be at least 2-3 years, if not more, before Rec. 2020 replaces Rec. 709 / sRGB, which games will continue to use for those 2-3 years with or without HDR effects.


----------



## boredgunner

Quote:


> Originally Posted by *MonarchX*
> 
> Too bad even a single 1080 Ti cannot do 4K at 60fps in most games. Hell, GTX 1080 with good OC cannot do consistent 60fps at 1080p in GTA 5 all maxed out with ENB MSAA enabled. I still have to lower grass quality to High in that game to get consistent smooth fluid motion. I will actually upgrade to 1080 Ti for 1080p gaming at higher FPS. I fell in love with Titanfall 2.


You mean to say most 2015 and newer AAA games, not most games. Even so, for most of them one would only have to lower a few rather insignificant settings. For GTA 5, performance crushing ENB's would have to be avoided.
Quote:


> Originally Posted by *MonarchX*
> 
> I also find resolution overrated. Look at how much better 4x OGSSAA makes 1080p games look, which means even 1080p has not been fully milked yet, let alone 1440p! IMHO, reduced aliasing is the greatest benefit of higher resolution, but more and more games now use TAA + post-process sharpening and/or 4K downscaling/4x SSAA to achieve aliasing-free picture. You really need at least 5K downscaling + TAA/FXAA, if not 8K (maybe even 16K!), to fully maximize 1080p to a point where no single (or visible) pixel stands out as a jaggy. With that said, true 4K/5K/8K/16K will always look better than 1080p with whichever level of downscaling.


You say resolution is overrated, and then praise OGSSAA which effectively multiplies the rendering resolution.

But native resolution (in this case 4K) will always be better than an equivalent supersampled one (in this case 1080p with 4x SSAA). SSAA will get you similar levels of aliasing but not the overall clarity, which you admitted yourself.

Besides, only a minority of modern games are compatible with SSAA. Resolution scale sliders are getting a bit more common, but most of them are just simpler downsampling methods that aren't as effective as SGSSAA or OGSSAA.
Quote:


> Originally Posted by *MonarchX*
> 
> G-Sync will help with low FPS at 4K, which is why I wonder why they decided to go with 144Hz... No rig can do that and it would make more sense to make a 60Hz panel with more zones than 144Hz panel with fewer zones. Halo effect from zones is extremely annoying, more annoying than low contrast ratio. It also makes it difficult to calibrate the screen properly. Besides, most zoned TV's are VA TV's that these days have native static ~4500:1 contrast ratio without the zones. The halo effect is already annoying when you see pitch black levels transition to light black levels of a VA display, but IPS displays have very low native static contrast ratio, which means way more pronounced halo effect as pitch black levels transition to light gray levels w/ addition of IPS glow making halo effect very obvious. Now a while back Panasonic new IPS tech prototype TV with fewer or same number of zones did pull off an excellent image without much halo effect (although it depends on the content viewed). If this monitor uses this new type of IPS technology that has relatively good native static contrast ratio, then there is hope.
> 
> Personally, I would rather add another grand or two and buy a good OLED HDR 4K TV for $2500-3000 than spend more than $750 on any monitor simply because TV's use higher quality panels and hardware. I don't think I will upgrade any of my displays until OLED, 4K, HDR, and Rec. 2020 standards become mainstream standards. Until then this early adoption feels like alpha/beta-testing hardware. People who buy 4K TV's now will have to get newer TV's when Rec. 2020 becomes possible, although I think some Samsung model has already achieved it.


You make some good points. I seriously doubt this monitor uses one of the higher end ~2000:1 IPS panels that some TVs use.

Personally I can make use of that high refresh rate since I play many older games. Most of us will also keep the monitor for longer than our GPUs, which means it provides room to grow.

Also, you'll be waiting forever if you're waiting for Rec.2020 to become standard. I don't think any entertainment content is designed for that color space, and I know of no displays that cover more than 70% of Rec.2020 or even 100% DCI-P3.

I don't think the display technology is there for Rec.2020 yet. White OLED/PCOLED won't suffice I'm guessing.


----------



## MonarchX

Quote:


> Originally Posted by *boredgunner*
> 
> You mean to say most 2015 and newer AAA games, not most games. Even so, for most of them one would only have to lower a few rather insignificant settings. For GTA 5, performance crushing ENB's would have to be avoided.
> You say resolution is overrated, and then praise OGSSAA which effectively multiplies the rendering resolution.
> 
> But native resolution (in this case 4k) will always be better than an equivalent supersampled one (in this case 1080p with 4x SSAA). SSAA will get you similar levels of aliasing but not overall clarity. Which you admitted yourself.
> 
> Besides, only a minority of modern games are compatible with SSAA. Resolution scale sliders are getting a bit more common, but most of them are just simpler downsampling methods that aren't as effective as SGSSAA or OGSSAA.
> You make some good points. I seriously doubt this monitor uses one of the higher end ~2000:1 IPS panels that some TVs use.
> 
> Personally I can make use of that high refresh rate since I play many older games. Most of us will also keep the monitor for longer than our GPUs, which means it provides room to grow.
> 
> Also, you'll be waiting forever if you're waiting for Rec.2020 to become standard. I don't think any entertainment content is designed for that color space, and I know of no displays that cover more than 70% of Rec.2020 or even 100% DCI-P3.
> 
> I don't think the display technology is there for Rec.2020 yet. White OLED/PCOLED won't suffice I'm guessing.


About 1080p - my point was that most people dismiss it as a high-end resolution without taking downscaling (to achieve its full visual potential) into consideration. Comparing a native 1080p display showing a native 1080p image to a native 4K display showing a native 4K image would show a greater visual difference than if the 4K image were downscaled on the 1080p display.

As far as Rec. 2020 goes - we have no idea when displays will be able to handle it. I did not expect 4K and/or HDR and/or OLED to become so popular so fast. IMHO, display technologies are pushing ahead of adoption and we may see Rec. 2020 capable displays in 2017. Professional mastering studios surely have displays that can do Rec. 2020.


----------



## Asmodian

Quote:


> Originally Posted by *MonarchX*
> 
> As far as Rec. 2020 goes - we have no idea when displays will be able to handle it. I did not expect 4K and/or HDR and/or OLED to become so popular so fast. IMHO, display technologies are pushing ahead of adoption and we may see Rec. 2020 capable displays in 2017. Professional mastering studios surely have displays that can do Rec. 2020.


No, there are no displays in existence, for any price, that can display 100% Rec. 2020. Rec. 2020 is not meant to be a display gamut, it is an encoding color space that can encompass pretty much any possible real display. This allows encoding HDR and wide gamut content which can be mapped to any future display but there will always (at least in the foreseeable future) have to be a conversion to the display's native gamut when displaying Rec. 2020. No display can map Rec. 2020's 100% red, green, or blue, to its 100% red, green, or blue.
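A quick way to see Asmodian's point: the Rec. 2020 primaries' xy chromaticities fall outside the Rec. 709/sRGB triangle, so any 709-class display must gamut-map rather than reproduce them natively. A minimal point-in-triangle check, using the coordinates published in the two standards:

```python
# Chromaticity check: do the Rec. 2020 primaries fit inside the
# Rec. 709/sRGB gamut triangle? Coordinates are CIE xy values from the
# ITU-R BT.709 and BT.2020 specifications.

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

def sign(p, a, b):
    # signed area test: which side of edge a->b the point p lies on
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside(p, tri):
    """True if chromaticity p falls inside (or on) the triangle tri."""
    s = [sign(p, tri[i], tri[(i + 1) % 3]) for i in range(3)]
    return all(v >= 0 for v in s) or all(v <= 0 for v in s)

for name, p in zip("RGB", REC2020):
    print(name, inside(p, REC709))   # all False: every 2020 primary is outside 709
```

All three come back outside, while an ordinary white point like D65 (0.3127, 0.3290) sits comfortably inside, which is exactly why a conversion step is unavoidable when displaying Rec. 2020 content.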


----------



## ToTheSun!

Quote:


> Originally Posted by *boredgunner*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MonarchX*
> 
> How exactly are they planning to make an IPS screen have such awesome contrast ratio and HDR support? I assume they will also use zones like on high-end VA TV's?
> 
> EDIT: BOO! It will use only 384 zones to provide the uber contrast ratio, which means halo effect. You literally need over 3000+ zones to produce an image anywhere near as good as OLED.
> 
> I would not buy this and just wait for OLED, which is ramping out quite nicely. I'd also rather get a 4K TV than 27" 4K monitor...
> 
> 
> 
> 384 zones across 27" should be less offensive than 384 zones across 50" at least. Once a monitor like this (or a VA equivalent) is $1500 or less, I'll get it until there is an OLED monitor with similar features.
Click to expand...

The problem with comparing dimming zones between an IPS monitor and a large VA TV is that, while the latter will be less accurate per unit of area, blooming will be much more offensive on the former. While blooming is impossible to suppress, the difference in luminance is much less drastic on a VA TV.


----------



## MonarchX

Quote:


> Originally Posted by *Asmodian*
> 
> No, there are no displays in existence, for any price, that can display 100% Rec. 2020. Rec. 2020 is not meant to be a display gamut, it is an encoding color space that can encompass pretty much any possible real display. This allows encoding HDR and wide gamut content which can be mapped to any future display but there will always (at least in the foreseeable future) have to be a conversion to the display's native gamut when displaying Rec. 2020. No display can map Rec. 2020's 100% red, green, or blue, to its 100% red, green, or blue.


No display TODAY you mean, right?


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> The problem with comparing dimming zones between an IPS monitor and a large VA TV is that, while the latter will be less accurate per unit of area, blooming will be much more offensive on the former. While blooming is impossible to suppress, the difference in luminance is much less drastic on a VA TV.


Yes, I'm aware of that as well. There is a video showcasing this monitor running the UE4 Infiltrator demo; some blooming can be seen, but it's hard to determine precisely how bad it is from a video. I'll check it out at Microcenter when it's out.


----------



## MonarchX

Quote:


> Originally Posted by *boredgunner*
> 
> Yes I'm aware of that as well. There is a video showcasing this monitor running the UE4 Infiltrator demo, some blooming can be seen but it's hard to determine precisely how bad from a video. I'll check it out at Microcenter when it's out.


Link me up, Sco... BoredGunny!


----------



## boredgunner

This should be added to the OP.


----------



## Asmodian

Quote:


> Originally Posted by *MonarchX*
> 
> No display TODAY you mean, right?


Not really. If you look at the spec, you can see the corners of its gamut touch the edges of human color perception. This requires new lasers or some technology that generates very specific pure wavelengths at very high brightness. Much higher brightness than needed with a slightly smaller gamut. It is possible we will invent something that makes it seem attainable, but natively displaying Rec. 2020 with any of the current ways we generate color images is pure science fiction.


----------



## boredgunner

Quote:


> Originally Posted by *Asmodian*
> 
> Not really, if you look at the spec you can see the corners of its gamut touch the edges of human perception of color. This requires new lasers or some technology that generates very specific pure wavelengths at very high brightness. Much higher brightness than needed with a slightly smaller gamut. It is possible we will invent something that makes it seem attainable but natively displaying Rec. 2020 with any of the current ways we generate color images is pure science fiction.


I don't think it's even that important, to be honest. We need games that support greater color depth first and foremost to kill banding. True 12-bit, preferably, and we need something better than DisplayPort 1.4. As for color space/volume, 100% coverage of DCI-P3 should be plenty. And regarding display technology, all I really want before I die is true "QLED", I suppose.
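The banding argument comes down to the step size between adjacent code values, which halves with every added bit. A rough sketch, assuming simple linear coding (real HDR signals use the non-linear PQ curve, which distributes the steps differently):

```python
# Why more bits matter for banding: per-channel levels double with each
# bit, shrinking the step between adjacent codes. Linear coding assumed
# for simplicity; HDR transfer functions spread steps non-linearly.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step = 100.0 / (levels - 1)   # step size as % of full signal range
    print(f"{bits}-bit: {levels} levels, {step:.3f}% per step")
```

Going from 8-bit to 12-bit shrinks each step by a factor of ~16, which is what pushes the bands below the visible threshold.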


----------



## ryder

so essentially, at 27" and under, this is as good as it gets (to the human eye) in terms of gaming visuals (provided you have the horsepower to run it)?


----------



## boredgunner

Quote:


> Originally Posted by *ryder*
> 
> so essentially, at 27" and under, this is as good as it gets (to the human eye) in terms of gaming visuals (provided you have the horsepower to run it)?


As far as monitors go, this should be as good as it gets for gaming and watching movies/shows.


----------



## rvectors

For me, 4K at 27 inch is the sweet spot for density, head movement/viewing angles and icon size. I'm considering the 32-inch ASUS version when it comes out, but to me it looks comically large... but then, I'm still using my 10+ year old 17-inch Dell, which has no dead pixels, glow or BLB. I did try a number of new ones, but I don't need to go into my experiences, since most of us already know what we now have to put up with.

Like boredgunner says, I'm interested in a monitor that can last a few good years, one I'd be happy to run at a slightly lower refresh if needed, and that doesn't suffer the blight of the current PC monitor market. Compared to current IPS tech, this on paper looks worth the leap.


----------



## sblantipodi

Quote:


> Originally Posted by *rvectors*
> 
> For me, 4K at 27 inch is the sweet spot for density, head movement/viewing angles and icon size. I'm considering the 32-inch ASUS version when it comes out, but to me it looks comically large... but then, I'm still using my 10+ year old 17-inch Dell, which has no dead pixels, glow or BLB. I did try a number of new ones, but I don't need to go into my experiences, since most of us already know what we now have to put up with.
> 
> Like boredgunner says, I'm interested in a monitor that can last a few good years, one I'd be happy to run at slightly lower refresh if needed, and doesn't suffer the blight of the current pc monitor market. Compared to current IPS tech, this on paper looks worth the leap.


I agree, 32 inches is OK for TVs, not for PC monitors.
You can't comfortably sit at a desk in front of a 32-inch monitor, and even with a huge desk there is no reason to push the monitor so far back just to justify the 32 inches.


----------



## CallsignVega

30 inch would have been perfect IMO, but I can live with 27 inch. (Throws cash at screen).


----------



## Seyumi

I'm coming from a 40" 4K monitor. I agree that 40" is too large. On "regular" sized desks from, let's say, Office Max / Depot / IKEA, etc., you have to physically move your head to see everything on the screen. Kind of hard to play when you can't see important UI elements such as health, bullets, minimap, mph, etc. You put yourself at a pretty big disadvantage even in single-player games.

This monitor would have been perfect in a 32"~37" size but I guess you can never get everything you want. Just like the few people before said, this will pretty much be the best gaming monitor in existence at the current moment. I could have gone with a glossy or semi-glossy display as well but seems like that is more or less dead in the pc-monitor realm.


----------



## aberrero

Quote:


> Originally Posted by *Seyumi*
> 
> I'm coming from a 40" 4k monitor. I agree that 40" is too large. On "regular" sized desks from lets say office max / depot / ikea, etc. you have to physically move your head to see everything on the screen. Kind of hard to play when you can't see important UI elements such as health, bullets, mini map, mph, etc. You put yourself in a pretty big disadvantage even with single player games.
> 
> This monitor would have been perfect in a 32"~37" size but I guess you can never get everything you want. Just like the few people before said, this will pretty much be the best gaming monitor in existence at the current moment. I could have gone with a glossy or semi-glossy display as well but seems like that is more or less dead in the pc-monitor realm.


Cmon. It's not glossy? THE SEARCH CONTINUES.


----------



## Seyumi

Quote:


> Originally Posted by *aberrero*
> 
> Cmon. It's not glossy? THE SEARCH CONTINUES.


It's not. There's no such thing as a "gaming" glossy display. Especially not from Asus or Acer. I think I just need to accept that and move on. People wonder why Apple phones, laptops, tablets, and desktops are the most popular in the world. One major reason is their glossy and vibrant displays. Nothing beats the vibrancy from my old 27" Apple Cinema Display that I miss dearly. Guess everyone has to suffer a muddy & dull image because people are too lazy to control their room lighting or placement.

I'll be picking up one of these even though I'll be dropping from 40" to 27" and semi-glossy to matte. I'll be gaining 144hz from 60hz, g-sync, & hdr, and probably an overall better panel.


----------



## boredgunner

Quote:


> Originally Posted by *Seyumi*
> 
> It's not. There's no such thing as a "gaming" glossy display. Especially not from Asus or Acer. I think I just need to accept that and move on. People wonder why Apple phones, laptops, tablets, and desktops are the most popular in the world. One major reason is their glossy and vibrant displays. Nothing beats the vibrancy from my old 27" Apple Cinema Display that I miss dearly. Guess everyone has to suffer a muddy & dull image because people are too lazy to control their room lighting or placement.
> 
> I'll be picking up one of these even though I'll be dropping from 40" to 27" and semi-glossy to matte. I'll be gaining 144hz from 60hz, g-sync, & hdr, and probably an overall better panel.


It's silly. Give us a choice at least, like Overlord Computer did before they died off.


----------



## aberrero

I removed the anti glare layer from one of my old dell monitors and made it glossy, but I'm not sure I'm brave enough to take a razor to a $1500 panel.


----------



## KGPrime

I couldn't hate matte more as well, and I've only ever used glossy monitors (CRTs, basically) lol, but I am currently using an old Gateway LCD because it is glossy. However, the matte on the newer IPS panels like the PG279Q and the 1440p Predator is tolerable enough. I would prefer full glossy with an AR coating without a doubt, but if it has to be matte, let it be that matte.


----------



## Vipu

Someone should make some tool to make screens glossy easily, like delidding tools etc.
There would be market for it I feel like!


----------



## CallsignVega

A good semi-gloss is a nice compromise between full glossy, which hardly anyone uses anymore, and full matte. Full matte is a no-go.


----------



## atomicmew

lol @ the price. 4K IPS freesyncs were going for ~$300-400 on BF. Why the heck would you pay 300% more just for 144 Hz?


----------



## zealord

Quote:


> Originally Posted by *atomicmew*
> 
> lol @ the price. 4K IPS freesyncs were going for ~$300-400 on BF. Why the heck would you pay 300% more just for 144 Hz?


because G-Sync is not FreeSync
and HDR
oh and 144Hz, which is not something many people would describe with the word "just"


----------



## boredgunner

Quote:


> Originally Posted by *atomicmew*
> 
> lol @ the price. 4K IPS freesyncs were going for ~$300-400 on BF. Why the heck would you pay 300% more just for 144 Hz?


Did you even read about this monitor? Here is why it costs so much more:



144 Hz
G-SYNC (and probably ULMB too)
Full array local dimming with 384 dimming zones
HDR
Quantum dot
Wide gamut (DCI-P3 spec)
Plus price premium for being the first of its kind


----------



## CallsignVega

Going from 4K 60 Hz to 4K 144 Hz is basically worth its weight in gold. Let alone the amazing brightness/HDR and FALD.


----------



## xSociety

I would do very bad things to get this monitor.


----------



## naved777

But 4k at 27 inches.... Isn't it a bit small for 4k?


----------



## Oubadah

..


----------



## boredgunner

Quote:


> Originally Posted by *Oubadah*
> 
> No, different people have different priorities. Some people prefer quality (pixel density) over quantity (screen area).


This. Depends on personal preference and also sitting distance. I'd rather have 30-32" but whatever, not a massive difference.
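The sitting-distance point can be made concrete. This sketch uses the common (and debatable) 1-arcminute-per-pixel "retina" rule of thumb to estimate the distance beyond which individual pixels blend together, for the sizes being debated here:

```python
# PPI and visual acuity sketch for 4K at the panel sizes debated in this
# thread. The ~60 pixels-per-degree threshold is a rule of thumb, not a
# hard perceptual limit.
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch for a panel of given resolution and diagonal."""
    return math.hypot(w_px, h_px) / diag_in

def retina_distance_in(ppi_val):
    """Viewing distance (inches) at which one pixel subtends 1 arcminute."""
    return 1.0 / (ppi_val * math.tan(math.radians(1 / 60)))

for diag in (27, 32, 40):
    p = ppi(3840, 2160, diag)
    print(f'{diag}": {p:.0f} PPI, pixels blend beyond ~{retina_distance_in(p):.0f} in')
```

At 27" the panel is ~163 PPI, so pixels stop being individually resolvable from roughly arm's length, while the 40" panel needs noticeably more distance, which matches the head-movement complaints in this thread.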


----------



## KGPrime

Quote:


> Originally Posted by *Vipu*
> 
> Someone should make some tool to make screens glossy easily, like delidding tools etc.
> There would be market for it I feel like!


But seriously. Someone actually successfully did it on a Dell and posted it on this forum.


----------



## xSociety

Quote:


> Originally Posted by *naved777*
> 
> But 4k at 27 inches.... Isn't it a bit small for 4k?


I'd rather have a 27" than 32" TBH.

You need less AA, if any at all, and for more competitive games 32" is far too big.


----------



## BoredErica

> Originally Posted by *boredgunner*
> 
> 1440p is so 2015. It was always just a stepping stone until flagship GPUs could handle 4k. Good riddance, 1440p.


But they still can't, really. And there are still problems with scaling or monitor sizes changing to work with 4k.



> Originally Posted by *Oubadah*
> 
> No, different people have different priorities. Some people prefer quality (pixel density) over quantity (screen area).


Some people just want 27 inches and not move their head from where it normally is when they use their computah.

At this point I'm wondering if I should just wait until some HDR monitor comes along, hopefully with less dodgy QC.


----------



## bigboy678

Quote:


> Originally Posted by *boredgunner*
> 
>
> Did you even read about this monitor? Here is why it costs so much more:
> 
> 
> 
> 144 Hz
> G-SYNC (and probably ULMB too)
> Full array local dimming with 384 dimming zones
> HDR
> Quantum dot
> Wide gamut (DCI-P3 spec)
> Plus price premium for being the first of its kind


Besides the normal G-Sync premium, that full-array local backlight dimming is a good chunk of the cost (my bet). That's why most LED TVs are edge-lit except for the high-end models. Not cheap in the least to implement.


----------



## ToTheSun!

Quote:


> Originally Posted by *Seyumi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aberrero*
> 
> Cmon. It's not glossy? THE SEARCH CONTINUES.
> 
> 
> 
> It's not. There's no such thing as a "gaming" glossy display. Especially not from Asus or Acer. I think I just need to accept that and move on. People wonder why Apple phones, laptops, tablets, and desktops are the most popular in the world. One major reason is their glossy and vibrant displays. Nothing beats the vibrancy from my old 27" Apple Cinema Display that I miss dearly. Guess everyone has to suffer a muddy & dull image because people are too lazy to control their room lighting or placement.
Click to expand...

2017 was supposed to see high-end OLED TV's with HFR (which tend to sport moth eye coatings), but it seems even those have been delayed. Guess matte 4K 144 Hz IPS really is going to be the best gamers will be afforded this year...


----------



## mmms

Quote:


> Originally Posted by *ToTheSun!*
> 
> 2017 was supposed to see high-end OLED TV's with HFR (which tend to sport moth eye coatings), but it seems even those have been delayed. Guess matte 4K 144 Hz IPS really is going to be the best gamers will be afforded this year...


Not just this year. I think this Asus PG27UQ, with 27'', 4K, IPS, HDR, quantum dot, 144Hz, full-array local dimming with 384 zones, wide gamut (DCI-P3 spec) and G-Sync, will last for a long period of time.

I'm eager to see this monitor and the Acer XB272-HDR. I think this monitor will beat any 4K monitor with a VA panel, and with IPS + HDR + quantum dot it will approach the colors of OLED panels.


----------



## sblantipodi

Quote:


> Originally Posted by *mmms*
> 
> Not this year only , I think this monitor Asus PG27UQ with 27'' , 4K , IPS , HDR , Quantum Dot , 144hz , Full array local dimming with 384 dimming zones , Wide gamut (DCI-P3 spec) and G-sync
> will last for a long period of time .
> 
> I'm eager to see this monitor and Acer XB272-hdr . I think this monitor will defeat any 4k monitor with VA panel . I think this monitor with IPS + HDR + Quantum Dot will approach of the colors for oled panels .


This monitor has amazing specs, but three things worry me.
1) The price: 2000€ is simply too much for a 27-inch gaming monitor that does not have professional features like a 3D LUT and hardware calibration.

2) Quality control: Asus is well known for bad quality control. You can get a good monitor, or you can get a bad monitor with backlight bleed and other problems.

3) Active cooling: the rear openings clearly show that the monitor is actively cooled. My gaming PC is very noisy while gaming and nearly silent during normal operation; a monitor does not work that way, so if it is noisy, it is noisy all the time.


----------



## Oubadah

..


----------



## pez

Quote:


> Originally Posted by *MonarchX*
> 
> About 1080p - my point was that most people dismiss it as a high-end resolution without taking downscaling (to achieve its full visual potential) into consideration. Comparing a native 1080p display showing a native 1080p image to a native 4K display showing a native 4K image would show a greater visual difference than if the 4K image were downscaled on the 1080p display.
> 
> As far as Rec. 2020 goes - we have no idea when displays will be able to handle it. I did not expect 4K and/or HDR and/or OLED to become so popular so fast. IMHO, display technologies are pushing ahead of adoption and we may see Rec. 2020 capable displays in 2017. Professional mastering studios surely have displays that can do Rec. 2020.


The problem is 1080p is a very CPU-bound resolution. Honestly, your 3770K will hold your 1080 G1 back before it's the other way around. Especially on GTA V. That's why I don't think 1080p is a high-end resolution. I don't think any CPU-bound resolution should be considered high end, honestly lol.


----------



## boredgunner

Quote:


> Originally Posted by *pez*
> 
> The problem is 1080p is a very CPU-bound resolution. Honestly, your 3770K will hold your 1080 G1 back before it's the other way around. Especially on GTA V. That's why I don't think 1080p is a high-end resolution. I don't think any CPU-bound resolution should be considered high end, honestly lol.


All that downsampling/supersampling will go straight to the GPU though, but I agree that 1080p is no longer an "enthusiast resolution" for gaming.


----------



## tconroy135

Quote:


> Originally Posted by *boredgunner*
> 
> All that downsampling/supersampling will go straight to the GPU though, but I agree that 1080p is no longer an "enthusiast resolution" for gaming.


1080P is too low these days, I agree, but I still think, even with high-end cards, the settings have to be turned down too much to get good framerates at 4k. I'll be sticking to 1440P until Volta, when I hope there will be a better display technology used for 4k G-Sync.


----------



## Kalimera

Samsung is going to release a 27" 1440p/144Hz VA+QD HDR monitor in 2017 at a probably much cheaper price point.


----------



## tconroy135

Quote:


> Originally Posted by *Kalimera*
> 
> Samsung is going to release a 27" 1440p/144Hz VA+QD HDR monitor in 2017 at a probably much cheaper price point.


I doubt it will have g-sync


----------



## MonarchX

Quote:


> Originally Posted by *tconroy135*
> 
> 1080P is too low these days, I agree, but I still think, even with high-end cards, the settings have to be turned down too much to get good framerates at 4k. I'll be sticking to 1440P until Volta, when I hope there will be a better display technology used for 4k G-Sync.


I do not want to start another flame war, but I would rather stick with a 1080p monitor with good contrast than an 8K IPS monitor with poor contrast, even if I could run games at 8K at 60fps.

Also, G-Sync does not make up for excellent motion clarity at 120fps+, which is a lot easier to achieve at 1080p with a GTX 1080 Ti than at higher resolutions.

Besides, there are games that run sub-60fps on my rig even when I tweak them right, like modified GTA V with ENB MSAA, or even The Witcher when you set cascade shadows to extra high distance, and of course Fallout 4. Oh, let us not forget Watch Dogs 2 (with TXAA 4x).

My hands are itching as I am thinking of how Titanfall 2 and BF1 would run once I get it. I will not get a consistent 120fps, but with an OC I should be pulling off 100fps+.


----------



## tconroy135

Quote:


> Originally Posted by *MonarchX*
> 
> I do not want to start another flame war, but i would rather stick with 1080p monitor with good contrast than 8K IPS monitor with poor contrast even if I could run games at 8K at 60fps.
> 
> Also, G-Sync does not make up for excellent motion clarity at 120fps+, which is a lot easier to achieve at 1080p with GTX 1080 Ti than at higher resolutions.
> 
> Besides, there are games that run sub-60fps on my rig if I tweak them right, like modified GTA V with ENB MSAA or even Witcher when you set cascade shadows to extra high distance, and of course Fallout 4. Oh, let us not forget Watch Dogs 2 (with TXAA 4x).
> 
> My hands are itching as I am thinking of how Titanfall 2 and BF1 would run once I get it. I will not get consistent 120fps, but with OC I should be pulling off 100fps+.


^ is a good argument for ultra-high frame rate gaming. I still think a 1440p monitor is the right choice. You can always render at 1080p and have your monitor upscale the image. I think there are plenty of games that need no more than 50 fps to look smooth, and that has a lot of benefits. Also, 4K will be great at 27" once big Volta is released; AA will be redundant, which means game engines won't have to worry about designing for AA.


----------



## MonarchX

Are you kidding me? No flat panel should be run below its native resolution. IMHO a 1080p image on a 1080p panel looks significantly better than the same image on a 1440p panel set to 1080p. It's different when downscaling from a higher resolution to a lower native one, but you're suggesting doing what Quantum Break did: it ran at 720p with MSAA and some FXAA and looked awful, worse than it did at 1080p with no AA, which became possible once the Steam version was released.

It's a little different when it comes to movies. Color experts firmly state that at a proper viewing distance, people have a hard time telling a TV set to 720p from one set to 1080p. I guess most PC gamers sit really close to their displays to see individual pixels. I sit at a distance at which I do not see individual pixels, and at that distance a 30" 4K Dell display I tried once gave me a headache because I could barely read text, even though with my contacts I have 20/15 vision. I had to sit closer, and that created a second problem: I could only concentrate on the central part of the screen and was forced to move my head while playing games to catch what was happening on the left and right sides of the screen. Normally, my central vision encompasses the entire 24" of my 1080p display at the distance I prefer.
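
For what it's worth, the viewing-distance part of this can be sanity-checked with simple arithmetic: an eye with 20/20 acuity resolves roughly one arcminute, so individual pixels stop being visible once the pixel pitch subtends less than that. A rough sketch (the 1-arcminute figure is a rule of thumb, and the helper functions are mine, not from any standard):

```python
import math

def pixel_pitch_in(size_in, width_px, height_px):
    """Physical pixel pitch in inches, from panel diagonal and resolution."""
    diag_px = math.hypot(width_px, height_px)
    return size_in / diag_px

def min_distance_in(size_in, width_px, height_px, acuity_arcmin=1.0):
    """Distance (inches) beyond which a 1-arcminute eye no longer resolves single pixels."""
    pitch = pixel_pitch_in(size_in, width_px, height_px)
    theta = math.radians(acuity_arcmin / 60.0)
    return pitch / math.tan(theta)

print(round(min_distance_in(24, 1920, 1080)))  # 24" 1080p: ~37 in
print(round(min_distance_in(27, 3840, 2160)))  # 27" 4K: ~21 in
```

So a 27" 4K panel hides its pixels at roughly half the distance a 24" 1080p panel needs, which lines up with the sitting-distance trade-off described above.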


----------



## DzillaXx

Quote:


> Originally Posted by *MonarchX*
> 
> I do not want to start another flame war, but i would rather stick with 1080p monitor with good contrast than 8K IPS monitor with poor contrast even if I could run games at 8K at 60fps.
> 
> Also, G-Sync does not make up for excellent motion clarity at 120fps+, which is a lot easier to achieve at 1080p with GTX 1080 Ti than at higher resolutions.
> 
> Besides, there are games that run sub-60fps on my rig if I tweak them right, like modified GTA V with ENB MSAA or even Witcher when you set cascade shadows to extra high distance, and of course Fallout 4. Oh, let us not forget Watch Dogs 2 (with TXAA 4x).
> 
> My hands are itching as I am thinking of how Titanfall 2 and BF1 would run once I get it. I will not get consistent 120fps, but with OC I should be pulling off 100fps+.


I went from a 120hz 1080p higher end TN panel to a 1440p 96hz panel.

Sure I lost out on some smoothness, but OMG. Desktop area alone was worth the switch.

1080p is so limiting. Not as bad as 720p on a laptop screen, but getting pretty close. Though gaming isn't about desktop space, so I understand. I just don't get why people wouldn't go for 1440p simply for the bump in desktop space alone. Not only that, but the PPI of a 24" 1080p monitor is pretty mediocre; I don't care how good your panel is.

1080p for gaming only, I do understand, especially if it's for CS:GO at 144Hz+, but that is where I'm drawing the line. There are a number of really nice looking 1080p panels, but they were not 144Hz+ monitors. Most of those gaming monitors had poor colors in favor of high Hz. The better ones had VA panels to make up the contrast, but still had meh colors compared to even a budget 1440p monitor. 1440p monitors tend to be made from higher quality panels, so 1440p doesn't suffer the way 1080p did from the flood of crappy TN panels.

To me this monitor is cool but not quite ready yet. Personally I don't really feel like depending on Windows scaling to make my monitor usable, and there is no way in hell you are using a 27" 4K monitor without scaling the desktop. Second, I believe ultrawide is the future of gaming, and 3440x1440 is the way to go right now if you are looking for a new monitor, not 4K.

Bring on the 3440x1440 144hz HDR Asus!


----------



## Boost240

I just bought a used 144Hz G-Sync ULMB monitor. I cannot do low framerates anymore, hence why I'll be upgrading to the 1080 Ti. Overkill for 1080p? Maybe. But I love knowing I'll be able to run everything on high/ultra at 100fps+. Framerate is king to me, in every game.


----------



## tconroy135

Quote:


> Originally Posted by *DzillaXx*
> 
> I went from a 120hz 1080p higher end TN panel to a 1440p 96hz panel.
> 
> Sure I lost out on some smoothness, but OMG. Desktop area alone was worth the switch.
> 
> 1080p is so limiting. Not as bad as 720p on a laptop screen, but getting pretty close. Though gaming isn't about desktop space, so I understand. I just don't get why people wouldn't go for 1440p simply for the bump in desktop space alone. Not only that but the ppi on a 24" 1080p monitor is pretty mediocre. I don't care how good your panel is.
> 
> 1080p for gaming only, I do understand. Especially if this is for CS:GO @144hz+, but that is where I'm drawing the line. There are a number of really nice looking 1080p panels, but they were not 144hz+ monitors. Most of those gaming monitors had poor colors in favor of high hz. The better ones had VA panels to make up for the contrast, but still had meh colors compared to even a budget 1440p monitor. 1440p monitors tend to be made of higher quality panels, so it doesn't suffer as much as 1080p did with the large amount of crappy TN panels made for it.
> 
> To me this monitor is cool and kinda not ready yet. Personally I don't really feel like depending on scaling on windows to make my monitor usable. As there is no way in hell you are using a 27" 4K monitor without scaling the desktop. Second I believe Ultra Wide is the future of gaming, and 3440x1440 is the way to go right now if you are looking for a new monitor. Not 4K
> 
> Bring on the 3440x1440 144hz HDR Asus!


I disagree on the ultra-wide. It's great for movies, but there is a significant portion of the gaming community unwilling to move their heads while gaming. I would love an Ultra Wide monitor for everything other than gaming, but for gaming I want to stare straight ahead.


----------



## drfouad

Does anyone know when that Asus monitor is coming out?


----------



## DzillaXx

Quote:


> Originally Posted by *tconroy135*
> 
> I disagree on the ultra-wide. It's great for movies, but there is a significant portion of the gaming community unwilling to move their heads while gaming. I would love an Ultra Wide monitor for everything other than gaming, but for gaming I want to stare straight ahead.


You don't need to turn your head. You simply get a screen that covers your entire field of view. Honestly, until you try it, you just don't know what you are missing.

Your field of vision is wider than it is tall, and 21:9 capitalizes on that. Honestly it is the best way to play a game, and on top of that, you gain FoV in any game that properly supports the aspect ratio.

I have a good 2ft between me and my monitor; I don't need to move my head at all. I was using a 27" 1440p monitor before this.

Still, 3440x1440 seems to be the new big thing these days, and for good reason: it is awesome. Playing Overwatch has never been better. It kills my old 27" 1440p monitor.


----------



## tconroy135

Quote:


> Originally Posted by *DzillaXx*
> 
> You don't need to turn your head. You simply get screen that covers your entire eye sight. Honestly until you try it, you just don't know what you are missing.
> 
> Your field of vision is wider than it is tall, and 21:9 capitalizes on that. Honestly it is the best way to play a game, and on top of that, you gain FoV on any game that really supports the aspect ratio.
> 
> I have a good 2ft between me and my monitor, I don't need to move my head at all. I was using a 27" 1440p monitor before this.
> 
> Still 3440x1440 seems to be the new big thing these days, and for a good reason. It is awesome. Playing overwatch has never been better. Kills my old 27" 1440p monitor.


You can say that, I suppose, but in purely biological terms your brain is making up most of your peripheral vision based on what it thinks is in your periphery. Beyond your normal visual range, most of what you are seeing is your brain's best guess at what is actually there.


----------



## DzillaXx

Quote:


> Originally Posted by *tconroy135*
> 
> You can say that, I suppose, but in purely biological terms your brain is making up most of your peripheral vision based on what it thinks is in your periphery. Beyond your normal visual range, most of what you are seeing is your brain's best guess at what is actually there.


The effect a wide aspect ratio has on peripheral vision does cool things to your brain. Something I never got from my 27" 1440p.

Which is why HDR can't come soon enough to ultrawide. I honestly would probably sell this monitor for one with HDR and 144Hz vs. my 100Hz.


----------



## MonarchX

Quote:


> Originally Posted by *DzillaXx*
> 
> I went from a 120hz 1080p higher end TN panel to a 1440p 96hz panel.
> 
> Sure I lost out on some smoothness, but OMG. Desktop area alone was worth the switch.
> 
> 1080p is so limiting. Not as bad as 720p on a laptop screen, but getting pretty close. Though gaming isn't about desktop space, so I understand. I just don't get why people wouldn't go for 1440p simply for the bump in desktop space alone. Not only that but the ppi on a 24" 1080p monitor is pretty mediocre. I don't care how good your panel is.
> 
> 1080p for gaming only, I do understand. Especially if this is for CS:GO @144hz+, but that is where I'm drawing the line. There are a number of really nice looking 1080p panels, but they were not 144hz+ monitors. Most of those gaming monitors had poor colors in favor of high hz. The better ones had VA panels to make up for the contrast, but still had meh colors compared to even a budget 1440p monitor. 1440p monitors tend to be made of higher quality panels, so it doesn't suffer as much as 1080p did with the large amount of crappy TN panels made for it.
> 
> To me this monitor is cool and kinda not ready yet. Personally I don't really feel like depending on scaling on windows to make my monitor usable. As there is no way in hell you are using a 27" 4K monitor without scaling the desktop. Second I believe Ultra Wide is the future of gaming, and 3440x1440 is the way to go right now if you are looking for a new monitor. Not 4K
> 
> Bring on the 3440x1440 144hz HDR Asus!


Yeah, I am just lucky to have the one and only 1080p monitor to date with a static 4700:1 contrast ratio and 120Hz light-strobing (like ULMB). These days you can easily perform a full-scale calibration (like I did), even on a monitor without any color calibration controls, by using a good quality colorimeter like the i1Display Pro, DisplayCAL with ArgyllCMS, and ReShade TuningPalette 3DLUTs, which go far beyond what any ICC profile can deliver, although only DirectX 9, 10, 11, and OpenGL are supported for now. All in all I get awesome contrast with rich inky blacks, clean and clear motion at 120Hz with light-strobing, and the color accuracy of a well-calibrated IPS panel. When you also consider today's new AA types like TAA, which can eradicate 99 percent of all aliasing (and if blurry, it can be sharpened via in-game settings or ReShade LumaSharpen/Adaptive-Sharpen), as well as OGSSAA, I would say it all more than makes up for the lack of higher resolution, especially on a glossy screen!
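
For anyone wondering what those 3DLUTs actually do: a 3D LUT is just a dense RGB-to-RGB lookup grid, and applying it means trilinearly interpolating between the nearest grid points; the calibration software bakes all the corrections into the grid. A minimal illustrative sketch (not ReShade's actual code; the function names here are made up):

```python
def identity_lut(n):
    """n x n x n 3D LUT that maps every RGB grid point to itself (values 0..1)."""
    return [[[(r / (n - 1), g / (n - 1), b / (n - 1))
              for b in range(n)] for g in range(n)] for r in range(n)]

def apply_lut(lut, rgb):
    """Trilinear lookup of one (r, g, b) triple (components 0..1) in a 3D LUT."""
    n = len(lut)

    def split(v):
        # Scale into grid units, clamp, and split into cell index + fraction.
        x = min(max(v, 0.0), 1.0) * (n - 1)
        i = min(int(x), n - 2)
        return i, x - i

    (ri, rf), (gi, gf), (bi, bf) = split(rgb[0]), split(rgb[1]), split(rgb[2])
    out = [0.0, 0.0, 0.0]
    # Blend the 8 surrounding grid points, weighted by distance along each axis.
    for dr, wr in ((0, 1 - rf), (1, rf)):
        for dg, wg in ((0, 1 - gf), (1, gf)):
            for db, wb in ((0, 1 - bf), (1, bf)):
                w = wr * wg * wb
                sample = lut[ri + dr][gi + dg][bi + db]
                for c in range(3):
                    out[c] += w * sample[c]
    return tuple(out)

# An identity LUT leaves colors unchanged; a calibrated LUT would not.
print(apply_lut(identity_lut(17), (0.25, 0.5, 0.75)))  # (0.25, 0.5, 0.75)
```

A real calibration replaces the identity grid with measured corrections, which is why a 3D LUT can fix hue and saturation errors an ICC gamma curve cannot.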


----------



## boredgunner

Quote:


> Originally Posted by *MonarchX*
> 
> Yeah, I am just lucky to have the one and only 1080p monitor to date with a static 4700:1 contrast ratio and 120Hz light-strobing (like ULMB). These days you can easily perform a full-scale calibration (like I did), even on a monitor without any color calibration controls, by using a good quality colorimeter like the i1Display Pro, DisplayCAL with ArgyllCMS, and ReShade TuningPalette 3DLUTs, which go far beyond what any ICC profile can deliver, although only DirectX 9, 10, 11, and OpenGL are supported for now. All in all I get awesome contrast with rich inky blacks, clean and clear motion at 120Hz with light-strobing, and the color accuracy of a well-calibrated IPS panel. When you also consider today's new AA types like TAA, which can eradicate 99 percent of all aliasing (and if blurry, it can be sharpened via in-game settings or ReShade LumaSharpen/Adaptive-Sharpen), as well as OGSSAA, I would say it all more than makes up for the lack of higher resolution, especially on a glossy screen!


The only thing is, TAA isn't enough to eradicate aliasing at 1080p (not even UE4 TAA which is by far the best I've seen), and not enough games have downsampling or SSAA. But I guess that's where GeDoSaTo and maybe VSR come in (DSR was limited to 60 Hz last time I checked).


----------



## MonarchX

Yeah, but TAA still does an incredibly good job and sometimes TAA + FXAA is used to further eradicate aliasing. DSR is not limited to 60Hz...


----------



## WorldExclusive

Has to be 32" minimum, semi-glossy.
I prefer 37"+ and glossy.

Matte is a no buy.


----------



## boredgunner

Quote:


> Originally Posted by *MonarchX*
> 
> Yeah, but TAA still does an incredibly good job and sometimes TAA + FXAA is used to further eradicate aliasing. DSR is not limited to 60Hz...


DSR being limited to 60 Hz is a game specific thing actually. Many games won't run at more than 60 Hz when using a DSR resolution.

FXAA is pretty much worthless; it'll just add some blur on top of TAA. The only time I've found TAA (and any additional shader AA, I guess) to be sufficient is in UE4 games at 1440p or larger. Here is Shadow Warrior 2 at 1080p with TAA, for instance:

[screenshot]

That is an average TAA implementation, and the image is disturbingly aliased. Thankfully that game has OGSSAA. I actually played it on an Eizo Foris FG2421.


----------



## pez

Quote:


> Originally Posted by *boredgunner*
> 
> All that downsampling/supersampling will go straight to the GPU though, but I agree that 1080p is no longer an "enthusiast resolution" for gaming.


Oh no doubt.
Quote:


> Originally Posted by *tconroy135*
> 
> I disagree on the ultra-wide. It's great for movies, but there is a significant portion of the gaming community unwilling to move their heads while gaming. I would love an Ultra Wide monitor for everything other than gaming, but for gaming I want to stare straight ahead.


Just like positioning speakers correctly: if you set up an ultrawide properly, you don't have to turn your head... so I'm not sure what that's about.
Quote:


> Originally Posted by *DzillaXx*
> 
> You don't need to turn your head. You simply get screen that covers your entire eye sight. Honestly until you try it, you just don't know what you are missing.
> 
> Your field of vision is wider than it is tall, and 21:9 capitalizes on that. Honestly it is the best way to play a game, and on top of that, you gain FoV on any game that really supports the aspect ratio.
> 
> I have a good 2ft between me and my monitor, I don't need to move my head at all. I was using a 27" 1440p monitor before this.
> 
> Still 3440x1440 seems to be the new big thing these days, and for a good reason. It is awesome. Playing overwatch has never been better. Kills my old 27" 1440p monitor.


I have to agree with this....though I can't stand to play Overwatch on 21:9. That vertical FOV cutoff is truly atrocious.


----------



## rvectors

Quote:


> Originally Posted by *drfouad*
> 
> Does anyone know when that asus monitor coming out?


Nobody is really sure. Q3 has been mooted, but the problem is there are no real pictures/videos of the Acer version, and the ASUS one might get a design change regarding the G-SYNC active cooling... so sadly this may be late Q3, Q4, or even 2018. I hope not, as I'm really fed up with my current monitor. I've started looking for a good quality, cheaper (or so I thought) interim monitor on eBay that would be OK for now and then serve as a second monitor, but people pay crazy prices here in the UK. Who pays £250 for a three-year-old 24 inch with 5 dead pixels that was £400 new? Or nearly £260 for a monitor whose primary function is impaired by a bright vertical line from top to bottom? Admittedly that was an expensive 32 inch monitor before the fault, but it shows how desperate we are to pay below the usual fleecing amounts, even with poor QC.

I'm watching the launch of the ASUS PA32U; the release date is meant to be late Q1 or early Q2, and it has a similar panel technology but no high refresh rate or G-SYNC. They are different monitors, but if it's released on time, it would at least give me hope that the other might be as timely.


----------



## sblantipodi

Quote:


> Originally Posted by *rvectors*
> 
> Nobody is really sure. Q3 has been mooted, but the problem is there are no real pictures/videos of the Acer version, and the ASUS one might get a design change regarding the G-SYNC active cooling... so sadly this may be late Q3, Q4, or even 2018. I hope not, as I'm really fed up with my current monitor. I've started looking for a good quality, cheaper (or so I thought) interim monitor on eBay that would be OK for now and then serve as a second monitor, but people pay crazy prices here in the UK. Who pays £250 for a three-year-old 24 inch with 5 dead pixels that was £400 new? Or nearly £260 for a monitor whose primary function is impaired by a bright vertical line from top to bottom? Admittedly that was an expensive 32 inch monitor before the fault, but it shows how desperate we are to pay below the usual fleecing amounts, even with poor QC.
> 
> I'm watching the launch of the ASUS PA32U; the release date is meant to be late Q1 or early Q2, and it has a similar panel technology but no high refresh rate or G-SYNC. They are different monitors, but if it's released on time, it would at least give me hope that the other might be as timely.


active cooling could be a real deal breaker for me


----------



## mmms

Quote:


> Originally Posted by *Kalimera*
> 
> Samsung is going to release a 27" 1440p/144Hz VA+QD HDR monitor in 2017 at a probably much cheaper price point.


Are you sure these Samsung VA + Quantum Dot monitors will come with HDR and G-Sync?
Quote:


> Originally Posted by *sblantipodi*
> 
> This monitor has amazing specs but there are three things that worry me.
> 1) The price: €2,000 is simply too much for a 27 inch gaming monitor that does not have professional features like a 3D LUT and hardware calibration.
> 
> 2) Quality control: Asus is well known for bad quality control. You can get a good monitor, or you can get a bad monitor with backlight bleed and other problems.
> 
> 3) Active cooling: the rear openings show clearly that the monitor is actively cooled. I have a gaming PC; it is very noisy while gaming and nearly silent during normal operation. A monitor does not work that way: if it is noisy, it is noisy all the time.


How is it possible to take advantage of professional features like a 3D LUT and hardware calibration on a 27'' 4K gaming monitor?
Frankly, I don't know what the use of these features is.
Quote:


> Originally Posted by *sblantipodi*
> 
> active cooling could be a real deal breaker for me


Could you explain active cooling further? What is its purpose in this 4K gaming monitor?


----------



## sblantipodi

Quote:


> Originally Posted by *mmms*
> 
> How is it possible to take advantage of professional features like a 3D LUT and hardware calibration on a 27'' 4K gaming monitor?
> Frankly, I don't know what the use of these features is.


I'm a gamer, but I love amateur photography and personally I don't like uncalibrated monitors; a 3D LUT helps reduce color-space loss once the monitor is calibrated.

Hardware calibration is a must even for gamers.
If the monitor supports hardware calibration, you can load the color profile (the ICC file produced by software after calibrating with a colorimeter) into the monitor rather than into the operating system.
This means your monitor can be color-managed even in color-unmanaged software like games.
These are the features I would like to see on a monitor to justify a €2,000 cost.
Quote:


> Could you explain active cooling further? What is its purpose in this 4K gaming monitor?


Active cooling means the monitor uses fans to cool down the processors inside it; the G-Sync chip is one of those.
Those fans are generally always on, and they are small; small means loud.
I hope they will remove the fans and use a better heatsink instead (passive cooling).


----------



## mmms

Actually, two questions:

1) What is the Gsync range? (Hope 30-144hz)

2) What is the percentage of DCI-P3? (Hope +90%)


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Actually, two questions:
> 
> 1) What is the Gsync range? (Hope 30-144hz)
> 
> 2) What is the percentage of DCI-P3? (Hope +90%)


Honestly, neither will be answered until it's reviewed. They have not spoken about either of those things in that kind of detail. G-SYNC range never has to be questioned though, unlike FreeSync, since it's always enormous, always goes to max refresh rate, and always includes frame multiplication when going below the refresh rate range (effectively making it somewhere around 10 FPS to 144 FPS).
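
The frame-multiplication behavior is easy to model: when the game's frame rate drops below the panel's minimum variable refresh, the module scans each frame out multiple times so the physical refresh stays inside the panel's valid range. A toy sketch (the real module logic is proprietary; this function and its limits are purely illustrative):

```python
def refresh_for_fps(fps, vrr_min=30, vrr_max=144):
    """Return (panel_refresh_hz, scanouts_per_frame) for a given frame rate,
    repeating frames as needed to keep the refresh inside [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return min(fps, vrr_max), 1  # frame rate is already in range
    mult = 1
    while fps * mult < vrr_min and fps * (mult + 1) <= vrr_max:
        mult += 1
    return fps * mult, mult

print(refresh_for_fps(100))  # (100, 1)
print(refresh_for_fps(20))   # (40, 2): each frame scanned out twice
print(refresh_for_fps(10))   # (30, 3): each frame scanned out three times
```

This is why the effective range feels like "10 FPS to max refresh" to the user even though the panel itself never refreshes below its hardware minimum.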


----------



## Sedolf

It should be using the same quantum-dot backlight as the PA32U, I presume?
So that would be 95% DCI-P3, 99.5% Adobe RGB, and 85% Rec.2020, then.


----------



## tconroy135

What are people's opinions of HDR, how much of a benefit is it in reality?


----------



## DrFreeman35

Quote:


> Originally Posted by *tconroy135*
> 
> What are people's opinions of HDR, how much of a benefit is it in reality?


I am curious about this as well. The only thing I can contribute is HDR on the PS4 Pro & Xbox One S on a 70" TV... I can tell a difference, but is it worth it? Still undecided, as I think it really relies on developers utilizing it.


----------



## st0necold

I have no idea why they keep saying this crap will be released... every year-- and it never comes out. I'm not upgrading my 980ti's until this thing comes out-- no sense when 980ti sli can max out 1440p on ultra in any god damn game.

Current monitor review: 3/10


----------



## boredgunner

Quote:


> Originally Posted by *tconroy135*
> 
> What are people's opinions of HDR, how much of a benefit is it in reality?


It really needs either OLED or FALD with a lot of dimming zones to be worthwhile. I've only tried it on the Samsung JS8500 with its local dimming enabled (edge-lit). It's worthless on that TV; it usually dims too much and the haloing is intolerable at times, although HDR content does look more colorful and rich thanks to the wider color space.

On this monitor, I am guessing it will be worth using due to the 384 dimming zones.
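
For reference, 384 zones on a 16:9 panel works out to roughly a 24x16 grid, with each zone's backlight driven by the brightest content it covers. A naive toy model of local dimming (real controllers apply much smarter spatial and temporal filtering than a plain per-zone max):

```python
def zone_backlight(frame, zones_x=24, zones_y=16):
    """frame: 2D list of pixel luminances (0..1). Returns a zones_y x zones_x
    grid of backlight levels, each set to the brightest pixel in its zone."""
    h, w = len(frame), len(frame[0])
    levels = [[0.0] * zones_x for _ in range(zones_y)]
    for y in range(h):
        for x in range(w):
            zy = min(y * zones_y // h, zones_y - 1)
            zx = min(x * zones_x // w, zones_x - 1)
            levels[zy][zx] = max(levels[zy][zx], frame[y][x])
    return levels

# A mostly black frame with one bright pixel: only that pixel's zone lights up,
# which is exactly where the halo around bright objects comes from.
frame = [[0.0] * 96 for _ in range(64)]
frame[0][0] = 1.0
levels = zone_backlight(frame)
print(levels[0][0], levels[5][5])  # 1.0 0.0
```

More zones means each lit zone covers fewer dark pixels, so the halo shrinks; with only a handful of edge-lit zones, as on many TVs, it smears across much of the screen.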


----------



## Vipu

Quote:


> Originally Posted by *tconroy135*
> 
> What are people's opinions of HDR, how much of a benefit is it in reality?


Well, it gives better contrast and better color depth.
So I guess you can set your Windows colors one step lower to see the difference.


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> On this monitor, I am guessing it will be worth using due to the 384 dimming zones.


I agree with you that this monitor is future-proof compared to all current and upcoming gaming monitors, but $2,000 is a very high price.
I think $1,500 or $1,600 would be a reasonable price for most gamers.

I'm surprised that they understand the importance of HDR for the best picture and realistic colors, yet priced it so high.
They know that many games will support HDR, and those games will look nicer on HDR gaming monitors than on current monitors that don't support it.
So, after all these facts about the future and the need for HDR in gaming monitors, why set such a high price for these HDR gaming monitors?

That is the reason many players are considering not buying it.


----------



## tconroy135

Quote:


> Originally Posted by *mmms*
> 
> I agree with you that this monitor is future-proof compared to all current and upcoming gaming monitors, but $2,000 is a very high price.
> I think $1,500 or $1,600 would be a reasonable price for most gamers.
> 
> I'm surprised that they understand the importance of HDR for the best picture and realistic colors, yet priced it so high.
> They know that many games will support HDR, and those games will look nicer on HDR gaming monitors than on current monitors that don't support it.
> So, after all these facts about the future and the need for HDR in gaming monitors, why set such a high price for these HDR gaming monitors?
> 
> That is the reason many players are considering not buying it.


I can't see buying this monitor; they need to use a new display technology. IPS is old and done.


----------



## boredgunner

Quote:


> Originally Posted by *tconroy135*
> 
> I can't see buying this monitor; they need to use a new display technology. IPS LCD is old and done.


Fixed that for you!


----------



## mmms

The main point here is that if this monitor and the Acer XB272 HDR already cost nearly $2,000, things will be difficult for many of the players who are waiting for this screen.

So what is the best way, with current gaming monitors (IPS, TN, or VA), to get the best picture and colors, as close as possible to HDR, as a temporary solution until prices for these HDR gaming monitors fall and they become accessible to everyone within the next three years?


----------



## ozlay

Quote:


> Originally Posted by *Yvese*
> 
> All I gotta say is if you pay anywhere near $1k for this you're crazy..
> 
> I bought my 65KS8000 for $1079. To pay near that price for a 27" monitor is madness. $600 is where I would draw the line, and even then that would be for 32" not 27.


And 65'' is too large for 4K... a 65'' should be 8K at minimum.
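
The "too large for 4K" claim is really a pixels-per-inch argument, which is easy to check: PPI is the diagonal pixel count divided by the diagonal size. A quick sketch:

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixels per inch from the panel diagonal (inches) and resolution."""
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(27, 3840, 2160)))  # 163: this monitor
print(round(ppi(65, 3840, 2160)))  # 68: a 65" 4K TV
print(round(ppi(65, 7680, 4320)))  # 136: the same 65" panel at 8K
```

At 68 PPI, a 65" 4K TV has well under half the pixel density of this 27" panel, which is the point being made; 8K roughly restores it.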


----------



## MonarchX

Hmm... maybe 384-zone local dimming isn't so bad. You can see some halo effect in this video [video], but it is in slow motion. In fast motion, you can barely notice any halo [video]. Looks like Full Array Local Dimming has potential, although prices, especially for the G-Sync HDR 1440p/4K versions, are high...


----------



## Neykov

Quote:


> Originally Posted by *ozlay*
> 
> And 65'' is too large for 4k... 65'' should be 8k minimal.


People forget that these are monitors: 20" to 27", and 32" max, for 4K/8K.
Anything larger is not a monitor; it's a TV screen!


----------



## boredgunner

Quote:


> Originally Posted by *Neykov*
> 
> People forget that these are monitors: 20" to 27", and maybe 32" max, for 4K/8K.
> Anything larger is not a monitor; it's a TV screen!


There are 40" monitors. I don't know of any bigger than that though.


----------



## Neykov

Quote:


> Originally Posted by *boredgunner*
> 
> There are 40" monitors. I don't know of any bigger than that though.


A 40" will never be a monitor; I don't care what they call it. It's a TV screen to me, and I will never hurt myself with something like that.
Quote:


> Originally Posted by *boredgunner*
> 
> There are 40" monitors. I don't know of any bigger than that though.


They might exist, but I can't see myself sitting farther than a meter from a monitor, so to me that is not a monitor.


----------



## juano

Quote:


> Originally Posted by *MonarchX*
> 
> Hmm... maybe 384-zone local dimming isn't so bad. You can see some halo effect in this video [video], but it is in slow motion. In fast motion, you can barely notice any halo [video]. Looks like Full Array Local Dimming has potential, although prices, especially for the G-Sync HDR 1440p/4K versions, are high...


The second fast motion video you posted is the test video that you use to evaluate the dimming zones on a TV/monitor, not the actual test being done on a TV/monitor like the first video.


----------



## FreeElectron

Is there going to be a 40" version of this?
Or a 40" monitor that is similarly specced?


----------



## MonarchX

Quote:


> Originally Posted by *juano*
> 
> The second fast motion video you posted is the test video that you use to evaluate the dimming zones on a TV/monitor, not the actual test being done on a TV/monitor like the first video.


Thank you for the clarification. I forgot that this will be an IPS screen, and I think the first video's test was conducted on a VA panel, so the halo is likely to be more noticeable. Hopefully it transitions smoothly instead of showing up as abrupt squares with certain images...


----------



## ahmedmo1

I hope these screens get Dolby Vision and not just HDR 10. I also hope these HDR monitors aren't released any time soon. I don't have the restraint to not buy them.


----------



## boredgunner

Quote:


> Originally Posted by *ahmedmo1*
> 
> I hope these screens get Dolby Vision and not just HDR 10. I also hope these HDR monitors aren't released any time soon. I don't have the restraint to not buy them.


What use is Dolby Vision? It calls for Rec.2020 color space, 12-bit color depth, and peak brightness levels that today's display tech can't reach. Not to mention there are no 12-bit displays (anything 4k + HDR is running in 8-bit due to interface limitations), and there aren't even 100% DCI-P3 HDR displays yet let alone Rec.2020.
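
The interface limitation is plain arithmetic: uncompressed 4K at 144Hz with 10 bits per channel needs more bandwidth than DisplayPort 1.4 carries (about 25.92 Gbit/s of payload after coding overhead), which is why these panels have to fall back to 8-bit or chroma subsampling at full refresh. A rough sketch that ignores blanking overhead, so real requirements are somewhat higher:

```python
def data_rate_gbps(w, h, hz, bits_per_pixel):
    """Approximate video data rate in Gbit/s, ignoring blanking intervals."""
    return w * h * hz * bits_per_pixel / 1e9

DP14_GBPS = 25.92  # DisplayPort 1.4 HBR3 payload after 8b/10b coding

cases = {
    "144 Hz, 10-bit RGB (30 bpp)":   data_rate_gbps(3840, 2160, 144, 30),
    "144 Hz, 10-bit 4:2:2 (20 bpp)": data_rate_gbps(3840, 2160, 144, 20),
    "120 Hz, 8-bit RGB (24 bpp)":    data_rate_gbps(3840, 2160, 120, 24),
}
for name, rate in cases.items():
    print(f"{name}: {rate:.1f} Gbit/s, fits DP 1.4: {rate <= DP14_GBPS}")
```

Only the subsampled or reduced-refresh modes fit, which matches the point above about 4K HDR signals not running at full bit depth over today's interfaces.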


----------



## ahmedmo1

Quote:


> Originally Posted by *boredgunner*
> 
> What use is Dolby Vision? It calls for Rec.2020 color space, 12-bit color depth, and peak brightness levels that today's display tech can't reach. Not to mention there are no 12-bit displays (anything 4k + HDR is running in 8-bit due to interface limitations), and there aren't even 100% DCI-P3 HDR displays yet let alone Rec.2020.


I also hope they come with HDMI 2.1


----------



## t1337dude

Quote:


> Originally Posted by *boredgunner*
> 
> What use is Dolby Vision? It calls for Rec.2020 color space, 12-bit color depth, and peak brightness levels that today's display tech can't reach. Not to mention there are no 12-bit displays (anything 4k + HDR is running in 8-bit due to interface limitations), and there aren't even 100% DCI-P3 HDR displays yet let alone Rec.2020.


Well if it's the only HDR format available for the content you're viewing...


----------



## rvectors

Well, I'm not sure the manufacturers need any more excuses to add a price premium, thank you very much. It's already touted at anywhere between $1,200 and $2,500, so throw in a 12-bit panel and you're asking for trouble; why not add a self-adjusting, optimal-eye-distance positioning unit while you're at it? Besides, don't we need a card (consumer or otherwise) that supports 12-bit? Although I'm sure you get some quality improvement regardless.

I'd rather they release something that negates most of the things I've hated about IPS for the last 10 years, while improving the viewing experience quite a bit (subjectively).


----------



## Aristotelian

I'm pretty excited about this because I don't really see any competition in this market space on the horizon. I'm still using a Dell U2711 (if that's the model number) from 2010 or so that is 10 bit with great colour accuracy, 1440p, and 60Hz. But from the get go I missed the old days of my 100Hz + CRTs and now that I'm playing Overwatch a lot I want more speed.

I was originally eyeing the Dell 4k HDR OLED but apparently that was just vaporware and is not going to come out.

For a 27" monitor (the max for my desk), is there anything better than this coming out? HDR, high refresh rate, 4k - it ticked all the boxes for me.

Take my money later this year Asus, unless something better comes out. I'm just hoping it won't have terrible QC. And, on a monitor price point like this, it better not have a single dead or stuck pixel, a single issue with it, or yeah. There'll be hell to pay, hah.


----------



## Neykov

Quote:


> Originally Posted by *Aristotelian*
> 
> I'm pretty excited about this because I don't really see any competition in this market space on the horizon. I'm still using a Dell U2711 (if that's the model number) from 2010 or so that is 10 bit with great colour accuracy, 1440p, and 60Hz. But from the get go I missed the old days of my 100Hz + CRTs and now that I'm playing Overwatch a lot I want more speed.
> 
> I was originally eyeing the Dell 4k HDR OLED but apparently that was just vaporware and is not going to come out.
> 
> For a 27" monitor (the max for my desk), is there anything better than this coming out? HDR, high refresh rate, 4k - it ticked all the boxes for me.
> 
> Take my money later this year Asus, unless something better comes out. I'm just hoping it won't have terrible QC. And, on a monitor price point like this, it better not have a single dead or stuck pixel, a single issue with it, or yeah. There'll be hell to pay, hah.


Yeah, with those specs it has to be my next monitor;
the others don't make sense anymore.


----------



## mmms

Quote:


> Originally Posted by *Neykov*
> 
> I'm pretty excited about this because I don't really see any competition in this market space on the horizon. I'm still using a Dell U2711 (if that's the model number) from 2010 or so that is 10 bit with great colour accuracy, 1440p, and 60Hz. But from the get go I missed the old days of my 100Hz + CRTs and now that I'm playing Overwatch a lot I want more speed.
> 
> I was originally eyeing the Dell 4k HDR OLED but apparently that was just vaporware and is not going to come out.
> 
> For a 27" monitor (the max for my desk), is there anything better than this coming out? HDR, high refresh rate, 4k - it ticked all the boxes for me.
> 
> Take my money later this year Asus, unless something better comes out. I'm just hoping it won't have terrible QC. And, on a monitor price point like this, it better not have a single dead or stuck pixel, a single issue with it, or yeah. There'll be hell to pay, hah.


Quote:


> Originally Posted by *Neykov*
> 
> Yeah, with that specs at least have to be my next monitor,
> others doesn't make sense anymore.


Why? The upcoming Samsung gaming monitors have the same specs: VA + Quantum Dot + HDR + 144Hz + 1ms and FreeSync/G-Sync, and they'll cost roughly $1k.
The only difference between this Asus and the Samsung is that the Asus is 4K and the Samsung is 1440p.


----------



## ahmedmo1

Quote:


> Originally Posted by *mmms*
> 
> Why ? the same spec with upcoming samsung gaming monitors VA + Quantum Dot + HDR + 144hz +1ms and ( freesync/G-sync ) plus it will cost roughly $1k .
> The only difference between this Asus and samsung is Asus 4K and Samsung 2K .


If you're referring to the Samsung cf791 21:9 with hdr and 100 Hz, that monitor is quite inferior to this. It has 8 bit color, low max brightness, and poor back-lighting. So it isn't even really HDR.


----------



## mmms

Quote:


> Originally Posted by *ahmedmo1*
> 
> If you're referring to the Samsung cf791 21:9 with hdr and 100 Hz, that monitor is quite inferior to this. It has 8 bit color, low max brightness, and poor back-lighting. So it isn't even really HDR.


No, I don't mean the Samsung CF791. I'm referring to the CHG70, CHG75, and SHG50. Look at this:


https://www.reddit.com/r/5z4g7f/full_samsung_gaming_monitor_lineup_2017/


----------



## Malinkadink

Quote:


> Originally Posted by *mmms*
> 
> NO , i don't mean Samsung cf791 . I refer to CHG70 & CHG75 and SHG50 . Look this :-
> 
> 
> https://www.reddit.com/r/5z4g7f/full_samsung_gaming_monitor_lineup_2017/


The CHG70 & CHG75 both look really good: FreeSync or G-Sync, 1440p, VA, QD, HDR, probably for around $600, at least the FreeSync one. Is the SHG50 the same but flat? Because that'll be the one to get, IMO.


----------



## mmms

Quote:


> Originally Posted by *Malinkadink*
> 
> CHG70 & CHG75 both look really good, freesync or gsync 1440p VA QD HDR probably for around $600 at least the Freesync one. Is the SHG50 the same just flat? cause that'll be the one to get imo.


I agree with you. This is the perfect monitor for me. Curved or flat isn't a big issue for me.


----------



## Aristotelian

Quote:


> Originally Posted by *mmms*
> 
> Why ? the same spec with upcoming samsung gaming monitors VA + Quantum Dot + HDR + 144hz +1ms and ( freesync/G-sync ) plus it will cost roughly $1k .
> The only difference between this Asus and samsung is Asus 4K and Samsung 2K .


The Asus is 4K and an IPS monitor, no? So those are the two differences.

However, if the Asus comes in at EUR 2000 and the Samsung at EUR 999, and there is no quantitative difference in image quality or HDR implementation, I'll probably go with the Samsung. I'll have to wait for reviews, and hope TFT Central gets them out ASAP when the monitors are released.


----------



## KenjiS

Quote:


> Originally Posted by *Kinaesthetic*
> 
> AU Optronics. AHVA panel (IPS type).
> 
> As for incredible contrast, it probably won't hit OLED levels, but with 384 zones of FALD, it will have one heck of good contrast for an IPS type. And it is DCI-P3 compliant (100% sRGB) with HDR10 support.
> 
> So why on earth do you doubt it? They flat out gave the specs right there.


I'd expect the best contrast of any MONITOR on the market save OLEDs..

It's not likely even a VA will be as good (since none of them have FALD).

Probably the best IQ on a monitor, and considering that, the price tag isn't unreasonable to me...

-edit- And the new Samsung ones might be good competition, but still.. 4K might be nice to have for me...


----------



## rvectors

After having high hopes for the CF791, this time around I shall reserve judgement, but I don't think there will be much difference, at least for text clarity.


----------



## CallsignVega

Quote:


> Originally Posted by *Aristotelian*
> 
> The Asus is 4K and an IPS monitor, no? So those are the two differences.
> 
> However, if the Asus comes in at EUR 2000 and the Samsung at EUR 999, and there is no quantitative difference in image quality or HDR implementation, I'll probably go with the Samsung one. I'll have to wait for reviews, and hope tftcentral get them out asap when the monitors are released.


Not comparable. The ASUS/Acer 4K 144 Hz monitors not only have G-Sync, but they are 1000 nit FALD backlights. Blows anything Samsung has announced out of the water.


----------



## Baasha

Hopefully they'll have examples of these monitors at E3. Oh wait, it's ASUS, never mind.


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> Not comparable. The ASUS/Acer 4K 144 Hz monitors not only have G-Sync, but they are 1000 nit FALD backlights. Blows anything Samsung has announced out of the water.



Don't forget VA has better blacks and a higher contrast ratio than IPS.
So I think the difference will be very slight between IPS + HDR + Quantum Dot + 1000-nit FALD backlight vs. VA + HDR + Quantum Dot.

Is the Asus worth $2k versus $1k for the Samsung? For me, no.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> NO , i don't mean Samsung cf791 . I refer to CHG70 & CHG75 and SHG50 . Look this :-
> 
> 
> __
> https://www.reddit.com/r/5z4g7f/full_samsung_gaming_monitor_lineup_2017/
> 
> Don't forget VA has better blacks and great contrast ratio than IPS .
> So i think the difference will be very slight between IPS + HDR + Quantum Dot + 1000 nit FALD backlights VS VA + HDR + Quantum Dot .
> 
> Is it worth $2k for Asus than $1k for samsung ? For me , NO .


The 4k 144 Hz IPS monitors will actually have better blacks and contrast due to the FALD, at the cost of some amount of haloing.
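The blacks-vs-haloing trade-off is easy to see with a toy model. All numbers here are illustrative assumptions (a 1000:1 native IPS contrast, and a zone backlight driven to the brightest pixel it covers), not measurements of this panel:

```python
# Toy model of why FALD deepens blacks but causes haloing.
# Assumptions: 1000:1 native IPS contrast, each zone's backlight driven
# to the brightest pixel it covers. Numbers are illustrative only.

NATIVE_CONTRAST = 1000  # typical IPS panel contrast ratio

def displayed_black(zone_backlight_nits):
    """Darkest level a zone can show: backlight leaking through the LC layer."""
    return zone_backlight_nits / NATIVE_CONTRAST

# A 1000-nit highlight in one zone, black content everywhere else:
lit_zone_black = displayed_black(1000)  # black pixels sharing a zone with it
dark_zone_black = displayed_black(5)    # a zone dimmed to near-off

print(f"black next to highlight: {lit_zone_black:.3f} nits (the 'halo')")
print(f"black in a dimmed zone:  {dark_zone_black:.3f} nits")
```

Black pixels that share a zone with a bright highlight get the full backlight behind them, so they glow; zones with no bright content can be dimmed far below what a global backlight allows.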


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> Interesting. No full array local dimming though. Assuming it has no scary overdrive artifacts like the CFG70, I'll only get it if I can't justify the price of the 4k 144 Hz monitors at the end of the year.
> The 4k 144 Hz IPS monitors will actually have better blacks and contrast due to the FALD, at the cost of some amount of haloing.


Yes, the 4K 144 Hz IPS monitors will actually have better blacks and contrast, but $2k is a very high price. I'd get one at between $1500 and $1600.

I bet that if we saw the Asus and the Samsung side by side, you couldn't point out a big difference between them by eye, especially while gaming. I still see many players happy with their TN panels.
So I think this upcoming Samsung 1440p VA + HDR + Quantum Dot + 144Hz + G-Sync would be suitable for many players until we can buy OLED panels at low prices in the near future.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Yes , The 4k 144 Hz IPS monitors will actually have better blacks and contrast but $2k is a very high price . I'll get it between 1500$ to 1600$ .
> 
> I bet you if we saw the two of them Asus and Samsung and u can tell me the big difference between them especially with gaming by the human eye . I still see many of players happy with their Tn panels .
> So , I think this upcoming samsung 2k + VA + HDR + Quantum Dot + 144hz + G-sync would be suitable for many players until we can buy oled panels with low prices in the near future .


My budget is similar to yours then. If it's still $2000 by the year's end, and if the Samsung isn't FUBAR, then Samsung it is.

Personally I'd be able to discern the difference. 3000:1 contrast isn't superb, and if playing an HDR game, the peak brightness of the 4k 144 Hz monitors will stand out more. Too bad all of these monitors are bound to be matte.


----------



## sblantipodi

Quote:


> Originally Posted by *CallsignVega*
> 
> Not comparable. The ASUS/Acer 4K 144 Hz monitors not only have G-Sync, but they are 1000 nit FALD backlights. Blows anything Samsung has announced out of the water.


This must be proven before talking it up.
Specs are nothing without good quality control, and we all know that Asus monitors have some of the worst quality control in history.
Leave alone that those specs say nothing about image quality: dE before and after calibration, the deviation from 2.2 gamma, the deviation from 6500K temperature, the ability to correctly calibrate the monitor to a desired cd/m², and so on.


----------



## sblantipodi

Quote:


> Originally Posted by *mmms*
> 
> Don't forget VA has better blacks and great contrast ratio than IPS .
> So i think the difference will be very slight between IPS + HDR + Quantum Dot + 1000 nit FALD backlights VS VA + HDR + Quantum Dot .
> 
> Is it worth $2k for Asus than $1k for samsung ? For me , NO .


I hope Asus doesn't make the mistake of launching a gaming monitor at $2,000.


----------



## Seyumi

Honestly at 27" the Samsung 1440p 144hz Gsync Curved VA HDR is looking more appealing:

1. No horrible windows / games DPI scaling
2. No way you're going to hit 144 fps at 4K, even with 2x Titan X Pascals on max settings, so you'll get fewer FPS than at 1440p.
3. Probably much cheaper
4. Curved monitor
5. VA looks much better for gaming vs IPS but this could be a wash due to the FALD dimming zones
6. Less system heat/noise due to less strenuous work

Now if this monitor was 32"+ I'd gun for the Asus/Acer 4k


----------



## boredgunner

Quote:


> Originally Posted by *Seyumi*
> 
> Honestly at 27" the Samsung 1440p 144hz Gsync Curved VA HDR is looking more appealing:
> 
> 1. No horrible windows / games DPI scaling
> 2. No way you're going to hit 144fps on 4K even with 2x Titan X Pascals on max settings so less FPS than 1440p.
> 3. Probably much cheaper
> 4. Curved monitor
> 5. VA looks much better for gaming vs IPS but this could be a wash due to the FALD dimming zones
> 6. Less system heat/noise due to less strenuous work
> 
> Now if this monitor was 32"+ I'd gun for the Asus/Acer 4k


1. Many games will be fine. Maybe all, depends on your eyesight I guess.
2. Not now in modern AAA games, but I can do 4k 144 FPS in many games I play, and most of us would keep the monitor for multiple GPU upgrades.
4. Not everyone prefers curved.
5. Yes, the 4k 144 Hz IPS should look better due to the FALD. It'd shatter Samsung's 3000:1 contrast. I'd like to see a dynamic contrast measurement for the 4k 144 Hz monitors, but we'll see how it measures sooner or later.
6. Meh, and I care a lot about silence, but I expect little difference here.


----------



## drfouad

is it out yet?


----------



## boredgunner

Quote:


> Originally Posted by *drfouad*
> 
> is it out yet?


Nope. ASUS said June 2017, which probably means Q4 this year or Q1-Q2 next.


----------



## Drebinx

Asus says Q3 2017... I'm gonna go with Q4 or Q1 2018.

Really hope the panel is something decent. I don't have space for a 55" OLED, and it sucks that PC monitors are just so far behind in picture quality.


----------



## mmms

Quote:


> Originally Posted by *Seyumi*
> 
> Honestly at 27" the Samsung 1440p 144hz Gsync Curved VA HDR is looking more appealing:
> 2. No way you're going to hit 144fps on 4K even with 2x Titan X Pascals on max settings so less FPS than 1440p.


Why do you need to hit between 120 and 144 fps with G-Sync to notice the smoothness of a 144Hz monitor?
I've seen many players run 1440p 144Hz with G-Sync at max settings, at between 70 and 100 fps in some games, and they didn't face any problems while playing.
I think the same applies to this Asus and the Acer. Why should we have to hit 144 fps on this 4K 144Hz Asus, when 100 to 120 fps with G-Sync would be enough, just as it is on 1440p 144Hz gaming monitors?

Quote:


> Originally Posted by *boredgunner*
> 
> 4. Not everyone prefers curved.


What are the differences between a flat and a curved gaming monitor at this size (27")?
Why do you prefer flat over curved?
I think at 27", flat or curved won't be a big deal, especially for gaming.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Why even notice smooth gameplay with 144hz monitor , you should hit between 120 fps to 144 fps with G-sync to notice this smoothness ?
> I saw many of players play with 1440p 144hz and G-sync with max settings between 70 fps to 100 fps in some games , and they didn't face any problems during playing .
> I think the same thing with this Asus and Acer . Why should we hit 144 fps with this Asus 4k 144hz as long as between 100 fps to 120 fps with G-sync would be enough such as 2k 144hz gaming monitors ?
> 
> What is the differences between Flat and Curved gaming monitor with this size 27'' ?
> Why do u prefer Flat than curved ?
> I think with this size (27'') Flat or Curved won't be a big problem especially with gaming life .


Yes, even 70-100 FPS with VRR demolishes 60 FPS. No arguments there.

As for flat vs curved, honestly I've never seen a curved 27" so I can't comment. It is all user preference, it also depends on how steep the curve is.


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> Yes, even 70-100 FPS with VRR demolishes 60 FPS. No arguments there.
> 
> As for flat vs curved, honestly I've never seen a curved 27" so I can't comment. It is all user preference, it also depends on how steep the curve is.


Thank you very much, bro. I'm one of the people who comes to Overclock just to see your helpful responses.

We will see this Asus, the Acer, and the Samsung, and then we'll decide which of them is best, or rather whether this Asus is worth the extra $1k over the Samsung.


----------



## CallsignVega

You guys do realize the 144 Hz 4K FALD displays are the only ones announced that actually are capable of the 1000 nit LCD HDR spec right? All of these other "HDR" monitors are just regular crap edge lit ~350 nit displays with "HDR" label tacked on.


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> You guys do realize the 144 Hz 4K FALD displays are the only ones announced that actually are capable of the 1000 nit LCD HDR spec right? All of these other "HDR" monitors are just regular crap edge lit ~350 nit displays with "HDR" label tacked on.


What do you make of this? Is a 1000-nit FALD backlight worth it at this size for a gaming monitor?
I think it matters more for TVs between 50 and 65 inches than for 27-inch gaming monitors.


----------



## t1337dude

Quote:


> Originally Posted by *mmms*
> 
> How u find this case ? Is 1000 nit FALD backlights worth it with this size for gaming monitor ?
> I think it is important in professional TV monitors between 50-65 inch than gaming monitors with 27 inch .


Why the heck do you think the size of the screen matters?


----------



## Aristotelian

Whether it's Q3 2017 or Q1 2018 doesn't matter to me, since this new build is turning out to be a long-term project, and if Q1 2018 (release of the monitor) coincided with the release of the Volta Titan, well... yeah.

What I'm concerned about is quality control. I won't mind paying EUR 2000 for the Asus monitor IF it is a winner, and it better be at that price point. I just hope it gets released, unlike that Dell OLED monitor I was dreaming about for months...


----------



## Asmodian

Quote:


> Originally Posted by *mmms*
> 
> How u find this case ? Is 1000 nit FALD backlights worth it with this size for gaming monitor ?
> I think it is important in professional TV monitors between 50-65 inch than gaming monitors with 27 inch .


HDR is only HDR if the dynamic range is high. With a normal edge-lit backlight, the dynamic range in brightness can't be any higher than that backlight allows; per-zone dimming is what lets dark regions stay dark while highlights go bright.


----------



## rvectors

All we know so far is that it will be 'HDR-like'; it seems each manufacturer/Nvidia has, as they always do, decided to interpret the standard differently. If you've seen the few videos available showing the new ASUS, I don't care what they call it, or indeed whether it's true HDR10 etc.: the picture quality looked great, and it didn't exhibit BLB or glow, my two worst gripes with today's IPS panels (apart from poor QA). There was some haloing, but nothing I would mind as a compromise to remove the previous bugbears.


----------



## Egzi

If this monitor has 1000 nits with HDR, won't that destroy your eyes up close? I remember having sore eyes after playing HDR games on my Samsung KS8000, which I was sitting a good distance from.


----------



## Asmodian

Quote:


> Originally Posted by *rvectors*
> 
> All we know so far, is it will be 'HDR like', it seems each manufacturer/nvidia has, as they always do, decided to interpret the standard differently. If you've see the few videos available showing the new ASUS, I don't care what they call it, or if it indeed whether it's true HDR 10 etc, the picture quality looked great, it didn't exhibit BLB or glow. My two worse gripes with today's IPS panels (apart from poor QA). There was some haloing but nothing I would mind as a compromise to remove the previous bugbears.


BT.2020 has to be interpreted for every panel (10,000 nits, anyone?) and there doesn't seem to be a good spec for how to do it. This is true of every HDR-capable display, and it's why normal-backlight displays call themselves HDR: they have a way of mapping HDR content to what they are capable of. Of course this mapping ends up producing SDR-range output, but because they don't look terrible when given HDR input, they can call themselves HDR.

Is HDR10 really a complete spec? It seems like all you need is 1000+ nits and DCI-P3; there is nothing about exactly how the tone mapping should be done. If you call this monitor 'HDR-like', you have to call everything 'HDR-like'.
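For the curious: the one piece of HDR10 that is nailed down precisely is the PQ transfer function (SMPTE ST 2084), which maps code values to absolute luminance up to 10,000 nits. What to do when the decoded value exceeds the panel's peak is left to the vendor. A minimal sketch, where the hard clip is a deliberately crude stand-in for whatever proprietary tone mapping a given display actually ships:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a PQ-encoded signal value in [0, 1] to absolute luminance (nits)."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def hard_clip(nits, panel_peak):
    """Crudest possible 'tone mapping': clip everything above the panel's peak."""
    return min(nits, panel_peak)

signal = 0.9  # a bright PQ code value
content_nits = pq_to_nits(signal)
print(f"PQ {signal:.2f} encodes {content_nits:.0f} nits; "
      f"a 1000-nit panel shows {hard_clip(content_nits, 1000):.0f}, "
      f"a 350-nit panel shows {hard_clip(content_nits, 350):.0f}")
```

The decode step is identical on every HDR10 display; everything after it, i.e. how gracefully the panel compresses out-of-range highlights instead of clipping them, is exactly the unstandardized part being discussed above.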


----------



## boredgunner

Quote:


> Originally Posted by *Egzi*
> 
> If this monitor has 1000nits with HDR, wont that destroy the eyes up close? I remember having sore eyes after playing HDR games on my Samsung Ks8000 which I was sitting a good distance from.


Yeah it won't be good for your eyes, possibly painful at times especially in dark rooms. I want to try it though.


----------



## rvectors

Quote:


> Originally Posted by *Asmodian*
> 
> is HDR10 really a complete spec? It seems like all you need is 1000+ nits and DCI-P3, there is nothing about exactly how the tone mapping should be done. If you call this monitor 'HDR like' you have to call everything 'HDR like'.


I don't disagree with you; my point wasn't about being technically accurate to one standard or another, but rather about accepting, so far at least, the vaguely touted 'support' for HDR and its different interpretations (G-SYNC HDR, FreeSync 2, which is stricter, I believe). I'd rather they bring this to market sooner, when it does actually improve upon the major problems with PC monitors, than develop something years down the line because it faithfully follows a particular standard to the letter.


----------



## Asmodian

Quote:


> Originally Posted by *rvectors*
> 
> I don't disagree with you, my point wasn't about being technically accurate to one standard or another but rather accepting, so far at least, the vaguely touted 'support' for HDR, and their different interpretations (G-SYNC HDR, Freesync 2, which is stricter I believe). I'd rather they bring this to market sooner, when it does actually improve upon the major problems with PC monitors, than develop something years down the line, because it faithfully follows a particular standard to the letter.


Sorry, I should have made it clearer but I agree with you completely, except for the 'HDR like' comment. At this point everything is 'HDR like' or nothing is. No one is particularly strict about how HDR works, nothing I have seen of Freesync 2 has solid specs around how HDR tone mapping works or exactly what the minimum peak brightness and gamut are.

Dolby Vision is probably the most strict specification but they are only strict in that they have to be the ones to configure it, it is still possible to get a Dolby Vision HDR capable display with a somewhat low max brightness and a somewhat limited gamut. Even mastering content for HDR hasn't been properly standardized yet, there aren't any displays for studios that can display full BT.2020 so they have to compromise and they don't all compromise the same way.

It is also sad how real HDR is incompatible with current OLED technology, it will take a breakthrough to get OLED quality displays with a peak brightness near 1000 nits, let alone higher.

I thought a FALD monitor would never come out and we would be stuck with edge-lit monitors forever.

This display is on my "buy as soon as available" list.


----------



## CallsignVega

OLED displays only have to hit 500 nits to be considered "HDR". OLED's infinite contrast and perfect blacks don't require the much higher nits that LCD needs for HDR. IMO, OLED HDR is a superior experience to LCD HDR.

Another crazy thing is that 2017 TV sets are going away from FALD and back to edge-lit. Probably for better profits; it's sickening.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Another crazy thing is that 2017 TV sets are going away from FALD and back to edge lit. Probably for better profits, it's sickening.


Wow, so the TV industry will be taking a big backwards step? Sickening indeed...


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> OLED displays have to hit 500 nits to be considered "HDR". It's because OLED infinite contrast and perfect blacks doesn't require much higher nits like LCD does for HDR. IMO OLED HDR is a superior experience to LCD HDR.
> 
> Another crazy thing is that 2017 TV sets are going away from FALD and back to edge lit. Probably for better profits, it's sickening.


You're right about OLED HDR vs. LCD HDR. But let's consider the upcoming LCD HDR gaming monitors this year:

I think the upcoming Samsung CHG70, CHG75, and SHG50 don't need 1000 nits the way the Asus PG27UQ and Acer XB272-HDR do, because VA has higher contrast and better blacks than the IPS in those monitors. The panel difference (VA vs. IPS) is the main reason to aim for 1000 nits, or higher, or lower.

Sometimes increasing the dose of a drug leads to death, not healing.


----------



## aberrero

Quote:


> Originally Posted by *CallsignVega*
> 
> OLED displays have to hit 500 nits to be considered "HDR". It's because OLED infinite contrast and perfect blacks doesn't require much higher nits like LCD does for HDR. IMO OLED HDR is a superior experience to LCD HDR.
> 
> Another crazy thing is that 2017 TV sets are going away from FALD and back to edge lit. Probably for better profits, it's sickening.


If LG had better marketing other manufacturers would be out of business. OLED is so much better than anything else out there right now.


----------



## dboythagr8

June 2017? So Nov-Dec 2017, most likely 2018.

My limit for this is $1,600.

I may have to find some solution in the interim, because my 2x 1080 Tis are being starved at 1440p.


----------



## boredgunner

Quote:


> Originally Posted by *dboythagr8*
> 
> June 2017? So Nov-Dec 2017, Most likely 2018
> 
> My limit for this is $1,600
> 
> I may have to find some solution in the interim because my 2x 1080tis are being starved at 1440p


Look at the bright side: 2x GTX 1080 Ti SLI will likely allow you to use 120 Hz or 144 Hz blur reduction at 1440p in many games. This is why I'm not totally against going with that Samsung 1440p 144 Hz G-SYNC monitor in the future, since my budget is roughly the same as yours and these 4k 144 Hz monitors might go beyond that.


----------



## Baasha

Quote:


> Originally Posted by *boredgunner*
> 
> Nope. *ASUS said June 2017, which probably means Q4 this year or Q1-Q2 next.*


This should be stickied.. lel... if we see this monitor on the market before Dec. 2017, I'd be surprised.


----------



## tconroy135

Quote:


> Originally Posted by *Baasha*
> 
> This should be stickied.. lel... if we see this monitor on the market before Dec. 2017, I'd be surprised.


Acer is going to release the same panel, I'm sure, and much sooner.


----------



## boredgunner

Quote:


> Originally Posted by *tconroy135*
> 
> Acer is going to release the same panel, I'm sure, and much sooner.


Yeah, they're usually first. Acer's is called XB272-HDR.


----------



## rvectors

Quote:


> Originally Posted by *boredgunner*
> 
> Yeah, they're usually first. Acer's is called XB272-HDR.


With an AI that scrubs any video of its existence from the internet... I've watched a number of videos from CES, and the reviewers said the Acer monitor was there, but they never took any photographic evidence. Even the Nvidia video showed the wrong monitor when they were talking about the HDR panels. I guess it must be under NDA or something, but I do hope Acer 'springs' a surprise, as in spring 2017!


----------



## Oubadah

..


----------



## ToTheSun!

Quote:


> Originally Posted by *Oubadah*
> 
> Is there any reason to believe that the Samsung monitor won't be another bug ridden piece of junk, devoid of QC, like the rest of the IPS G-Sync monitors?


Plenty, actually. Firstly, Samsung builds the monitors in their entirety, which allows them to more closely assure quality. But that alone doesn't mean they'll do it. However, based on their latest batch of monitors, their QC problems are more pervasive on the software side of things, with various issues with Freesync and some aggressiveness on overdrive to keep response times low. With G-sync, this would be ameliorated, and you'd still get a calibration report from factory, a lower tendency to get dead pixels and dust behind the coating, and much better strobing with their multiple step strobed backlights.


----------



## rvectors

Quote:


> Originally Posted by *Oubadah*
> 
> Is there any reason to believe that the Samsung monitor won't be another bug ridden piece of junk, devoid of QC, like the rest of the IPS G-Sync monitors?
> 
> I need a monitor so bad, but every time I start researching one I run into an endless bug thread and/or an immense volume of user reports of what I consider to be totally unacceptable uniformity issues.


Tell me about it. I think I've already said somewhere that I'm still on a Dell 17, which must be 10 years old now: no BLB, dead pixels, glow, or uniformity problems. I did try some expensive alternatives, but they came with all the issues we love to hate... the kicker: when I calibrated the screens, they weren't any more accurate than the old panel!

As for Samsung, this was a real disappointment. The concern for me, apart from the colour shift/FreeSync problems and bleed (on a VA, for god's sake), was the dead pixels that would crop up after only a short time of use (CF791).

Maybe they rushed the releases, but I don't have as much faith as ToTheSun!. If the CHG75 ends up with G-SYNC HDR, I'd be more hopeful.

I'm not sure, but was the poor text clarity on the CF791 just because it was 3440x1440 at 34 inches, or because it was VA?


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> Plenty, actually. Firstly, Samsung builds the monitors in their entirety, which allows them to more closely assure quality. But that alone doesn't mean they'll do it. However, based on their latest batch of monitors, their QC problems are more pervasive on the software side of things, with various issues with Freesync and some aggressiveness on overdrive to keep response times low. With G-sync, this would be ameliorated, and you'd still get a calibration report from factory, a lower tendency to get dead pixels and dust behind the coating, and much better strobing with their multiple step strobed backlights.


This. The main issues to be concerned of are its response times and overdrive implementation.


----------



## BoredErica

Quote:


> Originally Posted by *mmms*
> 
> Why ? the same spec with upcoming samsung gaming monitors VA + Quantum Dot + HDR + 144hz +1ms and ( freesync/G-sync ) plus it will cost roughly $1k .
> The only difference between this Asus and samsung is Asus 4K and Samsung 2K .


What about local dimming though?

Quote:


> Originally Posted by *CallsignVega*
> 
> You guys do realize the 144 Hz 4K FALD displays are the only ones announced that actually are capable of the 1000 nit LCD HDR spec right? All of these other "HDR" monitors are just regular crap edge lit ~350 nit displays with "HDR" label tacked on.


I thought the Asus, at least, having local dimming means it's not edge lit, and I read on a 144Hz website that the Acer has HDR10, which to my knowledge is an actual HDR spec.


Quote:


> Originally Posted by *boredgunner*
> 
> Nope. ASUS said June 2017, which probably means Q4 this year or Q1-Q2 next.


----------



## boredgunner

Quote:


> Originally Posted by *Darkwizzie*
> 
> What about local dimming though?


If the Samsungs do have local dimming, it will be edge lit, like all of their local-dimming TVs. I don't consider edge-lit local dimming worthwhile, based on my experience with my 2014 Vizio TV (I don't remember the model) and the Samsung JS8500.


----------



## Ripple

I apologize if this has already been asked. So in order to use the PG27UQ at 144Hz, I would have to upgrade to an NVIDIA GeForce GTX 10 Series card? I believe you need DisplayPort 1.4 to output 144Hz at 4K using Display Stream Compression. Thanks


----------



## CeeeJaaay

Quote:


> Originally Posted by *Ripple*
> 
> I apologize if this has already been asked. So in order to use the PG27UQ at 144Hz, I would have to upgrade to an NVIDIA GeForce GTX 10 Series card? I believe you need DisplayPort 1.4 to output 144Hz at 4K using Display Stream Compression. Thanks


Yes.


----------



## pez

Quote:


> Originally Posted by *Oubadah*
> 
> Is there any reason to believe that the Samsung monitor won't be another bug ridden piece of junk, devoid of QC, like the rest of the IPS G-Sync monitors?
> 
> I need a monitor so bad, but every time I start researching one I run into an endless bug thread and/or an immense volume of user reports of what I consider to be totally unacceptable uniformity issues.


I'm not sure where you're from, but if you're in the US, Amazon makes it pretty easy to get around most of the initial QC issues you'd find. With things like the x34, if you're not seeing any issues within the first 30 days, there's a good chance you won't see any after that. Not to say I haven't seen reports that say otherwise (I've seen various things on Reddit), but it's not a reason to be scared to buy a monitor.


----------



## Oubadah

..


----------



## BoredErica

Quote:


> Originally Posted by *boredgunner*
> 
> If the Samsungs will have local dimming, then it will be edge lit, like all of their local dimming TVs. I don't consider edge local dimming worthwhile, based on my experience with my 2014 Vizio TV (don't remember the model) and the Samsung JS8500.


I'm still looking at 1440p, and for a top-end 1440p monitor it looks like Samsung is the one to go with.

I know this has been repeated many times in this thread alone... but I'm curious to see what level of QC we're gonna get from Asus this time around.


----------



## pez

Quote:


> Originally Posted by *Oubadah*
> 
> I'm in New Zealand, so there's barely anywhere that offers no-questions-asked returns without heavy restocking fees, and even fewer that offer them for a reasonable time-frame. These days you can easily be looking at 5 RMAs just to get 1 unit without glaring defects, so that's why I try to stick to Dell and HP who offer hassle-free returns with paid shipping and advanced replacements. Buying other brands here involves going through some smaller retailer/etailer, meaning a lot of time and/or money wasted on returns and fights with staff who "can't replicate" the issues.


I definitely understand the fear of the IPS panels from Acer and Asus then. It's the reason I specifically went with Amazon on my x34.


----------



## CallsignVega

Quote:


> Originally Posted by *Oubadah*
> 
> I'm in New Zealand, so there's barely anywhere that offers no-questions-asked returns without heavy restocking fees, and even fewer that offer them for a reasonable time-frame.


That's the price you pay for living in a far away beautiful land with the Hobbits!


----------



## Reliantdan

Has anyone got confirmation on the price of these? You know, for Q1 2020?

I have two GTX 1080s in SLI powering three "older" Asus VG248QEs in portrait, and now I'm looking for an upgrade. I believe I copied an old setup of CallsignVega's back in the day, but these bezels are a little too thick compared to the new monitors. So I'm looking at either one really nice monitor or three again with smaller bezels; I'm not sure I want to wait until next year.


----------



## Aristotelian

No confirmation of price or release date. I think Asus themselves said Q3 2017 but I highly doubt they'll make that release date. The Dell OLED 4k monitor also appears to have disappeared. I think higher end monitors and the quality control issues that seem to plague them (at least in the gaming sphere) mean that people will think twice before spending EUR 2000 on one of these unless they can be damned sure that it's going to live up to expectations. And, with these stats, expectations are indeed high.

I've been checking AVS Forum and other forums for an HDR TV purchase, and some posters there swear by the dimmable LED Sony HDR TVs, for example. So I'm rather hopeful that this monitor hits it out of the park: HDR10, high refresh rate, IPS, dimmable zones, etc.


----------



## Sedolf

Other than the 49'' Sony X900E, is there even a TV this year below 50'' with 4:4:4, FALD, and HDR?

http://ca.rtings.com/tv/reviews/sony/x900e
https://www.avforums.com/review/sony-kd65xe9005-x90e-tv-review.13437
https://hdtvpolska.com/sony-xe9005-kd-55xe9005-test-android-tv-2017-4k-hdr/


----------



## rvectors

LG's 32UD99 coming in under $1,000 makes a decent case for the ASUS/Acer models landing a fair bit below $2,000-2,500. Yes, it doesn't have high refresh or G-SYNC, but it's 32 inches and supposedly supports HDR10. My guess has always been nearer $1,600 when they actually release them.


----------



## boredgunner

Quote:


> Originally Posted by *rvectors*
> 
> LG's 32UD99 coming in under $1000, handily creates a case for the ASUS/ACER models being a fair bit lower than $2000-2500. Yes it doesn't have high refresh or G-SYNC but it's 32 inch and supposedly with support for HDR10. My guess has always been nearer to $1600 when they actually come to release it.


It doesn't create much of a case, since it's just a wide-gamut IPS monitor. Nothing has suggested it has local dimming, so it probably doesn't, which makes HDR all but useless on it. And it's 60 Hz, like you said.

The FreeSync 2 versions of the PG27UQ and XB272-HDR will be well under $2000.


----------



## CallsignVega

Even though the LG is a beautiful monitor, people really aren't going to cross shop fixed 60 Hz 4K monitors with G-Sync 144 Hz 4K FALD monitors.


----------



## LunaTiC123

Quote:


> Originally Posted by *boredgunner*
> 
> It doesn't create much of a case since it is just a wide gamut IPS monitor, nothing has suggested it has local dimming so it probably doesn't have it which means HDR is completely useless on it. Then it's 60 Hz like you said.
> 
> The FreeSync 2 versions of the PG27UQ and XB272-HDR will be well under $2000.


Idk man, aren't the FreeSync versions of G-Sync monitors, using the same panels and features, $150-200 cheaper at most? At least in Europe; not sure about the US.


----------



## Aristotelian

Quote:


> Originally Posted by *Sedolf*
> 
> Other than the 49'' Sony X900E Is there even a TV this year that is below 50'', with 4:4:4, FALD and HDR?
> 
> http://ca.rtings.com/tv/reviews/sony/x900e
> https://www.avforums.com/review/sony-kd65xe9005-x90e-tv-review.13437
> https://hdtvpolska.com/sony-xe9005-kd-55xe9005-test-android-tv-2017-4k-hdr/


Not to my knowledge, no. But I was looking at reviews of the Sony XBR75X940D and yes, it's large - but it is HDR. And apparently HDR content on this set looks stunning. It is FALD + HDR but you might know more about the specifics than I do.

Basically, if this Asus monitor is FALD + HDR10 + 4:4:4 and 144 Hz, with good quality control, I'll be picking one up (and I'm fairly elastic about cost if the quality is good). Since I've seen TVs with this technology and great picture quality, I'm hoping this monitor can live up to the hype by extension; I'm unsure if that's a fair extrapolation to make.

Edited to add: I live in a small island country where returning a monitor like this to Amazon or wherever takes so much time that I really need to be reassured about the quality control before I even buy a monitor like this.


----------



## Excession

What I don't get is why they haven't announced a 4K 120 Hz monitor _without_ the FALD/HDR. That would presumably sell for substantially less than this will, and would thus be affordable to a lot more people.
Quote:


> Originally Posted by *Aristotelian*
> 
> Basically, if this Asus monitor is FALD + HDR 10 + 4:4:4, and 144 hz - with good quality control, I'll be picking one up (and I'm fairly elastic about cost if the quality is good). Since I've seen tvs with the technology and great picture quality I'm hoping this monitor can live up to the hype by extension - unsure if that's a fair extrapolation to make.


It's not 4:4:4. DisplayPort 1.4 doesn't actually have any more bandwidth than 1.3. The only reason it can do 4K _and_ HDR _and_ 120 Hz at the same time is because it has a "visually lossless" compression scheme for 4:2:0 content.


----------



## CallsignVega

Quote:


> Originally Posted by *Excession*
> 
> What I don't get is why they haven't announced a 4K 120hz monitor _without_ the FALD/HDR. That would presumably sell for substantially less than this will, and would thus be affordable to a lot more people.
> It's not 4:4:4. Displayport 1.4 doesn't actually have any more bandwidth than 1.3. The only reason it can do 4K _and_ HDR _and_ 120hz at the same time is because it has a "visually lossless" compression scheme for 4:2:0 content.


Monitors don't use YCbCr signals, they use RGB. But I am curious to know how DSC will compress RGB to fit within the bandwidth envelope.

https://www.vesa.org/featured-articles/vesa-updates-display-stream-compression-standard-to-support-new-applications-and-richer-display-content/

It only talks about the YCbCr 4:2:0 and 4:2:2 video formats commonly used in digital TVs, although it briefly mentions 3:1 compression with RGB.


----------



## rvectors

@CallsignVega,@Boredgunner:

I don't disagree, but I don't specifically mean a direct effect, i.e. this monitor competing with the ASUS for the exact same customers. I mean that the mere introduction of a decent-spec 32-inch HDR-'capable' monitor at a good price point will have a general influence on prices. With more to follow, and given that this monitor won't be out for a while, that should feed into the release price.

I'm pretty sure we won't see $2,000 for the eventual release price.


----------



## Dragonsyph

Is there even a cable that can run this monitor, or are you going to have to use more than one?


----------



## boredgunner

Quote:


> Originally Posted by *LunaTiC123*
> 
> idk man, aren't freesync versions of gsync monitors using the same panels and features 150-200$ cheaper at most? atleast in europe not sure about the US


That's probably what the price difference will be, yeah.
Quote:


> Originally Posted by *Dragonsyph*
> 
> Is there even a cable that can run this monitor? Or are you gonna have to use more then 1?


DisplayPort 1.4


----------



## mmms

Can DP 1.4 even do 4K @ 144 Hz with 10-bit color?

And what sort of GPU power do you need to drive 4K @ 144 Hz with HDR?


----------



## guttheslayer

Quote:


> Originally Posted by *mmms*
> 
> Can DP 1.4 even do 4K @ 144hz with 10 bit color ?
> 
> And what sort of GPU power do you need to power 4K @ 144hz with HDR ?


A pair of 1080 Tis minimum, for sure.

If you want a single GPU, you'll need to wait for the big Titan Volta.


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> Can DP 1.4 even do 4K @ 144hz with 10 bit color ?
> 
> And what sort of GPU power do you need to power 4K @ 144hz with HDR ?


It's going to be 8-bit color using DSC compression.


----------



## Malinkadink

Quote:


> Originally Posted by *boredgunner*
> 
> That's probably what the price difference will be, yeah.
> DisplayPort 1.4


I don't think the price difference will be a mere $100-200. The higher up the ladder you go with G-Sync displays, the more they charge in premiums, because it's top of the line and it has G-Sync; the ultrawides, for instance, are all over a grand, while FreeSync variants of similar ultrawides can be had for as much as $400 less. I'd expect at most $500 less for a FreeSync version of these 4K 144 Hz monitors. NVIDIA likes to gouge a bit more when a monitor is more premium.


----------



## mmms

After watching many TV screens that use full-array local dimming (FALD), what is the best option for a 4K TV or gaming monitor?

1) 4K IPS + HDR + FALD.

2) 4K VA + HDR + FALD.

Is there a big difference between IPS and VA with FALD?


----------



## boredgunner

Quote:


> Originally Posted by *Malinkadink*
> 
> I don't think the price difference will be a mere $100-200. The higher up the ladder you go as far as gsync displays are concerned the more they'll charge for them in premiums because its top of the line and it has gsync ie the ultrawides are all over a grand. Freesync variants of similar ultrawides can be had for as much as $400 less. I'd expect to see at most $500 less for a freesync version of these 4k 144hz monitors. Nvidia likes to gouge a bit more when a monitor is more premium.


That makes sense. If that turns out to be true then it's good news for AMD users.
Quote:


> Originally Posted by *mmms*
> 
> After watching many of TV screens which use this great feature ( FALD ) (full array local dimming) , What is the best option for 4k TV or Gaming monitor ?
> 
> 1) 4K IPS + HDR + FALD .
> 
> 2) 4K VA + HDR + FALD .
> 
> Is there a big difference between IPS and VA with FALD ?


The difference will be in the haloing effect, which will be worse on IPS. But how much worse? We're going to have to wait for reviews or our own impressions.


----------



## mmms

I hope the Asus PG27UQ and Acer XB272-HDR overcome all the known issues with IPS panels, thanks to FALD + IPS + HDR.

I don't want to see blacks turn gray again in dark scenes with these monitors.


----------



## Nammi

Quote:


> Originally Posted by *boredgunner*
> 
> It's going to be 8-bit color using DSC compression.


So to get 10-bit we're going to have to hope that this screen comes with HDMI 2.1?


----------



## boredgunner

Quote:


> Originally Posted by *Nammi*
> 
> So to get 10-bit we're going to have to hope that this screen comes with HDMI 2.1?


Can HDMI 2.1 do 4k 144 Hz with 10-bit color? I'd be shocked if it has more bandwidth than DP1.4.


----------



## CallsignVega

Surprisingly, HDMI 2.1 has _significantly_ more bandwidth than Displayport 1.4.

48 Gbps versus 32.4 Gbps.
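For context, a back-of-the-envelope check (a sketch using the commonly cited raw link rates and line-encoding overheads from the public specs, and ignoring blanking intervals, which add a few percent) shows why 4K 144 Hz doesn't fit DP 1.4 uncompressed, hence the DSC talk, while 10-bit would fit HDMI 2.1:

```python
# Rough bandwidth check for 4K @ 144 Hz. Active pixels only; all figures in Gbit/s.
def required_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed video data rate for an RGB signal."""
    return width * height * hz * bits_per_channel * channels / 1e9

# Usable payload after line encoding:
# DP 1.4 HBR3: 32.4 Gbps raw, 8b/10b encoding  -> ~25.9 Gbps usable.
# HDMI 2.1:    48 Gbps raw,  16b/18b encoding  -> ~42.7 Gbps usable.
dp14_usable = 32.4 * 8 / 10
hdmi21_usable = 48 * 16 / 18

need_8bit = required_gbps(3840, 2160, 144, 8)    # ~28.7 Gbps
need_10bit = required_gbps(3840, 2160, 144, 10)  # ~35.8 Gbps

print(f"DP 1.4 usable:  {dp14_usable:.1f} Gbps")
print(f"HDMI 2.1 usable: {hdmi21_usable:.1f} Gbps")
print(f"4K144 needs {need_8bit:.1f} Gbps at 8-bit, {need_10bit:.1f} Gbps at 10-bit")
print("10-bit fits DP 1.4 uncompressed? ", need_10bit <= dp14_usable)    # False
print("10-bit fits HDMI 2.1 uncompressed?", need_10bit <= hdmi21_usable)  # True
```

Note that even 8-bit RGB at 4K144 (~28.7 Gbps) exceeds DP 1.4's usable rate, so some form of compression or subsampling is unavoidable on this link either way.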


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Surprisingly, HDMI 2.1 has _significantly_ more bandwidth than Displayport 1.4.
> 
> 48 Gbps versus 32.4 Gbps.


Wow. It's probably too late for the PG27UQ to have HDMI 2.1 though, and we don't have GPUs that support it.


----------



## Benny89

If only it were bigger, like 32-40"... Who would want 27" 4K? 27" 1440p 144 Hz is already too small for me.

Ech... I hope they release bigger 4K 144 Hz monitors in 2018.


----------



## aberrero

I've been using a 55" OLED, which is basically a 2x2 27" Eyefinity setup. It will definitely take some getting used to to go back to such a small display size.

I would have much preferred a 38" 1600p ultrawide with this quality level. It's much easier to drive at 144hz too.


----------



## Benny89

Quote:


> Originally Posted by *aberrero*
> 
> I've been using a 55" OLED, which is basically a 2x2 27" Eyefinity setup. It will definitely take some getting used to to go back to such a small display size.
> 
> I would have much preferred a 38" 1600p ultrawide with this quality level. It's much easier to drive at 144hz too.


How is your experience with the 55" OLED? I'm talking about single-player games, as I know TVs aren't for competitive PvP gameplay.

Can you please share? I'm thinking about buying a 55" OLED, since not a single upcoming monitor is worth it for me. All too small... I would buy a 4K 144 Hz monitor in a heartbeat, but only at 32" plus (preferably 40").


----------



## l88bastar

Quote:


> Originally Posted by *Benny89*
> 
> How is your experience with 55 OLED? I am talking about playing single player games as I know TVs are not for any competetive PvP gameplay.
> 
> Can you please share. I am thinking about buying 55 OLED since there is not even one worth (for me) monitor coming. All too small... 4K 144Hz I would buy in heartbeat but only 32' plus (preferable 40').


I have the 55C6 OLED... it's amazing if you can tolerate 60 Hz. It's even good for FPS games if you don't run vsync, but then you've got to deal with the tearing.









But the OLED will ruin everything else for you... after OLED, everything else is... well, it's everything else.


----------



## OwnedINC

Quote:


> Originally Posted by *Benny89*
> 
> Who would want 27' 4K. 27' 2K 144Hz


This guy, this guy right here.


----------



## t1337dude

Quote:


> Originally Posted by *l88bastar*
> 
> I have the 55C6 OLED.....its amazing if you can tolerate 60hz. Its even good for FPS if you don't run vsync, but then you gotta deal with the tears
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But the OLED will ruin everything else for you.....after OLED, everything else is.....well its everything else.


I just put my order in for the 65C7









Never used a 4k display before. Super excited for HDR.

To relate it back to this topic, I was considering a new display, but figured the OLED was a much better use of the money. I'll use it for everything (gaming included). This year's model has only 21 ms input lag, and might even do 1080p 120 Hz (for even less lag).


----------



## Benny89

Quote:


> Originally Posted by *t1337dude*
> 
> I just put my order in for the 65C7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Never used a 4k display before. Super excited for HDR.
> 
> To relate it back to this topic, I was considering a new display, but figured the OLED was a much better use of money. Will use it for everything (gaming included). This year only has 21ms input lag and might even have 1080p 120Hz (giving even less lag).


Yeah, I sort of agree. Seeing small monitors like these hit the $2k price point, I also think it's better to add a little more and get an LG 7-series OLED. OLED is the future. LCD isn't just old, overpriced tech; prices are getting ridiculous enough (27" for $2k, seriously?) to justify going OLED. I also have a PS4 Pro, and an Xbox One S is ordered and on its way, so I will have one screen to rule them all.


----------



## CallsignVega

Quote:


> Originally Posted by *t1337dude*
> 
> I just put my order in for the 65C7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Never used a 4k display before. Super excited for HDR.
> 
> To relate it back to this topic, I was considering a new display, but figured the OLED was a much better use of money. Will use it for everything (gaming included). This year only has 21ms input lag and might even have 1080p 120Hz (giving even less lag).


You are going to love it. OLED image quality for gaming is quite jaw dropping.


----------



## Benny89

Quote:


> Originally Posted by *CallsignVega*
> 
> You are going to love it. OLED image quality for gaming is quite jaw dropping.


I just read that HDMI 2.1 will probably be implemented in 2018. That makes me think it might be better to wait until 2018 to grab an OLED with HDMI 2.1 and be able to go 4K at 120 fps.

Also, HDMI 2.1 will implement a VRR game mode, which will basically be G-Sync/FreeSync for TVs.

What do you guys think? It's "only" a year; not that long in the tech world.


----------



## dboythagr8

Quote:


> Originally Posted by *guttheslayer*
> 
> Minimum a pair of 1080 ti for sure.
> 
> If single gpu u need to wait for big titan volta


I don't understand why folks keep saying this.

You're not going to need a "minimum" of two Tis for this monitor. Just because it's 144 Hz doesn't mean everything has to run at 144 fps. There will be a G-Sync sweet spot for fps, just as there is for any other G-Sync monitor. People can adjust settings the same way they do on current 1440p/144 Hz monitors.

I just don't want people reading things like this and thinking "well, i'm on a 1070, guess I can't buy and enjoy this monitor", because that's not true.


----------



## boredgunner

Quote:


> Originally Posted by *dboythagr8*
> 
> I don't understand why folks keep saying this?
> 
> You're not going to need a "minimum" of two Ti's for this monitor. Just because it's 144hz doesn't necessarily mean everything has to run at 144fps. There will be a gsync sweet spot for fps just as there is for any other gsync monitor. People can adjust settings the same way they do on current 1440p/144hz monitors.
> 
> I just don't want people reading things like this and thinking "well, i'm on a 1070, guess I can't buy and enjoy this monitor", because that's not true.


Not to mention, just with my one GTX 1080 I can run many of the best games of various genres at 4k and 120+ FPS. I don't limit myself to only playing the most recent games, which are usually disappointing and lackluster at best.


----------



## dboythagr8

Quote:


> Originally Posted by *boredgunner*
> 
> Not to mention, just with my one GTX 1080 I can run many of the best games of various genres at 4k and 120+ FPS. I don't limit myself to only playing the most recent games, which are usually disappointing and lackluster at best.


Exactly.

I see it mentioned all the time and it drives me crazy.


----------



## Millillion

What's with all the people saying it needs to be bigger? I barely have room for a 27" and as long as I can deal with/eliminate scaling issues, I'd gladly take a monitor like this once they're not so expensive. And maybe even at a smaller size.


----------



## Benny89

Quote:


> Originally Posted by *Millillion*
> 
> What's with all the people saying it needs to be bigger? I barely have room for a 27" and as long as I can deal with/eliminate scaling issues, I'd gladly take a monitor like this once they're not so expensive. And maybe even at a smaller size.


Ever heard of "preferences"? Some people love playing on big displays; it adds immersion, sucks you into the game world much better, and is amazing for any story-driven game. I would have no problem gaming on a 45-55 inch monitor. Hell, 32" is the absolute minimum I would upgrade to now. I'd have no problem sitting 1 m away from anything up to 55 inches.

Preferences. I prefer maximum immersion; I don't care about "efficiency", FOV, etc. I love big screens.


----------



## boredgunner

Quote:


> Originally Posted by *Benny89*
> 
> Ever heard about "preferences?". Some people love to play on big displays, its adds immersion, it sucks you into the game world much better, it's amazing for any story-telling game. I would have no problems gaming on 45-55 inch gaming monitor. Hell 32" is absolutely minimum I would upgrde to now. I would have no problems sitting 1m away from anything up to 55 inch.
> 
> Preferences. I prefer maximum immersion, don't care about "efficiency", FOV etc. I love big screens.


The relationship between FOV and monitor size matters for immersion for most people. As long as I have the room space, I can make any sized screen work for me. I don't have the room space, though, so ~40" is my max.


----------



## Millillion

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Millillion*
> 
> What's with all the people saying it needs to be bigger? I barely have room for a 27" and as long as I can deal with/eliminate scaling issues, I'd gladly take a monitor like this once they're not so expensive. And maybe even at a smaller size.
> 
> 
> 
> Ever heard about "preferences?". Some people love to play on big displays, its adds immersion, it sucks you into the game world much better, it's amazing for any story-telling game. I would have no problems gaming on 45-55 inch gaming monitor. Hell 32" is absolutely minimum I would upgrde to now. I would have no problems sitting 1m away from anything up to 55 inch.
> 
> Preferences. I prefer maximum immersion, don't care about "efficiency", FOV etc. I love big screens.

I understand preferences, but there seem to be a lot more people wanting massive screens than I expected. My perception is probably a bit skewed, though, since I sit closer to my monitor when gaming than other people do. ~1 meter is my "leaning back, casually browsing the internet" distance.


----------



## CallsignVega

Quote:


> Originally Posted by *Benny89*
> 
> I just read that HDMI 2.1 will probably implemented in 2018. That makes me think it might be better to wait till 2018 to grab OLED with HDMI 2.1 and be able to go 4K 120 fps.
> 
> Also HDMI 2.1 will implement VRR Game Mode, which will basicelly be G-Sync/Free-Sync for TVs.
> 
> What do you guys think? Its "only" an year. Not that long in tech world.


I'd be extremely surprised if HDMI 2.1 at the full 48 Gbps makes it into 2018 TV sets. I'm thinking 2019 at the earliest. Producing new transmission controller chips is one of the slowest things in the technology sector.


----------



## dVeLoPe

When does this come out?

I just bought an Asus ROG PG248Q, but I have 15 days to return it and put the money towards this one.


----------



## Baasha

Quote:


> Originally Posted by *dVeLoPe*
> 
> when does this come out?
> 
> i just bought an Asus ROG PG248Q but have 15 days to return it and save my $$ towards this one


This is the main issue with Asus - they announce stuff and it takes a LONG time for it to hit the market. Remember the Asus RoG Swift PG278Q release?









As I've said before, I'd be very surprised if this monitor hits the stores before September this year (at the earliest). Having said that, this monitor seems to hit all the right spots for a 'gaming' panel despite being only 4K.

It would be interesting to see if we can hit 144fps @ 4K with the current hardware as Volta is not set to be released until Q4.


----------



## t1337dude

Quote:


> Originally Posted by *Benny89*
> 
> I just read that HDMI 2.1 will probably implemented in 2018. That makes me think it might be better to wait till 2018 to grab OLED with HDMI 2.1 and be able to go 4K 120 fps.
> 
> Also HDMI 2.1 will implement VRR Game Mode, which will basicelly be G-Sync/Free-Sync for TVs.
> 
> What do you guys think? Its "only" an year. Not that long in tech world.


Not a bad idea at all. I was thinking about it myself, but I have 2 caveats.

1) I've had my eye on OLED displays for a long time. I feel silly romanticizing a display technology, but possessing an OLED screen has been a "holy grail" of sorts, and since I have the funds now, I'd rather not wait much longer. If HDMI 2.1 is all it's hyped up to be, I'll sell my display and move on up.

2) I've read a lot of speculation that stuff might not be finalized for HDMI 2.1 in time for next year's displays.

Waiting on a PC monitor doesn't seem like a bad idea either. The PG27UQ looks fantastic but it's very pricey for what it offers. Hopefully in a year or two we can get something similar for a much more competitive price.


----------



## Benny89

Sorry, double post.


----------



## Benny89

Quote:


> Originally Posted by *t1337dude*
> 
> Not a bad idea at all. I was thinking about it myself, but I have 2 caveats.
> 
> 1) I've had my eye on OLED displays for a long time. I feel silly romanticizing a display technology, but posessing an OLED screen has been a "holy grail" of sorts and since I have the funds now, I'd rather not wait too much longer. If HDMI 2.1 is all it's hyped up to be, I'll sell my display and move on up.
> 
> 2) I've read a lot of speculation that stuff might not be finalized for HDMI 2.1 in time for next year's displays.
> 
> Waiting on a PC monitor doesn't seem like a bad idea either. The PG27UQ looks fantastic but it's very pricey for what it offers. Hopefully in a year or two we can get something similar for a much more competitive price.


You have valid points. I just feel so limited right now in terms of big screens for gaming. I can either go for the XB321HK and get 4K G-Sync on a 32" screen, but at only 60 Hz, while waiting for a worthwhile upgrade, or go straight for a 55-inch HDMI 2.0 OLED, but then live without G-Sync and risk having to upgrade to an HDMI 2.1 OLED in 2018. I wanted to buy an OLED to be future-proof for at least 2-3 years. But HDMI 2.1 is just around the corner...

Being a big-screen freak is quite troublesome...


----------



## CallsignVega

My best guess is that the 2019 LG OLEDs will have HDMI 2.1 and HFR:

http://www.flatpanelshd.com/news.php?subaction=showfull&id=1473185035

OLED at 120+ FPS is going to be epic.

The 2018 OLEDs are 11 months away; it ain't going to happen, IMO.


----------



## Benny89

Quote:


> Originally Posted by *CallsignVega*
> 
> My best guess is that the 2019 LG OLED's will have HDMI 2.1 and HFR:
> 
> http://www.flatpanelshd.com/news.php?subaction=showfull&id=1473185035
> 
> OLED at 120+ FPS is going to be epic.
> 
> 2018 OLED's are 11 months away, ain't going to happen IMO.


Ultimate gaming display for anything else than competitive games. 120 fps, 4-8k, OLED, VRR Game Mode, Dynamic HDR... and at least 55 inch screen. God... I need tissue....


----------



## mmms

If you had the chance to choose between two monitors, which would you prefer for the best black levels and contrast?

1) IPS + FALD + 384 dimming zones.

2) VA + FALD + 128 dimming zones.


----------



## aberrero

Quote:


> Originally Posted by *CallsignVega*
> 
> You are going to love it. OLED image quality for gaming is quite jaw dropping.


I'm watching Planet Earth II in 4K on mine right now and it is beyond incredible. I don't expect we will have a meaningful improvement in IQ for another 5 years at least.


----------



## pez

Quote:


> Originally Posted by *dboythagr8*
> 
> I don't understand why folks keep saying this?
> 
> You're not going to need a "minimum" of two Ti's for this monitor. Just because it's 144hz doesn't necessarily mean everything has to run at 144fps. There will be a gsync sweet spot for fps just as there is for any other gsync monitor. People can adjust settings the same way they do on current 1440p/144hz monitors.
> 
> I just don't want people reading things like this and thinking "well, i'm on a 1070, guess I can't buy and enjoy this monitor", because that's not true.


Quote:


> Originally Posted by *boredgunner*
> 
> Not to mention, just with my one GTX 1080 I can run many of the best games of various genres at 4k and 120+ FPS. I don't limit myself to only playing the most recent games, which are usually disappointing and lackluster at best.


Quote:


> Originally Posted by *dboythagr8*
> 
> Exactly.
> 
> I see it mentioned all the time and it drives me crazy.


It's a statement that needs to be said with a big--scratch that--HUGE asterisk beside it. I can super-sample old games to 4x my resolution on a single TXP and still maintain 100+ FPS, but I can't take BF1, DSR it to 2x, and expect the same (not at full tilt, anyway).


----------



## Benny89

I wonder what is happening with QLED technology... I heard last year that QLED would be better than OLED because it's cheaper and could get to monitors before OLED does.

I haven't heard much about QLED since then...


----------



## dboythagr8

Quote:


> Originally Posted by *pez*
> 
> It's a statement that needs to be said with a big--scratch that--HUGE asterisk besides it. Because I can super-sample old games to 4x my resolution on a single TXP and still maintain 100+ FPS, but I can't take BF1 and DSR it to 2x and expect the same (not at full tilt, anyways).


My main point is that you don't need two 1080 Tis to enjoy HDR, for example. You don't need two 1080 Tis to play at 4K. Just because the monitor has a 144 Hz refresh rate doesn't mean you can only game at 144 fps. If that's the route you want to take, that's fine, but blanket statements that read as if only the best of the best can pair with this monitor are misleading.


----------



## boredgunner

Quote:


> Originally Posted by *Benny89*
> 
> I wonder what is happening with QLED technology... I mean, I heard last year that QLED would be better than OLED because it's cheaper and could get to monitors before OLED.
> 
> Didn't hear much about QLED since then....


I wager the LED technology itself is still in development, and it's not close to being ready to be put in a display yet.


----------



## Benny89

Quote:


> Originally Posted by *boredgunner*
> 
> I wager the LED technology itself is still in development, and it's not close to being ready to be put in a display yet.


But how is QLED vs OLED?


----------



## boredgunner

Quote:


> Originally Posted by *Benny89*
> 
> But how is QLED vs OLED?


Too early to say. The goal of true QLED (not Samsung's quantum dot LED-LCD) is presumably greater color space/volume and brightness over OLED.


----------



## pez

Quote:


> Originally Posted by *dboythagr8*
> 
> My main point is you don't need two 1080Tis to enjoy HDR for example. You don't need two 1080Tis to play in 4k. Just because the monitor has a refresh rate of 144hz, doesn't mean you can only game at 144fps. If that's the route you want to take that's fine, but blanket statements that read as if only the best of the best can pair with this monitor are misleading.


Yeah, I get that. I hate seeing the responses of either side when there's no background included. I've seen the statement, 'I can run 4k60FPS in newer games, so I'm happy', in the GTX 1070, 1080 and TXP threads since their respective launches. It's a slippery slope and a bold claim to make sometimes when people don't state what game, settings, etc. they're playing at.

As for 144hz, I don't think anyone should expect to try and run everything at 144hz. Even Tis will struggle to run everything at that framerate.


----------



## CallsignVega

TFTCentral seems to have received a panel production date of July. Usually it takes ~3-4 months or so from panel production start to seeing monitors in stores. Looks like a Q4 release.

http://www.tftcentral.co.uk/news_archive/37.htm#auo_samsung_roadmaps


----------



## Mini0510

Quote:


> Originally Posted by *CallsignVega*
> 
> TFTCentral seems to have received a panel production date of July. Usually it takes ~3-4 months or so from panel production start to seeing monitors in stores. Looks like a Q4 release.
> 
> http://www.tftcentral.co.uk/news_archive/37.htm#auo_samsung_roadmaps


And more like Q1 2018 availability, right? If there are no production issues.


----------



## CallsignVega

Quote:


> Originally Posted by *Mini0510*
> 
> And more like Q1 2018 availability, right? If there are no production issues.


It all depends on if they have any production difficulties, their quality control process and how many they will stock in inventory before selling to OEM's. If it goes well, I'd say Q4. If not, Q1 or even Q2 2018.

A new LCD panel, with a new FALD back-light, with a brand new DP 1.4 G-Sync chip-set. Bound to run into difficulties.
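That DP 1.4 link is itself a tight squeeze. A rough back-of-envelope sketch (my own illustration, assuming DP 1.4 HBR3's ~25.92 Gbit/s effective payload after 8b/10b coding and ignoring blanking overhead, which only makes the real requirement higher) shows why:

```python
# Back-of-envelope check: can a DP 1.4 link carry 4K @ 144 Hz uncompressed?
payload_gbps = 25.92  # DP 1.4 HBR3 effective payload (32.4 Gbit/s raw, 8b/10b coded)

pixels_per_sec = 3840 * 2160 * 144  # 4K UHD at 144 Hz

for name, bits_per_pixel in [("8-bit RGB", 24), ("10-bit RGB", 30)]:
    required_gbps = pixels_per_sec * bits_per_pixel / 1e9
    fits = "fits" if required_gbps <= payload_gbps else "does NOT fit"
    print(f"{name}: {required_gbps:.1f} Gbit/s -> {fits}")
# 8-bit RGB: 28.7 Gbit/s -> does NOT fit
# 10-bit RGB: 35.8 Gbit/s -> does NOT fit
```

So full 4K 144 Hz over a single DP 1.4 cable implies some compromise (chroma subsampling or a lower refresh ceiling); the figures here are illustrative, not from ASUS.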


----------



## Benny89

Quote:


> Originally Posted by *CallsignVega*
> 
> their quality control process


I would not count on that. QC will be non-existent, as always....


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> TFTCentral seems to have received a panel production date of July. Usually it takes ~3-4 months or so from panel production start to seeing monitors in stores. *Looks like a Q4 release*.
> 
> http://www.tftcentral.co.uk/news_archive/37.htm#auo_samsung_roadmaps


I want a cookie.

I called it the day they announced this monitor at CES.


----------



## Silent Scone

Should arrive around midyear.


----------



## ahmedmo1

Can ASUS quit playing here? You showed me a 32" monitor with the size I wanted but one that lacked G-Sync or a high refresh rate and then you give me this 27" little thing with all those features that were missing in the larger unit. Get back to me when you've combined the two.

Or don't... when it finally comes out, HDR content will be more available, there will be more clarity around the standard, and these monitors won't be as expensive. I have a feeling I'll be waiting till this time 2018 at the very least.


----------



## boredgunner

I hate how blur reduction has become an afterthought. We don't know whether or not the PG27UQ will have it. How about a new and improved version to go with the new G-SYNC module, adopting Samsung's strobing method and offering more refresh rate options, including 144 Hz?


----------



## KGPrime

Yeah, I think I said that a long time ago. It would be a shame not to improve ULMB along with this monitor. Row-by-row scanning would be the best way to do it, and with FALD it would be perfect for that.


----------



## sblantipodi

Quote:


> Originally Posted by *CallsignVega*
> 
> My best guess is that the 2019 LG OLED's will have HDMI 2.1 and HFR:
> 
> http://www.flatpanelshd.com/news.php?subaction=showfull&id=1473185035
> 
> OLED at 120+ FPS is going to be epic.
> 
> 2018 OLED's are 11 months away, ain't going to happen IMO.


Why do you like OLED?
OLED monitors have so many problems, from a short lifespan to permanent ghosting.
Why care about this tech?


----------



## CallsignVega

Quote:


> Originally Posted by *sblantipodi*
> 
> Why do you like OLED?
> OLED monitors have so many problems, from a short lifespan to permanent ghosting.
> Why care about this tech?


Is this an April fools post?


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Is this an April fools post?


I don't think it is.


----------



## sblantipodi

Quote:


> Originally Posted by *CallsignVega*
> 
> Is this an April fools post?


No, it's an informed one.


----------



## Blaze051806

Once I can get one of these for $300-$400 I'll upgrade to 4K; until then I'ma stay in 1080p land XD


----------



## Millillion

Quote:


> Originally Posted by *sblantipodi*
> 
> Why do you like OLED?
> OLED monitors have so many problems, from a short lifespan to permanent ghosting.
> Why care about this tech?


I care because of its potential. There have been plenty of advancements to other types of screens, so I see no specific reason that any extant issues with OLED cannot be mostly ironed out. OLED is already so beautiful, so I want to see it get better and become more widely available.


----------



## boredgunner

Quote:


> Originally Posted by *Millillion*
> 
> I care because of its potential. There have been plenty of advancements to other types of screens, so I see no specific reason that any extant issues with OLED cannot be mostly ironed out. OLED is already so beautiful, so I want to see it get better and become more widely available.


They already have been ironed out significantly, evident in LG's 2016 models. That guy is going off of very old information. The lifespan is not even very short relative to LCD, and by permanent ghosting he probably means burn-in which I have never seen reported on 2016 LG OLED TVs (and I've only seen temporary retention reported for the B6, not E6 or C6 even when used as PC monitors).


----------



## sblantipodi

Quote:


> Originally Posted by *boredgunner*
> 
> They already have been ironed out significantly, evident in LG's 2016 models. That guy is going off of very old information. The lifespan is not even very short relative to LCD, and by permanent ghosting he probably means burn-in which I have never seen reported on 2016 LG OLED TVs (and I've only seen temporary retention reported for the B6, not E6 or C6 even when used as PC monitors).


How could a TV get burn-in?
Burn-in happens on frequently shown static images, like the ones seen on every OLED phone.


----------



## Vipu

Quote:


> Originally Posted by *sblantipodi*
> 
> How could a TV get burn-in?
> Burn-in happens on frequently shown static images, like the ones seen on every OLED phone.


How not?
There are static images in TV shows, games, etc.


----------



## sblantipodi

Quote:


> Originally Posted by *Vipu*
> 
> How not?
> There are static images in TV shows, games, etc.


But not as frequently as the Windows taskbar.


----------



## boredgunner

Quote:


> Originally Posted by *sblantipodi*
> 
> How could a TV get burn-in?
> Burn-in happens on frequently shown static images, like the ones seen on every OLED phone.


Emphasis on the "used as PC monitors" part. OLED phones have little to no retention countermeasures, but those in LG's OLED TVs have improved tremendously.


----------



## sblantipodi

Quote:


> Originally Posted by *boredgunner*
> 
> Emphasis on the "used as PC monitors" part. OLED phones have little to no retention countermeasures, but those in LG's OLED TVs have improved tremendously.


Improved does not mean fixed.


----------



## boredgunner

Quote:


> Originally Posted by *sblantipodi*
> 
> Improved does not mean fixed.


It may be fixed, for all we know. Clearly neither of us has owned a 2016 LG OLED TV, but I have yet to see one report of even temporary retention on the LG C6 or E6 when used as PC monitors, even on sites like AVS Forum. A friend of mine owns the C6 and uses it primarily as a monitor: no retention at all.


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> It may be fixed, for all we know. Neither of us have owned a 2016 LG OLED TV clearly, but I have yet to see one report of even temporary retention on the LG C6 or E6 when used as PC monitors, even on sites like avs forum. A friend of mine owns the C6 and uses it primarily as a monitor, no retention at all.


I got two 2016 C6s.... no burn in and trust me they have seen plenty of taskbars and PRON


----------



## ahmedmo1

Quote:


> Originally Posted by *Blaze051806*
> 
> Once I can get one of these for $300-$400 I'll upgrade to 4K; until then I'ma stay in 1080p land XD


You're using a 1080 with a 55" 1080p TV?


----------



## Blaze051806

Quote:


> Originally Posted by *ahmedmo1*
> 
> You're using a 1080 with a 55" 1080p TV?


My rig is insane overkill. My 1080 is water cooled and OC'd to 2GHz lol, and my 7700K is under water at 5GHz. And yes, I play on a 55" and a 42" in a diff room at 1080p lol. I mostly play SC2 and Stellaris hahahah


----------



## ahmedmo1

Quote:


> Originally Posted by *Blaze051806*
> 
> My rig is insane overkill. My 1080 is water cooled and OC'd to 2GHz lol, and my 7700K is under water at 5GHz. And yes, I play on a 55" and a 42" in a diff room at 1080p lol. I mostly play SC2 and Stellaris hahahah


You could have saved yourself ~$300, waited a bit, and gotten a 27" 4K monitor @60Hz or a 1440p @120Hz.

A 1080 @ 1080p provides the exact same experience as a 1070 or a 1060 6gb for that matter.

In other words, if someone had swapped out your 1080 for a 1070 and you didn't have an FPS counter, you wouldn't be able to tell @ that resolution, as I'm guessing it's locked to 60Hz.

That's not even taking into account response time, input lag, color accuracy, etc.

I... just don't get it.


----------



## Blaze051806

Quote:


> Originally Posted by *ahmedmo1*
> 
> You could have saved yourself ~$300, waited a bit, and gotten a 27" 4K monitor @60Hz or a 1440p @120Hz.
> 
> A 1080 @ 1080p provides the exact same experience as a 1070 or a 1060 6gb for that matter.
> 
> In other words, if someone had swapped out your 1080 for a 1070 and you didn't have an FPS counter, you wouldn't be able to tell @ that resolution, as I'm guessing it's locked to 60Hz.
> 
> That's not even taking into account response time, input lag, color accuracy, etc.
> 
> I... just don't get it.


My TVs are 2 and 3 years old. My rig is 7 months old lol

I didn't have a gaming PC for a long time; the last one I had was an Athlon 3-core and a GTS 250.

But yah, I built this rig to be future proof. I had my last PC for quite a long time, and I'd like to get 3+ years out of my hardware.

I like the HDR and 4K tech; it will be my next upgrade, but like I said I won't spend over $400-500 for one. So when I can get a 32-inch in that range I'll upgrade, and my 1080 will most likely still suit me then. But understand: I had a 480 and sold it for $150, and got this 1080 AIO at Microcenter for $420 open-boxed. So I paid $270 for the upgrade; in my mind I got a good deal.


----------



## andre02

Have you guys seen this video from PC Centric? It has a bunch of new products from Asus, including the PG27UQ: a couple of things we already know, and maybe some we don't, including the price confirmation at 2000 $/EUR/GBP.


----------



## bee144

Quote:


> Originally Posted by *andre02*
> 
> 
> Have you guys seen this video from PC Centric? It has a bunch of new products from Asus, including the PG27UQ: a couple of things we already know, and maybe some we don't, including the price confirmation at 2000 $/EUR/GBP.


LOL! $2,000 USD? ASUS has lost their minds. I can buy a 2017 LG 55" OLED for that price. Sure it wouldn't have Gsync but I can't imagine they'll move many units at that price. $1,200 USD was the max that I was willing to spend. Bummer.

ASUS and NVIDIA are putting a lot of R&D budget and time into a monitor that's going to be DOA due to its price.


----------



## Wishmaker

2000 euros? That made my day!


----------



## mmms

Quote:


> Originally Posted by *bee144*
> 
> LOL! $2,000 USD? ASUS has lost their minds. I can buy a 2017 LG 55" OLED for that price. Sure it wouldn't have Gsync but I can't imagine they'll move many units at that price. $1,200 USD was the max that I was willing to spend. Bummer.
> 
> ASUS and NVIDIA are putting a lot of R&D budget and time into a monitor that's going to be DOA due to its price.


I agree with you.
In my opinion, anyone who pays $2000 for this monitor is indeed crazy.
I'll pay this price for a gaming monitor without hesitation when I see (27/32'' - 4K - OLED - HDR10 - 144Hz and G-Sync).


----------



## Astreon

I'm completely not surprised. Every one of those techs brings hype. Combining HDR FALD GSYNC 144HZ 4K ULMB FBI USA USB USSR IOS DNA OMG FFS and other shortcuts together is BOUND to cost a fortune, because they are targeting rich people who want it ALL, not gamers who want a reasonable screen for reasonable money.

Get something reasonable and forget this gaming junk for 2k $.

I'd love to buy an OLED 25-27 inch 80-100Hz screen; I don't need the rest of the junk tech for 9001$, but nope, can't get it.


----------



## boredgunner

I agree that the monitor will be a failure at $2000. Enthusiast PC gamers spend far less on displays than those with high end home theater setups, so this monitor at $2000 will just prove too expensive I believe. But who knows, enthusiast PC gamers did go from spending $400-450 to $800 on a monitor rather quickly.

Acer usually price matches ASUS, so I'm guessing the XB272-HDR will have the same price. AOC's 32" variant should release half a year later, their prices are usually lower so we'll see what they charge...


----------



## Millillion

Especially seeing the price, this strikes me as a "look what we can do" and "we're the best" type product. Maybe not exactly meant to be a big commercial success directly.


----------



## l88bastar

I predict they sell out within 5 minutes after launch.

Yall underestimate the top 20% of the enthusiast market.


----------



## CallsignVega

Quote:


> Originally Posted by *andre02*
> 
> 
> Have you guys seen this video from PC Centric? It has a bunch of new products from Asus, including the PG27UQ: a couple of things we already know, and maybe some we don't, including the price confirmation at 2000 $/EUR/GBP.


Interesting, the Asus guy quoted a static contrast ratio of 20,000:1. Love me some FALD.
Quote:


> Originally Posted by *l88bastar*
> 
> I predict they sell out within 5 minutes after launch.
> 
> Yall underestimate the top 20% of the enthusiast market.


Yup, I will be F5'ing this bad boy. I'd even be buying it if it cost more (don't tell Asus that). 144Hz Quantum Dot 4K 1000-nit HDR FALD G-Sync for less than the cost of one of my airline tickets to Australia? A steal for a gaming display that obliterates all others. But then again, I was also going to drop $5K on the Dell 120Hz 4K OLED if it were to release.


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> Interesting, the Asus guy quoted a static contrast ratio of 20,000:1. Love me some FALD.
> Yup, I will be F5'ing this bad boy. I'd even be buying it if it cost more (don't tell Asus that). 144Hz Quantum Dot 4K 1000-nit HDR-FALD, G_Sync for less than the cost of one of my airline tickets to Australia? A steal for a gaming display that obliterates all others. But then again I was also going to drop $5K on the Dell 120 Hz 4K OLED if it were to release.


If you think about it a bit, you'll find that paying $1k more to switch between LCD panels isn't worth it for a gaming monitor.

If you're really ready to pay $2000 for a gaming monitor, it would be better to buy an OLED for the best colors and black levels.
Otherwise it is a waste of money for a 27'' gaming monitor.

FALD is a great feature, but OLED is the future. I can stay with the upcoming Samsung CHG75 for $1k with pleasure until we see (27/32'' - 4K - OLED - HDR10 - 144Hz and G-Sync).


----------



## Astreon

geee, I wish I was so rich that I could F5 $2,000 monitors, but instead I have to work 3 months to earn that much money. Fun, eh?

edit: 3, not 6. Mis-calculated the USD-to-PLN ratio, lol.


----------



## QSS-5

The monitor is targeted at owners with two Titan Xs; they must know how many SLI owners there are from Nvidia's sales. That segment has already paid $2400 for GPU power, and it is the only segment that can effectively run games at 4K@144Hz; there are no mistakes in their pricing strategy. The new panel from AUO is also probably produced in low volume and at relatively high cost. The monitor will fly off the shelves; any lower price would just mean a lost business opportunity for Asus. That is the cost of the latest tech. The price will eventually fall by next year when next-gen GPUs are out, and expect Korean monitors for half the price too.

Note that they might not hit the 1080 Ti SLI segment with this price point, but it won't affect Asus sales. Overall, Asus knows their segment, and if you can't or don't want to pay $2000 for the best-spec gaming monitor, well, don't worry, they weren't targeting you in the first place. Once this monitor hits the top YouTube tech reviews it will be out of stock.


----------



## CallsignVega

Quote:


> Originally Posted by *mmms*
> 
> If you think about it a bit, you'll find that paying $1k more to switch between LCD panels isn't worth it for a gaming monitor.
> 
> If you're really ready to pay $2000 for a gaming monitor, it would be better to buy an OLED for the best colors and black levels.
> Otherwise it is a waste of money for a 27'' gaming monitor.
> 
> FALD is a great feature, but OLED is the future. I can stay with the upcoming Samsung CHG75 for $1k with pleasure until we see (27/32'' - 4K - OLED - HDR10 - 144Hz and G-Sync).


I already have 4K OLED displays. 60 Hz really doesn't cut it for fast-paced games. Once they come out with 120 Hz 4K OLED, it is game over for LCD. Those may be expected in 1-2 years, so this Asus monitor we are discussing here may be my last LCD ever.

Quote:


> Originally Posted by *QSS-5*
> 
> The monitor is targeted at owners with two Titan Xs; they must know how many SLI owners there are from Nvidia's sales. That segment has already paid $2400 for GPU power, and it is the only segment that can effectively run games at 4K@144Hz; there are no mistakes in their pricing strategy. The new panel from AUO is also probably produced in low volume and at relatively high cost. The monitor will fly off the shelves; any lower price would just mean a lost business opportunity for Asus. That is the cost of the latest tech. The price will eventually fall by next year when next-gen GPUs are out, and expect Korean monitors for half the price too.
> 
> Note that they might not hit the 1080 Ti SLI segment with this price point, but it won't affect Asus sales. Overall, Asus knows their segment, and if you can't or don't want to pay $2000 for the best-spec gaming monitor, well, don't worry, they weren't targeting you in the first place. Once this monitor hits the top YouTube tech reviews it will be out of stock.


Yes, $2K isn't that outrageous considering the two GPUs needed just to run it properly cost $1600 (1080 Tis). Companies also do market research, and I bet you Asus didn't just pull $2K out of their behind.


----------



## Astreon

My only problem is that there's no middle ground: either it's that FALD HDR G-Sync stuff one (like me) may not want, or 60Hz of nothingness.

What happened to middle-ground monitors? What's wrong with 80-100Hz and no sync, no HDR, no FALD (but good QC)?


----------



## mmms

Quote:


> Originally Posted by *CallsignVega*
> 
> I already have 4K OLED displays. 60 Hz really doesn't cut it for fast paced games. Once they come out with 120 Hz 4K OLED, it is game over for LCD. Those may be expected in 1-2 years, so this Asus monitors we are discussing here may be my last LCD ever.


Surely, this Asus and the Acer are the best LCD gaming monitors we've seen.

I'll wait for an upcoming 120Hz/144Hz 4K OLED HDR10 G-Sync with lower input lag (1-4ms) until they come out at a reasonable price.


----------



## boredgunner

Quote:


> Originally Posted by *Astreon*
> 
> My only problem is that there's no middle ground: either it's that FALD HDR G-Sync stuff one (like me) may not want, or 60Hz of nothingness.
> 
> What happened to middle-ground monitors? What's wrong with 80-100Hz and no sync, no HDR, no FALD (but good QC)?


The Samsung 1440p 144 Hz VA monitors will be a middle ground of sorts, equipped with HDR and quantum dot technology. They had better not have locked brightness with strobing enabled; that is such a mind-boggling limitation of the CFG70. If they do have that, then I'm not buying them either.
Quote:


> Originally Posted by *l88bastar*
> 
> I predict they sell out within 5 minutes after launch.
> 
> Yall underestimate the top 20% of the enthusiast market.


All of those sales will be to CallsignVega.


----------



## Astreon

The only "new" screen that looks remotely interesting to me is the SHG50. It's bound to cost ridiculous amount of money because of the HDR stuff, though. HDR = pricemagnet.

however, Samsung is the worst panel manufacturer out there - even worse than AUO and miles behind LG. I can't stomach another 9001 panel lottery attempts.


----------



## Baasha

Wonder if they will time this display with the next gen GPUs since there is no way 2x 1080 Ti or 2x Titan XP can do 4K @ 144Hz.


----------



## Astreon

I'm pretty sure a 1080Ti would do fine with 4K 144Hz in Grim Dawn.

I however totally failed to notice the difference even between 60 and 144 in that game. Oh well.

But yeah, 144Hz in 4K seems a bit pointless at the moment. But you can always use the 80-100 FPS range in many games. Just because the display maxes out at 144 doesn't mean you have to aim for the maximum (which is technically impossible in games like Watch Dogs 2 even with Titan 3-way, IIRC...)

Other than that, I am still pissed about the little to no middle ground. There are no 100Hz displays (in 16:9, of course; there's plenty in UW, but that's another story) and 120Hz is EXTINCT. It's either 60, or 144 (or 240 for the TNs). There! But many non-FPS players who don't need/want 144 would still like 80-100Hz monitors. Big, fat NOPE, though: they won't get it, unless they OC something like a Dell U2515H (or the Korean monitors of course, but these days buying a monitor without the possibility to return it seems madness to me, as dead pixels are spread far and wide).


----------



## CallsignVega

Two 1080Ti's or Titan-XP's should be able to hit the magical 100 FPS mark or higher in 4K. Especially since you will hardly need any AA at that 163 PPI for gaming. Even hitting 100 FPS will be very sweet on this display.
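That 163 PPI figure checks out. A quick sketch (my own, assuming the stated 3840x2160 panel on a 27-inch diagonal):

```python
import math

# Pixel density of a 27-inch 3840x2160 panel.
width_px, height_px, diagonal_in = 3840, 2160, 27.0

diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
ppi = diagonal_px / diagonal_in

print(round(ppi))  # 163
```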

EDIT: Smoking crack.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Two 1080Ti's or Titan-XP's should be able to hit the magical 100 FPS mark or higher in 4K. Especially since you will hardly need any AA at that 163 PPI for gaming. Even hitting 100 FPS will be very sweet on this display.
> 
> Interesting fact, 4K at 144 FPS is even more demanding than 8K @ 60 FPS.


Yeah, 100 FPS + variable refresh rate is a fine experience. I hope this monitor at least has a reworked version of ULMB as discussed earlier, with an improved scanning method to avoid crosstalk like on Samsung's CFG70. With such a strobing implementation, many of you will be able to play games you like at 4K 100 FPS with strobing and FALD.

I would get to run many of my favorite games at 144 FPS, but I refuse to pay $2000 or anywhere near that for a monitor (or any single component).


----------



## bee144

Quote:


> Originally Posted by *QSS-5*
> 
> The monitor is targeted at owners with two Titan Xs; they must know how many SLI owners there are from Nvidia's sales. That segment has already paid $2400 for GPU power, and it is the only segment that can effectively run games at 4K@144Hz; there are no mistakes in their pricing strategy. The new panel from AUO is also probably produced in low volume and at relatively high cost. The monitor will fly off the shelves; any lower price would just mean a lost business opportunity for Asus. That is the cost of the latest tech. The price will eventually fall by next year when next-gen GPUs are out, and expect Korean monitors for half the price too.
> 
> Note that they might not hit the 1080 Ti SLI segment with this price point, but it won't affect Asus sales. Overall, Asus knows their segment, and if you can't or don't want to pay $2000 for the best-spec gaming monitor, well, don't worry, they weren't targeting you in the first place. Once this monitor hits the top YouTube tech reviews it will be out of stock.


I have two Titan Xp under a custom EK loop and even I'm put off by the price. I've got a large budget for things like these but $2k is just stupid. Price will be even higher after taxes.


----------



## Jbravo33

Quote:


> Originally Posted by *Baasha*
> 
> Wonder if they will time this display with the next gen GPUs since there is no way 2x 1080 Ti or 2x Titan XP can do 4K @ 144Hz.


These are stock 1080 Tis under water. I think it will handle this monitor just fine. Even if it's around 100 FPS I'm sold. Just went through a monitor testing frenzy and I'm not sold on anything that's out. I'll wait for this. 2 g's tho


----------



## l88bastar

$2,000 is DIRT CHEAP for what you are getting!

Back in 2006, $2,200 got you this crap!

11 years later and 10% cheaper we get this vastly superior display... and y'all got your panties in a bunch over its price????
I don't understand why there is sticker shock over the high-end items of ANY HOBBY. You gotta pay to PLAY.

https://postimg.org/image/xlyjxz0r9/


----------



## Jbravo33

Quote:


> Originally Posted by *l88bastar*
> 
> $2,000 is DIRT CHEAP for what you are getting!
> 
> Back in 2006, $2,200 got you this crap!
> 
> 11 years later and 10% cheaper we get this vastly superior display... and y'all got your panties in a bunch over its price????
> I don't understand why there is sticker shock over the high-end items of ANY HOBBY. You gotta pay to PLAY.
> 
> https://postimg.org/image/xlyjxz0r9/


Lol, you're right. I paid $6500 in 2001 for a Sony XBR KDF or whatever hell model rear-projection LCD, and now it's just a massive pile of junk. For some reason I haven't been able to let it go, for sentimental value and the fact that I paid that much for it. I tried giving it away recently to a family member; their reply: too big. Haha, thing's huge, but it's time for it to go.


----------



## boredgunner

Quote:


> Originally Posted by *l88bastar*
> 
> $2,000 is DIRT CHEAP for what you are getting!
> 
> Back in 2006, $2,200 got you this crap!
> 
> 11 years later and 10% cheaper we get this vastly superior display... and y'all got your panties in a bunch over its price????
> I don't understand why there is sticker shock over the high-end items of ANY HOBBY. You gotta pay to PLAY.
> 
> https://postimg.org/image/xlyjxz0r9/


Was that even a consumer monitor? I believe it was a professional display designed for color critical work, a market that consistently has $1500-2000+ monitors.

Still, I know flagship CRTs were expensive, like the Sony FW900. Most PC gamers only dreamed of them but a few bought them.


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> Was that even a consumer monitor? I believe it was a professional display designed for color critical work, a market that consistently has $1500-2000+ monitors.
> 
> Still, I know flagship CRTs were expensive, like the Sony FW900. Most PC gamers only dreamed of them but a few bought them.


No, that was a regular high-end home monitor of its day. I had three of them in portrait surround.

Also, the FW900 CRT was $2500 in 2002. That's like $3500 today.


----------



## Baasha

Oh goodie, so my 4-Way 1080 Ti should crush this monitor while the Titan XP 4 Way SLI will be getting a workout on the 8K rig.


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> Oh goodie, so my 4-Way 1080 Ti should crush this monitor while the Titan XP 4 Way SLI will be getting a workout on the 8K rig.


Woops, had a brain fart.

4K @ 144 Hz = 1.19 Billion pixels per second.

8K @ 60 Hz = 1.99 Billion pixels per second.

8K @ 60 Hz is 67% harder to run than 4K @ 144 Hz.
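The arithmetic above can be double-checked in a few lines:

```python
# Verify the pixel-throughput comparison above.
pixels_4k = 3840 * 2160  # pixels per 4K UHD frame
pixels_8k = 7680 * 4320  # pixels per 8K UHD frame

rate_4k_144 = pixels_4k * 144  # pixels per second at 4K @ 144 Hz
rate_8k_60 = pixels_8k * 60    # pixels per second at 8K @ 60 Hz

print(f"{rate_4k_144 / 1e9:.2f} billion px/s")       # 1.19
print(f"{rate_8k_60 / 1e9:.2f} billion px/s")        # 1.99
print(f"{rate_8k_60 / rate_4k_144 - 1:.0%} harder")  # 67% harder
```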


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> Was that even a consumer monitor? I believe it was a professional display designed for color critical work, a market that consistently has $1500-2000+ monitors.
> 
> Still, I know flagship CRTs were expensive, like the Sony FW900. Most PC gamers only dreamed of them but a few bought them.


Yes, but this is what I am trying to explain to y'all... the PC gaming consumer market has reached a maturity where the very top demands finer and better displays and is willing to pay more for them, just like a professional is willing to spend more on a higher-grade professional display.

PC gaming is no longer the denizen of "fat, pasty, virgin, pickle breath gaming nerds" and has been mainstream for quite some time... when you see "B" list actors promoting and playing the newest "8th" sequel of a "popular" game... ugghhh... well then you know that your little hobby has gone full delta and the Jersey Shore idiots with Lambos will be arriving any day now.

Simply put, if you cannot afford the hardware to run the display, then you cannot afford the display. Unless of course you are stubborn on principle, or unwilling to save or sacrifice because you see more value elsewhere.

I have an LG 55" OLED, a CF791 and a S2417DG all sitting on my desk right now... I would gladly trade them all for one PG27UQ... and "funnily" enough their resale value would put me close to the 2k price tag. I recognize that the PG27UQ will give me the best combination of all three of my displays and am willing to jump on board the luxury hype train... but to each their own.


----------



## pez

Quote:


> Originally Posted by *ahmedmo1*
> 
> You could have saved yourself ~$300, waited a bit, and gotten a 27" 4K monitor @60Hz or a 1440p @120Hz.
> 
> A 1080 @ 1080p provides the exact same experience as a 1070 or a 1060 6gb for that matter.
> 
> In other words, if someone had swapped out your 1080 for a 1070 and you didn't have an FPS counter, you wouldn't be able to tell @ that resolution, as I'm guessing it's locked to 60Hz.
> 
> That's not even taking into account response time, input lag, color accuracy, etc.
> 
> I... just don't get it.


Because the 1070 or 1060 is going to fall to its knees when he upgrades to 4K. The 1070 is great for 1440p, and even some high-refresh 1440p, but it's a very 'high-compromise' card for 4K.
Quote:


> Originally Posted by *boredgunner*
> 
> I agree that the monitor will be a failure at $2000. Enthusiast PC gamers spend far less on displays than those with high end home theater setups, so this monitor at $2000 will just prove too expensive I believe. But who knows, enthusiast PC gamers did go from spending $400-450 to $800 on a monitor rather quickly.
> 
> Acer usually price matches ASUS, so I'm guessing the XB272-HDR will have the same price. AOC's 32" variant should release half a year later, their prices are usually lower so we'll see what they charge...


The x34 at its price tag was a big deal for me. I didn't think twice about the Titan, but the display.... However, I keep displays for years at a time, and I can't say that I wouldn't be tempted by a 4K 21:9 at 144Hz or so in the future.


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> Woops, had a brain fart.
> 
> 4K @ 144 Hz = 1.19 Billion pixels per second.
> 
> 8K @ 60 Hz = 1.99 Billion pixels per second.
> 
> 8K @ 60 Hz is 67% harder to run than 4K @ 144 Hz.


Was about to say...
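The quoted arithmetic can be sanity-checked in a couple of lines of Python (raw pixel rates, ignoring blanking overhead; the names are mine):

```python
# Raw pixel throughput for the two modes quoted above.
def pixels_per_second(width, height, hz):
    return width * height * hz

uhd_144 = pixels_per_second(3840, 2160, 144)   # 4K @ 144 Hz
eightk_60 = pixels_per_second(7680, 4320, 60)  # 8K @ 60 Hz

print(f"4K144: {uhd_144:,} px/s")
print(f"8K60:  {eightk_60:,} px/s")
print(f"8K60 pushes {eightk_60 / uhd_144 - 1:.0%} more pixels per second")
```

This reproduces the ~1.19 vs ~1.99 billion pixels-per-second figures and the 67% gap.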


----------



## Vipu

Quote:


> Originally Posted by *Baasha*
> 
> Wonder if they will time this display with the next gen GPUs since there is no way 2x 1080 Ti or 2x Titan XP can do 4K @ 144Hz.


2x 1080 ti can easily do 4k 144hz...
Even 1 can if you strip settings down a bit more.


----------



## lever2stacks

Quote:


> Originally Posted by *Vipu*
> 
> 2x 1080 ti can easily do 4k 144hz...
> Even 1 can if you strip settings down a bit more.


2x 1080 ti will not do 144fps in 4k.


----------



## Vipu

Quote:


> Originally Posted by *lever2stacks*
> 
> 2x 1080 ti will not do 144fps in 4k.


OK, I can't promise 144, but 100+ is doable.

Then you can disable some FPS-hungry settings that don't visibly change anything, and you easily get at least 100.


----------



## Aristotelian

Quote:


> Originally Posted by *l88bastar*
> 
> I predict they sell out within 5 minutes after launch.
> 
> Yall underestimate the top 20% of the enthusiast market.


Agreed. The people in the market for this monitor aren't the people who balk at spending for the high end. Everything in life is becoming like this - I bought my wife a EUR 3000 sewing machine recently, and there are plenty of people out there who buy Pfaff or Bernina sewing machines. Likewise, EUR 2000 (including VAT) for this monitor is probably where they are going with this.

I'm not saying I'm happy to pay more for the same thing - not at all; but if this monitor is without peer in its market space then of course there is going to be a premium on the item. And they'll sell out rapidly.


----------



## Silent Scone

Quote:


> Originally Posted by *Aristotelian*
> 
> Agreed. The people in the market for this monitor aren't the people who balk at spending for the high end. Everything in life is becoming like this - I bought my wife a EUR 3000 sewing machine recently, and there are plenty of people that by Pfaff or Bernina sewing machines out there. Just as there are people saying EUR 2000 (including VAT) for this monitor is probably where they are going with this.
> 
> I'm not saying I'm happy to pay more for the same thing - not at all; but if this monitor is without peer in its market space then of course there is going to be a premium on the item. And they'll sell out rapidly.


It's not quite as black and white as that, though...

Some people don't _want_ 27" panels...

Waiting on the big guns.


----------



## chrisnyc75

Everybody calm down. They're just capitalizing on the brief sliver of time in which this monitor has no real equal on the market. They know perfectly well the price is ridiculous, but because it CURRENTLY has no equal, they can charge whatever they want and it will still sell. The moment a competitor comes out with something equivalent, the price will drop precipitously.

See also: Corsair K70 RGB keyboards - at initial release, they sold for well over $200. Then Razer, CoolerMaster, and a slew of competitors brought equivalent products to market, and now you can pick one up for roughly half the debut price. This is exactly what is happening with this monitor. If you find $2000 an obscene price for this product, wait a year and you'll be able to pick one up for half that.


----------



## l88bastar

Quote:


> Originally Posted by *chrisnyc75*
> 
> Everybody calm down. They're just capitalizing on the brief sliver of time in which this monitor has no real equal on the market. They know perfectly well the price is ridiculous, but because it CURRENTLY has no equal, they can charge whatever they want and it will still sell. The moment a competitor comes out with something equivalent, the price will drop precipitously.
> 
> See also: Corsair K70 RGB keyboards - at initial release, they sold for well over $200. Then Razer, CoolerMaster, and a slew of competitors brought equivalent products to market, and now you can pick one up for roughly half the debut price. This is exactly what is happening with this monitor. If you find $2000 an obscene price for this product, wait a year and you'll be able to pick one up for half that.


Or wait ten years and you can find it for free!!


----------



## WorldExclusive

I just don't see the reason to buy over-priced game-centric PC parts. The games coming out today don't support the SLI needed to drive this monitor, the visuals haven't progressed, and overall game quality and performance are poor.

4K is about productivity, having multiple windows open in a grid. My 40" works well with 4K. But on a 27" monitor, however loaded with features, when used for productivity you can barely see the text without scaling or sitting right up to it. A single-purpose monitor doesn't get my money.


----------



## SightUp

HDR, meaning it's not IPS? Or am I missing something? If so, this means it won't have backlight bleed? When is it released?!


----------



## boredgunner

Quote:


> Originally Posted by *SightUp*
> 
> HDR, meaning it's not IPS? Or am I missing something? If so, this means it won't have backlight bleed? When is it released?!


It is IPS, but it uses full array local dimming so the backlight is dimmed or brightened dynamically based on content, across 384 zones around the screen. Since the backlight is back mounted rather than edge mounted, even with FALD disabled I'd expect no bleeding.


----------



## Baasha

Quote:


> Originally Posted by *Vipu*
> 
> 2x 1080 ti can easily do 4k 144hz...
> Even 1 can if you strip settings down a bit more.


Quote:


> Originally Posted by *lever2stacks*
> 
> 2x 1080 ti will not do 144fps in 4k.


Quote:


> Originally Posted by *Vipu*
> 
> Ok I cant promise 144 but 100+
> 
> Then you can disable some fps hungry settings that dont do anything and you easily get 100 at least.


hahahhahahha.... in other words, 2x 1080 Ti will not be able to do 4K @ 144hz. Also, turning down settings does not count - what resolution/refresh rate can you game at with ALL settings maxed out (sans AA)? That is the question. As far as that is concerned - 4K @ 144Hz is a tall ask even for 4x 1080 Ti.

2x 1080 Ti are not enough to get 60FPS in GTA V with NaturalVision 2.2 maxed out @ 4K:


----------



## boredgunner

Quote:


> Originally Posted by *Baasha*
> 
> hahahhahahha.... in other words, 2x 1080 Ti will not be able to do 4K @ 144hz. Also, turning down settings does not count - what resolution/refresh rate can you game at with ALL settings maxed out (sans AA)? That is the question. As far as that is concerned - 4K @ 144Hz is a tall ask even for 4x 1080 Ti.
> 
> 2x 1080 Ti are not enough to get 60FPS in GTA V with NaturalVision 2.2 maxed out @ 4K:


You will still be getting around 100 FPS with G-SYNC enabled, so no tearing. What's the big deal?

Also, most people will keep this monitor longer than they keep their 1080 Ti's. They will probably upgrade to two high end Volta cards, then who knows what your frame rate will be at.

Last but certainly not least, there are also pre-2015 games to consider, and also many less intensive but still great indie games which would easily run at 120+ FPS at 4k.


----------



## Astreon

Quote:


> Originally Posted by *l88bastar*
> 
> 11 years later and 10% cheaper we get this vastly superior display......and yall got your panties in a bunch over its price????
> I don't understand why there is sticker shock over the high end items of ANY HOBBY. You gotta pay to PLAY


I know this is a joke and all, but when you live in Eastern Europe and read posts like those, one starts contemplating suicide, kek.









2000$ is almost 22% of my annual income. Spending this on a monitor is out of the question - physically impossible, I'd have to starve or live on the street.

And I'm not washing dishes here, lol. I am a telecommunications engineer working for quite a big company.

Sorry about the bitterness, but it does feel terrible.


----------



## CallsignVega

That is the beauty of G-Sync. You should be able to get 100+ FPS in 98% of games with SLI 1080Ti/Titan-XP and have a wonderful experience on this monitor.

I simply hope they come out with a 32" version down the road. (unless 120 Hz OLED is out which at that point LCD is moot).


----------



## Baasha

Quote:


> Originally Posted by *boredgunner*
> 
> You will still be getting around 100 FPS with G-SYNC enabled, so no tearing. What's the big deal?
> 
> Also, most people will keep this monitor longer than they keep their 1080 Ti's. They will probably upgrade to two high end Volta cards, then who knows what your frame rate will be at.
> 
> Last but certainly not least, there are also pre-2015 games to consider, and also many less intensive but still great indie games which would easily run at 120+ FPS at 4k.


Not saying that's a bad thing necessarily. Was talking about some people who said "2x 1080 Ti will do 4K @ 144Hz easily" - that is simply not true. I speak from experience. They themselves corrected their statement and now claim 100FPS is doable - that's fine and dandy, however, it's not 4K @ 144Hz.

I can't get 4K @ 144Hz even with 4x 1080 Ti. Of course, that could be because I'm using "only" a 4960X @ 4.50Ghz on that rig but still, 4K @ 144Hz is insanely demanding. I would wager even 2x Volta Titans or 1180/Ti (or whatever) would struggle to do that in modern games.

Further, I'm mostly playing @ 120Hz ULMB mode which I find much more pleasant - it removes G-Sync but I find the gameplay much smoother IME.

I bet the CPUs are not up to the task of handling 4K @ 144Hz since it's not all GPU only.

For instance, 100FPS in 5K is doable most of the time in Battlefield 1 maxed out (even with FXAA HIGH) with 4x 1080 Ti but at 4K, the scaling drops off tremendously:


----------



## CallsignVega

Yeah, my worry is how they handle ULMB on this new monitor. It has the potential to be amazing with that bright FALD backlight, but I haven't heard a single thing about backlight strobing on this model.

Another big thing about strobing backlight is that you really never want your FPS to drop below the refresh rate. 4K @ 120 FPS min is going to be incredibly hard in newer games.


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya my worries will be how do they handle ULMB with this new monitor. It has the potential to be amazing with that bright FALD backlight, but I haven't heard a single thing about backlight strobing on this model.
> 
> Another big thing about strobing backlight is that you really never want your FPS to drop below the refresh rate. 4K @ 120 FPS min is going to be incredibly hard in newer games.


I think Samsung's new strobing method, and similar more effective methods, will be more effective at slightly lower refresh rates than 120 Hz. I haven't seen anyone complain of the CFG70's strobing at 100 Hz.


----------



## tconroy135

What are the cons to a monitor that is Full OLED?


----------



## boredgunner

Quote:


> Originally Posted by *tconroy135*
> 
> What are the cons to a monitor that is Full OLED?


OLED cons compared to LCD right now? Reduced brightness, probably more potential defects still, and the need for image-retention countermeasures - though those do their job extremely well on the newest OLED TVs.


----------



## Astreon

Quote:


> Originally Posted by *tconroy135*
> 
> What are the cons to a monitor that is Full OLED?


OLED does motion better, doesn't need the FALD stuff to show true black, and costs less. Yeah, I think it does. A 27-inch LCD that costs as much as a 65-inch OLED HDR TV sounds slightly lame. But then again, I may be wrong on the last one; a 144Hz OLED HDR 27-inch "gaming" screen would probably cost an outrageous amount of money right now.

Thing is, FALD is very expensive and OLED doesn't need it at all, so...


----------



## Vipu

Quote:


> Originally Posted by *Baasha*
> 
> hahahhahahha.... in other words, 2x 1080 Ti will not be able to do 4K @ 144hz. Also, turning down settings does not count - what resolution/refresh rate can you game at with ALL settings maxed out (sans AA)? That is the question. As far as that is concerned - 4K @ 144Hz is a tall ask even for 4x 1080 Ti.
> 
> 2x 1080 Ti are not enough to get 60FPS in GTA V with NaturalVision 2.2 maxed out @ 4K:


Well, if you play for the biggest e-penis and always run FULL MAXED ULTRA, even when High settings give double the FPS with no visible difference, then sure.
I always use custom settings for the best FPS/quality balance.


----------



## pez

Quote:


> Originally Posted by *WorldExclusive*
> 
> I just don't see the reason to buy over-priced game centric PC parts. The games that are coming out today don't support SLI to drive this monitor, the visuals haven't progressed and the overall game quality and performance is poor.
> 
> 4K is about productivity, having multiple windows open in a grid. My 40" works well with 4K. But they think rolling out some 27" monitor with loaded features, when used for productivity, you can barely see the text without scaling or sitting right up to it. A single purpose built monitor doesn't get my money.


The fact is that even if you just had a single GPU, G-Sync will make a <60 FPS experience much better. You pay to play. And there's still a good portion of titles coming out that support SLI. In fact, everything that has recently come out and needs the extra horsepower to run 60+ FPS at 4K does scale with SLI.


----------



## Aristotelian

Quote:


> Originally Posted by *Astreon*
> 
> I know this is a joke and all, but when you live in Eastern Europe and read posts like those, one starts contemplating suicide, kek.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2000$ is almost 22% of my annual income. Spending this on a monitor is out of the question - physically impossible, I'd have to starve or live on the street.
> 
> And I'm not washing dishes here, lol. I am a telecommunications engineer working for quite a big company.
> 
> Sorry about the bitterness, but it does feel terrible.


As a European, I do understand what you mean. But the argument against X being "worth it" cannot be derived from "my personal income". I watched the YouTube video, and the ASUS rep says USD 2000 (explicitly), meaning this will certainly be more than EUR 2000. A USD 769 graphics card is EUR 869 or so because of VAT. So if this really is released at USD 2000 in the States, then we are looking at about EUR 2499.

Yes, that is a lot for a monitor. But the 'worth it' will depend on its competition. If there is no real competition for the monitor, then there will be a hefty markup for it. It has always been that way for premium items at specific times.
Quote:


> Originally Posted by *Silent Scone*
> 
> It's not quite as black and white as that, though...
> 
> Some people don't _want_ 27" panels...
> 
> Waiting on the big guns.


I can understand those wanting larger monitors. But wanting a larger monitor means (as you said) waiting. I personally find 27" 1440p to be too low of a density for my eyes, and I think 4k will be the sweet spot for me. Those who want larger panels are free to wait; those who think like me may have found their dream product in a computer monitor. I hope TFTCentral gets a review out as soon as possible after release.


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> OLED cons compared to LCD right now? Reduced brightness, probably more potential defects still, image retention countermeasures are necessary but they do their job extremely well on the newest OLED TVs.


Quote:


> Originally Posted by *Astreon*
> 
> OLED does motion better, doesn't need the FALD stuff to show true Black, and costs less. yeah, I think it does. 27 inch LCD that costs as much as 65 inch OLED HDR TV sounds slightly lame. But then again, I may be wrong on the last one. a 144hz OLED HDR 27inch "gaming" screen would probably cost an outragous amount of money now.
> 
> Thing is, FALD is very expensive and OLED doesn't need it at all, so...


I wish. Maybe we'll see a 27" 120Hz/144Hz 2K/4K OLED with HDR10 and G-Sync by 2020 for about $2000-$2300.


----------



## Astreon

I wish for an 80-100Hz OLED at 27 inches and 1440p for less than $1000.


----------



## mmms

Quote:


> Originally Posted by *Astreon*
> 
> I wish for a 80-100hz OLED in 27 inch 1440p for less than 1000$


LOL, I think you'll be waiting another 10 years.


----------



## Astreon

personally, I think we'll never get it. It will be either cheap TN crap, business IPS or super-expensive gaming monitors with HDR FALD 4K 144HZ ULMB CCCP FBI OMG LOL and so on. OLED will fall into the last category and will be priced at 2000-3000$ because the marketing department will never allow them to sell a better display for less, even if production costs are the same.

That's why I bought a 1440p 75hz screen and called it a day.


----------



## boredgunner

I seriously doubt we will get 1440p OLED. 1440p will be phased out by the time OLED monitors come around.


----------



## Astreon

Hopefully M$ will make a proper 4K interface and scaling by then. Was I the only one who had problems in 4K Windows? My biggest issue was that the Windows interface scaled differently than some apps, and those scaled differently than browsers.

In the end I'd end up with text too big in Windows, but too small in some apps - yet just right in Firefox. Duh. Not the most comfortable experience.

UHD seems better than 1440p for many reasons (possible 4:1 pixel mapping, and compatibility with FHD content obtained that way ---> movies, YouTube stuff, games that are very taxing @ UHD), but right now it feels like a 2025 resolution, haha. Maybe it will really become mainstream when OLED hits consumer-grade monitors.


----------



## ahmedmo1

I never got all the hate with scaling. I haven't noticed many problems using a 32" 4k monitor with 150% scaling using Windows 10.
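For what it's worth, the pixel densities behind this scaling debate are easy to work out (a quick sketch; the sizes and the 150% figure are just the ones mentioned in the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 32" 4K at 150% scaling, versus the PG27UQ's 27" panel and a 27" 1440p panel.
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI '
      f'(effective workspace {3840 / 1.5:.0f}x{2160 / 1.5:.0f} at 150%)')
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')
```

At ~138 PPI, the 32" with 150% scaling lands on a 2560x1440 effective workspace, which is probably why it feels comfortable; the same scaling on 27" (~163 PPI) gives you less room.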


----------



## Astreon

Well, I did notice many problems when running 200% (or so?) on a Dell P2715Q, because a single scaling percentage would not fit every situation.

And the P2715Q was terribad at FHD - it displayed an interpolated, blurry mess.


----------



## bee144

The Windows 10 Creators Update made progress on resolving some of the scaling issues. I've had the update since Friday and can confirm scaling behaves better.

I'm also a Microsoft employee, so maybe it's placebo.


----------



## SightUp

So this monitor will be IPS but for sure won't have backlight bleeding? How is that the case?

Will the DisplayPort on a 1080, the non-Ti version, work for it?


----------



## boredgunner

Quote:


> Originally Posted by *SightUp*
> 
> So this monitor will have IPS but for sure won't have back light bleeding? How is this the case?
> 
> Will the 1080, the non-TI version, dp port work for it?


On normal monitors the backlight is installed on the edges of the panel (top and bottom it seems) making it very easy to get bleeding. On this monitor, the backlight is installed behind the panel so there shouldn't be bleeding. Also, any bleeding and IPS glow effects will be reduced in darker content with full array local dimming enabled since the backlight will dim significantly when displaying blacks and darker shades.

GTX 1080 has DisplayPort 1.4 so it can indeed drive this monitor properly. Although for playing modern games on higher settings at 4k, you'll want more than GTX 1080s.
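As a rough sanity check on the link budget (my own back-of-the-envelope numbers, ignoring blanking overhead): DP 1.4's HBR3 mode carries about 25.92 Gbit/s of payload after 8b/10b encoding, and 4K at 144 Hz needs:

```python
# DP 1.4 HBR3 payload: 4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead.
HBR3_PAYLOAD_GBPS = 4 * 8.1 * 0.8  # = 25.92

def required_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel-data rate in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_pixel / 1e9

for name, bpp in [("10-bit RGB", 30), ("8-bit RGB", 24), ("10-bit 4:2:2", 20)]:
    need = required_gbps(3840, 2160, 144, bpp)
    verdict = "fits" if need <= HBR3_PAYLOAD_GBPS else "does not fit"
    print(f"4K144 {name}: {need:.2f} Gbit/s -> {verdict}")
```

So full-rate RGB at 144 Hz doesn't actually fit in DP 1.4, which is why chroma subsampling (or dropping the refresh rate) would come into play at the very top end.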


----------



## SightUp

Do we have a release date?


----------



## bee144

Quote:


> Originally Posted by *SightUp*
> 
> Do we have a release date?


Nope. Panels haven't even gone into production yet. We're looking at Q3 at the earliest.

An ASUS engineer was quoted as saying the price will be USD 2,000.


----------



## Baasha

So this showed up on Asus' site today: https://www.asus.com/Monitors/ROG-SWIFT-PG27UQ/

No date/pre-order link yet though.

Hmm.. Q2 release possibly?


----------



## Astreon

Hmm, it does not mention the FALD stuff... I thought this monitor was equipped with that technology?


----------



## boredgunner

Quote:


> Originally Posted by *Astreon*
> 
> Hmm, it does not mention the FALD stuff... I thought this monitor is equipped with this technology?


It does, now at least.

"The LED backlight is dynamically controlled across 384 zones, enabling very high contrast for richer, more natural dark scenes."


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> So this showed up on Asus' site today: https://www.asus.com/Monitors/ROG-SWIFT-PG27UQ/
> 
> No date/pre-order link yet though.
> 
> Hmm.. Q2 release possibly?


Sweet. I am going to set up a "F5" marathon of super cluster computers at work for this display.


----------



## Astreon

I expected you to buy five of them to work in a 5 monitor setup, lol.

Anyway, I think the price is justified, guys. $2000 makes this monitor available mostly to Americans, but I can't say the price is wrong. It is the ultimate LCD, and while I hate LCDs, this one would probably make me happy about owning it.

There will also be fewer issues with QC. With FALD there is no BLB, and no glow either, IIRC. So you just play the lottery regarding pixels and dust. Most people should be fine after 2-3 returns.

I also don't expect this one to drop in price, because it can't get better. Expect it to be like the GDM-FW900 - forever expensive.


----------



## CallsignVega

Yeah, I'd imagine ASUS will do a better job of quality control on a $2000 monitor. Not one should leave the factory with pixel defects or dust. We will see.

This is basically the last hurrah for LCD IMO. Once 120 Hz OLED hits with HDMI 2.1 I will naturally be going to that. You are right, I can see this being like the last/halo product like the FW900 was to CRT's.


----------



## boredgunner

The FW900 is a good comparison for that reason. Not sure if ASUS will be any better than Acer for QC though, Acer seemed slightly better for the 27" 1440p 144 Hz.


----------



## Astreon

The only concern about OLED is the price. While the production cost of the panels was actually reported to be (possibly) lower than LCD, I honestly don't believe the marketing departments of Acer, ASUS, etc. will let us buy better displays for less.

An example might be Dell's (cancelled) 120Hz OLED, with a suggested price of $5000 :3


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> The FW900 is a good comparison for that reason. Not sure if ASUS will be any better than Acer for QC though, Acer seemed slightly better for the 27" 1440p 144 Hz.


I have a nagging suspicion that Acer gets better panels than Asus since Acer and AUO are close. Acer also gets the double-sided tape "bezel-less" versions of the same panel (IE: XB271HU) versus the version of the same panel Asus gets that has a traditional snap on bezel around the panel.
Quote:


> Originally Posted by *Astreon*
> 
> The only concern about OLED is the price. While the production cost of panels was actually reported to be (possibly) lower than lcd, I honestly don't believe the marketing departments of Acer, Asus, etc. Will let us buy better displays for less.
> 
> An example might be dell's (cancelled) 120hz OLED, with a suggested price of 5000$ :3


I would have bought that Dell OLED in a heartbeat! I think the first 120 Hz OLED will now come from LG in their TV lineup. Once HDMI 2.1 chips hit the market in a year or two, there is basically zero reason why LG wouldn't make their 2018/2019 OLED TVs capable of 120 Hz 4K.

That raises the question: will Volta have HDMI 2.1 support on the video cards, or do we have to wait another generation after that?


----------



## Astreon

If you care for some insider info: yes, Acer gets better panels than Asus from AU Optronics.


----------



## Jbravo33

I can live with it being 2 grand, but 27" just doesn't sell me.


----------



## CallsignVega

Can anyone that speaks German translate any info that we might not already know?

A couple things from the video I've noticed:

1. The HDR vividness is amazing.
2. The display is quite deep (as expected with FALD).
3. The reflections make it look like semi-gloss which is great!


----------



## SightUp

I am happy to pay this price! As long as there aren't any defects out of the box!


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> 
> 
> 
> Can anyone that speaks German translate any info that we might not already know?
> 
> A couple things from the video I've noticed:
> 
> 1. The HDR vividness is amazing.
> 2. The display is quite deep (as expected with FALD).
> 3. The reflections make it look like semi-gloss which is great!


I believe it is the same coating as the 27" 1440p 144 Hz monitors. They look glossy from extreme angles, but head-on are matte, just like in the video.

@Charcharo is fluent in German and says it's just a general overview; nothing new is mentioned. The guy in the video says he saw no motion blur and felt no input lag, but nothing about actual blur-reduction technology (I doubt it was enabled there, because good luck running anything UE4 at 100+ FPS at 4K).


----------



## mmms

What about contrast ?


----------



## boredgunner

Quote:


> Originally Posted by *mmms*
> 
> What about contrast ?


No exact figures. To be honest I'm not even sure how FALD VA TVs measure in contrast, with local dimming enabled. I hear the zone contrast ratio on such TVs is well over 20,000:1, but never saw an actual measurement. This monitor shouldn't be much different than such TVs.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> 
> 
> 
> Can anyone that speaks German translate any info that we might not already know?
> 
> A couple things from the video I've noticed:
> 
> 1. The HDR vividness is amazing.
> 2. The display is quite deep (as expected with FALD).
> 3. The reflections make it look like semi-gloss which is great!


Can I Has!


----------



## CallsignVega

Quote:


> Originally Posted by *boredgunner*
> 
> No exact figures. To be honest I'm not even sure how FALD VA TVs measure in contrast, with local dimming enabled. I hear the zone contrast ratio on such TVs is well over 20,000:1, but never saw an actual measurement. This monitor shouldn't be much different than such TVs.


Supposedly this display is 20,000:1 static. It basically wrecks all other non-FALD LCDs.

Quote:


> Originally Posted by *l88bastar*
> 
> Can I Has!


Only if you are a good boy little Timmy.


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet. I am going to set up a "F5" marathon of super cluster computers at work for this display.


lol..









That stand is hideous though - what were they thinking with that stupid light shining on the desk?









You going to do 4K Surround?









I bet 4K Surround with this beast *will* be more demanding than the 8K @ 60hz.


----------



## Malinkadink

Quote:


> Originally Posted by *Baasha*
> 
> lol..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That stand is hideous though - what were they thinking with that stupid light shining on the desk?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You going to do 4K Surround?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I bet 4K Surround with this beast *will* be more demanding than the 8K @ 60hz.


3840x2160x144x3 = 3,583,180,800

7680×4320x60 = 1,990,656,000

Oh yeah, it'll be hard to run!

But it'll still be easier to run 3 4k monitors at 60 fps than 1 8k monitor at 60 fps

3840x2160x60x3 = 1,492,992,000
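Putting the three figures above side by side (same raw pixel-rate arithmetic, no blanking overhead; names are mine):

```python
def pix_rate(width, height, hz, screens=1):
    """Total pixels per second across a multi-monitor setup."""
    return width * height * hz * screens

surround_144 = pix_rate(3840, 2160, 144, screens=3)  # 3x 4K @ 144 Hz
eightk_60 = pix_rate(7680, 4320, 60)                 # 1x 8K @ 60 Hz
surround_60 = pix_rate(3840, 2160, 60, screens=3)    # 3x 4K @ 60 Hz

# 4K surround at 144 Hz is by far the heaviest of the three.
assert surround_144 > eightk_60 > surround_60
for label, value in [("3x 4K @ 144", surround_144), ("1x 8K @ 60", eightk_60),
                     ("3x 4K @ 60", surround_60)]:
    print(f"{label}: {value:,} px/s")
```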


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> lol..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That stand is hideous though - what were they thinking with that stupid light shining on the desk?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You going to do 4K Surround?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I bet 4K Surround with this beast *will* be more demanding than the 8K @ 60hz.


Oh no my multi-monitor days are over. It made sense back when monitors were 1080/1440p, but not with 4K/5K/8K out there.


----------



## Recipe7

Didn't think ASUS would drop the monitor on their site this soon.

My bank account is only ready for a 1200USD monitor at this moment, unfortunately.

I've been holding on to my 1080p 144hz ASUS for quite a while now. It's going to be hard knowing many people, including some on this thread, will be enjoying the F out of this beast.


----------



## Astreon

Vega will









While I can technically afford this monitor, I don't think it would be wise. I'd rather get something that's 1440p, OLED or (sigh) FALD, and without HDR, as there is very little HDR content when it comes to video games.

4K is too demanding; I only have a GTX 1080, and that would mean 50 FPS on average in new games.

However, it seems the new norm is all-or-nothing. You either get a super expensive monitor with all the tech, or a plain one with none of it.

The final question is: can this monitor do 1080p without interpolation?


----------



## boredgunner

Quote:


> Originally Posted by *Astreon*
> 
> Vega will
> 
> 
> 
> 
> 
> 
> 
> 
> 
> While I can technically afford this monitor, I don't think it would be wise. I'd rather get something that's 1440p, OLED or (sigh) FALD, and without HDR as there is very little HDR content when it comes to video games.
> 
> 4K is too demanding, I only have 1080gtx, that would mean 50 fps on average in new games.
> 
> However, it seems that the new norm is all-or-nothing. You either get a super expensive monitor with all the techs, or a plain one with none of them.
> 
> The final question is: can this monitor do 1080p without interpolation.


If I end up at 4k with just my GTX 1080, I'll be playing mostly older games to avoid low performance. Secretly this will be to my pleasure anyway, a break from playing disappointing new releases just for the sake of reviewing them, replaying the masterpieces of various genres.


----------



## st0necold

Quote:


> Originally Posted by *Baasha*
> 
> Not saying that's a bad thing necessarily. Was talking about some people who said "2x 1080 Ti will do 4K @ 144Hz easily" - that is simply not true. I speak from experience. They themselves corrected their statement and now claim 100FPS is doable - that's fine and dandy, however, it's not 4K @ 144Hz.
> 
> I can't get 4K @ 144Hz *even with 4x 1080 Ti*. Of course, that could be because I'm using "only" a 4960X @ 4.50Ghz on that rig but still, 4K @ 144Hz is insanely demanding. I would wager even 2x Volta Titans or 1180/Ti (or whatever) would struggle to do that in modern games.
> 
> Further, I'm mostly playing @ 120Hz ULMB mode which I find much more pleasant - it removes G-Sync but I find the gameplay much smoother IME.
> 
> I bet the CPUs are not up to the task of handling 4K @ 144Hz since it's not all GPU only.
> 
> For instance, 100FPS in 5K is doable most of the time in Battlefield 1 maxed out (even with FXAA HIGH) with 4x 1080 Ti but at 4K, the scaling drops off tremendously:


NVIDIA dropped official 3- and 4-way SLI support on Pascal cards; that's why you're not getting 144 Hz with four 1080 Tis. Custom profiles or whatever are fine, but it still doesn't work.


----------



## dVeLoPe

Guys, I'm in a bind here. I have a

http://www.tftcentral.co.uk/reviews/benq_xl2410t.htm

BenQ XL2410T, and I bought the PG248Q, but I don't see paying $350 for another 24 Hz (my old monitor is 120 Hz and I've been on that for like 5 years now) when it doesn't overclock to 180 Hz.

I'm gonna try the panel lottery again, but if it doesn't come in at 180 I might just return it and wait for this. Any ETA? Or something similar to/that beats it?


----------



## Astreon

No ETA, and the MSRP is $2,000, so be ready for that.


----------



## BoredErica

At this point, I'd rather get a Samsung monitor or a used LG OLED.
I dunno how I'd adjust to a 55-inch TV for gaming though...

Quote:



> Originally Posted by *l88bastar*
> 
> $2,000 is DIRT CHEAP for what you are getting!
> 
> Back in 2006 $2,200 got you this crap!
> 
> 11 years later and 10% cheaper we get this vastly superior display......and yall got your panties in a bunch over its price????
> I don't understand why there is sticker shock over the high end items of ANY HOBBY. You gotta pay to PLAY


Emphasis on 11 years later.


----------



## guttheslayer

Quote:


> Originally Posted by *SightUp*
> 
> Do we have a release date?


You'll probably have to wait three quarters to a full year from now for a solid release.

Either way, what kind of hardware is needed to saturate this display when playing Ghost Recon Wildlands at Ultra settings?


----------



## Hawk777th

Wish they could make something you could put on your desk that doesn't scream GAMER EXTREME! I don't want a distracting red light on my monitor. I know you can turn it off, but still.

Should have been larger but sure price prevented that.

Waiting for 144Hz 21:9.


----------



## l88bastar

Quote:


> Originally Posted by *Hawk777th*
> 
> Wish they could make something you could put on your desk that doesn't scream GAMER EXTREME! I don't want a distracting red light on my monitor. I know you can turn it off, but still.
> 
> Should have been larger but sure price prevented that.
> 
> Waiting for 144Hz 21:9.


But it is a gaming display, and that is pretty much what the market likes. Don't like the useless light? Turn it off. Don't like the gamey colors of the monitor shroud and base? Then paint 'em, or remove 'em.


----------



## wreckless

Quote:


> Originally Posted by *Hawk777th*
> 
> Wish they could make something you could put on your desk that doesn't scream GAMER EXTREME! I don't want a distracting red light on my monitor. I know you can turn it off, but still.
> 
> Should have been larger but sure price prevented that.
> 
> Waiting for 144Hz 21:9.


Totally agree with not wanting the busy aesthetics.

I'm hoping Dell comes out with something similar in the near future.


----------



## Astreon

By the way, guys: while I still think that $2,000 is fair FOR NOW, it may no longer be fair after a year or two.

Let's put it this way: the GDM-FW900 was basically unrivalled for 10 years, and the first competition came in 2015 with the PG279Q and XB271HU (which obviously still have flaws of their own, but were the first to deliver a good-looking picture on a fast refresh screen). The GDM-FW900 was priced at $2,500, but if you count 10 years, it seems FAIR.

But will this monitor last 10 years? Honestly, I don't think so... it may be "overtaken" by cheaper OLEDs (provided they finally flood the market). In 5 years, it may feel like the NVIDIA TITAN does after that period: obsolete.


----------



## l88bastar

Quote:


> Originally Posted by *Darkwizzie*
> 
> Emphasis on 11 years later.


I don't know why people keep listing the 55" OLED 60 Hz TV as an alternative to this. I have two C6 OLEDs, and while they are wonderful, they are not in the same galaxy as what this 144 Hz FALD will do.

And yes... figuring in inflation and COLA, 11 years ago $2,199 was the equivalent of 1.4 million Iraqi dinars of today's money... so I would say two grand US is pretty cheap compared to 1.4 million dinars.


----------



## l88bastar

Quote:


> Originally Posted by *Astreon*
> 
> By the way guys: while I still think that 2000$ is fair FOR NOW, it may no longer be fair after a year or two.
> 
> Let's put it this way: GDM FW900 was basically unrivalled for 10 years, and the first competition came in 2015, with the presence of PG279Q and XB271HU (which obviously still have flaws of their own, but are first to deliver good looking picture on a fast refresh screen. GDM-FW900 was priced at 2500$, but if you count 10 years, it seems FAIR.
> 
> But, will this monitor last 10 years? honestly, I don't think so... it may be "overtaken" by cheaper OLEDs (provided they finally flood the market). In 5 years, it may feel like NVIDIA TITAN does feel after that period - obsolete.


That was a different time and place, for the world and tech. This $2,000 FALD will be half price in two years and completely irrelevant in 5 years.


----------



## Astreon

If OLED surfaces on gaming screens, possibly.

But since there's nothing "better" possible (at least for LCD), I'm sure that it will remain expensive for a long time.

Basically, nothing other than OLED can challenge this. Only AUO knows how to make fast IPS displays.


----------



## boredgunner

Quote:


> Originally Posted by *Astreon*
> 
> If OLED surfaces on gaming screens, possibly.
> 
> But since there's nothing "better" possible (at least for LCD), I'm sure that it will remain expensive for a long time.
> 
> Basically, nothing other than OLED can challenge this. Only AUO knows how to make fast IPS displays.


Well, there is that prototype Panasonic LCD technology that puts a light-modulating cell of sorts in each subpixel, making it similar to self-emissive displays like OLED. That's experimental, but other than that they can always add more dimming zones, which would help greatly (as would a glossy coating). Sony's Z series TV has 620 zones if I recall correctly.


----------



## Sedolf

Quote:


> Originally Posted by *boredgunner*
> 
> Sony's Z series TV has 620 if I recall correctly.


Sony ZD9

65'' ~646 zones
75'' ~848 zones
100'' ~1508 zones


----------



## Jbravo33

Quote:


> Originally Posted by *Hawk777th*
> 
> Wish they could make something you could put on your desk that doesn't scream GAMER EXTREME! I don't want a distracting red light on my monitor. I know you can turn it off, but still.
> 
> Should have been larger but sure price prevented that.
> 
> Waiting for 144Hz 21:9.


That's what I'm waiting for. Wasn't thrilled with the Omen or the Samsung CF34. This 4K grabs my attention, but 27 inches sucks. And food for thought: I just benched my Titan Xps under water at stock clocks, 140 FPS at 3440x1440 and 190 FPS in SLI. These cards are monsters; they will easily push this ASUS Swift over 100 FPS at 4K.


----------



## Malinkadink

In 2004 Sony released the mother of CRT monitors, the FW900, at $2,300 MSRP. Anyone that bought it back then and kept it to this day, assuming it lived this long (unlikely, but repairable), got more than their money's worth. Anyone that buys this monitor and keeps it for 10+ years will also get their money's worth. The only downside is this isn't OLED, and I'd feel some buyer's remorse if an OLED variant came out in 1-2 years.


----------



## dboythagr8

Quote:


> Originally Posted by *Astreon*
> 
> If you care for some insider info: yes, Acer gets better panels than Asus from AU Optronics.


Wait, what? Why is this? Isn't Asus viewed as the more "premium" brand?


----------



## boredgunner

Quote:


> Originally Posted by *dboythagr8*
> 
> Wait, what? Why is this? Isn't Asus viewed as the more "premium" brand?


I believe AUO is owned by Acer, or something to that extent.


----------



## BoredErica

Quote:


> Originally Posted by *l88bastar*
> 
> I don't know why people keep listing the 55" OLED 60HZ TV as an alternate to this. I have two C6 OLEDs and while they are wonderful, they are not in the same Galaxy as what this 144hz Fald will do.
> 
> And yes....figuring in inflation and COLA....11 years ago $2,199 was the equivalent of 1.4 million Iraqi Dinars of todays money.....so I would say two grand US dollars is pretty cheap compared to 1.4 million Dinars


Not just inflation... the bigger factor isn't inflation, it's that technology has had 11 years to improve. I don't even need to think about inflation when comparing the 1050 Ti to the 8800 Ultra to see how far we've come, even though they are nowhere near the same price range.

An LCD with FALD isn't beating OLED in my eyes, even if ASUS gets its act together. The real deal is losing some visuals for 144 Hz. My 2 cents.

Well, and the input lag. That's a thing too.


----------



## KGPrime

The FW900 released in 2000. I bought my first one in 2004, and a second one used in 2006. The first lasted until 2013, the other until 2016. Looking back, I would have paid five thousand dollars easily for them to avoid LCDs all these years.

This monitor's price is not unexpected for what it is. But for $2K it should at least include hardware calibration support, IMO; in fact, I wish they'd just add that even if it cost $2,500. The Eizo "equivalent" to this monitor is about $2,700 (https://www.bhphotovideo.com/c/product/1138068-REG/eizo_cg248_4k_23_8_hardware_calibration_lcd.html), and it's just edge-lit 10-bit at 60 Hz with a polariser and built-in hardware calibration, and is only 24". Want true 4K at 31"? Six thousand dollars. No 144 Hz, no G-Sync, no FALD, no possible ULMB. Previously these were my main choices for an LCD. But 60 Hz for $2,700, or $2,500 for the 1440p model, is a hard pill to swallow.


----------



## dboythagr8

Quote:


> Originally Posted by *boredgunner*
> 
> I believe AUO is owned by Acer, or something to that extent.


literally...breaking news for me.


----------



## l88bastar

Quote:


> Originally Posted by *Darkwizzie*
> 
> Not just inflation... bigger factor isn't inflation, it's that technology has had 11 years to improve. I don't even need to think about inflation when comparing 1050ti to GTX 8800 Ultra to see how far we've come even though they are nowhere near at the same price ranges.
> 
> LCD with FALD's not beating OLED in my eyes, even if Asus gets its act together. The real deal is losing some visuals for 144hz. My 2 cents.
> 
> Well, and the input lag. That's a thing too though.


Obviously I love OLED... but there is nothing, and I mean NOTHING, OLED PC display-wise on the horizon. The 120 Hz Dell 4K OLED was literally our only hope of seeing an OLED PC display anytime soon, and now that it's cancelled, OLED is looking a lot like SED to me... I don't think we are gonna see OLED PC monitors for several years now, if at all.


----------



## Astreon

Quote:


> Originally Posted by *dboythagr8*
> 
> literally...breaking news for me.


It's not a secret; it's stated on Wikipedia that AUO was formed by the merger of *Acer Display Technology* and another firm.

And yes, Acer gets better panels than any other manufacturer buying from them (meaning ASUS and AOC).

The GDM-FW900 was amazing, and it was literally the best PC monitor all the way up to 2015, like I said, so if you kept one for 15 years, that $2,300 was worth it, literally every cent. But in the case of the PG27UQ, I'm afraid that in 2-3 years it will already be obsoleted by OLED displays.


----------



## Aristotelian

On OLED rendering this monitor obsolete:

The only OLED monitor that was announced was the Dell one at USD 4999. This monitor will be USD 2000 or so. I would expect a monitor that is more than double the price of this one to be better, hands down, yet this 2000 USD monitor will suffice, depending on the actual performance.

I wouldn't count this monitor out yet - there are enough people in this thread (you included) stating that USD 2000 is too much, or is a joke - would these people have spent USD 5000 on the Dell offering? I don't see how a product costing 2.5 times what this does would render this monitor obsolete.

I have yet to see an OLED monitor announced - Dell recalled their announcement. So, this one still sits at king of the hill for a gamer - or am I wrong?


----------



## Astreon

Don't forget that Dell announced their 120 Hz OLED a while ago, and it seemed a failed experiment. Nobody (except for Vega) would buy such a thing, I guess. The OLED manufacturing process can be cheaper than LCD's; there is ZERO reason, except for marketing and hype, to make such a monitor cost so much.

I have stated that $2,000 is too much for me, but I also stated that I feel it's a reasonable price after giving it some thought. FALD is very expensive; you won't get a cheap FALD monitor, simply put. I think FALD LCD actually costs much more than OLED to manufacture.


IIRC it's something like that: [OLED vs. LCD manufacturing cost comparison charts no longer available]

In 2013-2014 it was much worse, but IIRC that's no longer valid after LG perfected their manufacturing process.
----------



## Aristotelian

I do not forget, because I was following the Dell release as closely as this one for my upcoming build (that has been 'upcoming' for a while since I'm waiting on a monitor and then will build the PC around that).

I would have bought the Dell for precisely the same reason as I am looking closely at this monitor - it's meant to be a premium product in this category and offer me the best gaming experience. I don't post enough on this forum so my penchant for premium products is not (yet) known but it will be once this new build is done.

I saw your previous posts about value/salary etc. issues and to be honest the 'worth it' discussion only functions (for me) if there's a substitute good to consider. From your posts you are now implying that Acer will have better panels and, I suppose, their monitor may be better? Both Acer and Asus announced their 4k/hdr/120hz/fald displays at the same time, so reviews will help me decide which to get. Other than that I don't see what other substitute good has been announced - for those that would consider this monitor at this price point.


----------



## BoredErica

Quote:


> Originally Posted by *Aristotelian*
> 
> I do not forget, because I was following the Dell release as closely as this one for my upcoming build (that has been 'upcoming' for a while since I'm waiting on a monitor and then will build the PC around that).
> 
> I would have bought the Dell for precisely the same reason as I am looking closely at this monitor - it's meant to be a premium product in this category and offer me the best gaming experience. I don't post enough on this forum so my penchant for premium products is not (yet) known but it will be once this new build is done.
> 
> I saw your previous posts about value/salary etc. issues and to be honest the 'worth it' discussion only functions (for me) if there's a substitute good to consider. From your posts you are now implying that Acer will have better panels and, I suppose, their monitor may be better? Both Acer and Asus announced their 4k/hdr/120hz/fald displays at the same time, so reviews will help me decide which to get. Other than that I don't see what other substitute good has been announced - for those that would consider this monitor at this price point.


Oh. Didn't realize Acer's will have FALD too.


----------



## Aristotelian

Quote:


> Originally Posted by *Darkwizzie*
> 
> Oh. Didn't realize Acer's will have FALD too.


http://www.overclock.net/t/1620164/acer-predator-xb272-hdr-and-asus-rog-swift-pg27uq-4k-27-144hz-ips-hdr-qd-g-sync-monitors

They are meant to have the same specs as far as I can tell, so if there are quality differences in the panels (both should be using the same one?) that may render one better than the other.


----------



## Astreon

Quote:


> Originally Posted by *Aristotelian*
> 
> I do not forget, because I was following the Dell release as closely as this one for my upcoming build (that has been 'upcoming' for a while since I'm waiting on a monitor and then will build the PC around that).
> 
> I would have bought the Dell for precisely the same reason as I am looking closely at this monitor - it's meant to be a premium product in this category and offer me the best gaming experience. I don't post enough on this forum so my penchant for premium products is not (yet) known but it will be once this new build is done.
> 
> I saw your previous posts about value/salary etc. issues and to be honest the 'worth it' discussion only functions (for me) if there's a substitute good to consider. From your posts you are now implying that Acer will have better panels and, I suppose, their monitor may be better? Both Acer and Asus announced their 4k/hdr/120hz/fald displays at the same time, so reviews will help me decide which to get. Other than that I don't see what other substitute good has been announced - for those that would consider this monitor at this price point.


Well, let's not forget that I'm a poor Eastern European pleb; I earn 800 USD per month. So yes, this is beyond my caliber. I always assumed that I should not buy things that are more expensive than my monthly income, heh. For rich guys, this is an entirely different matter, I guess. I'm not sure if $2,000 is "that much" to an American. Perhaps it is, but isn't $2,000 a reasonable monthly salary for a Master's degree holder in Telecommunications in the US?

That aside, I'm not sure if the trend will continue, but so far ASUS and AOC were receiving slightly worse panels on average than Acer for the 144 Hz IPS monitors. This does not mean they were always getting junk panels, but the ASUS PG279Q return rate is higher for precisely that reason.


----------



## CallsignVega

Quote:


> Originally Posted by *Astreon*
> 
> Acer
> Well, let's not forget that I'm a poor Eastern Europe pleb, I earn 800 USD per month. So yes, this is beyond my caliber. I always assumed that I should not buy things that are more expensive than my monthly income, heh. For rich guys, this is entirely different matter, I guess. I'm not sure if 2000$ is "that much" to an American/ Perhaps it is, but isn't 2000$ a reasonable monthly salary for a Master's degree bearer in Telecommunications in US?
> 
> That aside, I'm not sure if the trend will continue, but so far Asus and AOC were receiving slightly worse panels on average than Acer - for the 144hz IPS monitors. This does not mean they were always getting junk panels, but the asus pg279q return rate is higher for precisely that reason.


Telecommunications is about an average of $80,000 per year, or just under $7,000 per month.

http://www.payscale.com/research/US/Industry=Telecommunications/Salary

A decent wage, but far from being "rich". I would reserve that term for someone with over a million in the bank.


----------



## Astreon

$7,000 a month... Geeeeee. I wish Mr. Trump would actually do as promised and allow Polish citizens to visit the US without a visa.

My $800 (WHICH IS A GOOD SALARY IN PL, TRUST ME) seems like a joke in comparison.


----------



## CallsignVega

But isn't the cost of living there pretty cheap?

In my area rent alone is ~$2,500/month.


----------



## Astreon

Well, you live in a house, right?

It'd cost me around $1,000 per month to live in a house in a decently located place (near the city I work in). A flat would be cheaper, but still upwards of $500 per month. Of course, that's without bills (electricity, internet, water, etc.).


----------



## DrFreeman35

Quote:


> Originally Posted by *Astreon*
> 
> well, you live in a house, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd cost me around 1000$ per month to live in a house in a decently-located place (near the city I work in). Flat would be cheaper, still upwards of 500$ per month. Of course without bills (electricity, internet, water, etc.).


That is some crazy cost of living, considering $800/month is a good salary. Here in Texas you are looking at $750/month (on the low end, might not be a good area) to $2K/month (new house/nice area). It is interesting to see, but you should know Washington is rather pricey to live in, as are places like California. That said, you can find houses at crazy cheap or ridiculously expensive prices anywhere in the States. I am in Dallas, TX, and you will find plenty of houses that I would assume most do not pay monthly for.

I think whether something like this monitor is justifiable depends entirely on your reasons for use. For someone to have an opinion on this monitor, they need to understand that everyone's circumstances are different. It is not going to make sense for some, but it is nice to see options and updates coming to the market. IMO.


----------



## mmms

Quote:


> Originally Posted by *Astreon*
> 
> well, you live in a house, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd cost me around 1000$ per month to live in a house in a decently-located place (near the city I work in). Flat would be cheaper, still upwards of 500$ per month. Of course without bills (electricity, internet, water, etc.).


We live in Egypt on 2,000-3,000 EGP monthly, almost $250-$300. That was the situation until 2013, when $1 was 10 EGP.

Now the situation has changed after our new failing president Abdel Fattah el-Sisi, who made $1 = 20 EGP while salaries stayed the same. LOL.

So if this ASUS or Acer really will cost $2,000, that meant 20,000 EGP until 2013; now I need double that, 40,000 EGP. LOL.
So if I want to buy this monitor, I should wait at least another 4 years.


----------



## dboythagr8

Just want to say the last few posts have been pretty interesting as far as worldwide pay goes. Puts things in perspective, and certainly the price of this monitor.


----------



## Aristotelian

Well, I earn about five figures net a month, but have a mortgage per month that is more than this monitor. My wife is supportive of my hobbies but, as a compromise, we drive around in a 20 year old car that gets us from A to B instead of the AMG SLS I wanted. My marriage is more of a limit on what I spend because my wife is kind of a hippie whereas I'm a Savile Row + Hermes tie kind of guy.

The balance works for us. I personally wish that the "premium" of everything was widely available because, for me, x's being premium speaks of the property of x itself, and not its 'scarcity' or 'rarity'. I wish some of the three michelin star restaurants I go to would do open days, and if I had enough money for that I might do that as a 'charity' and pick some foodies who've dreamt about a restaurant and book it out for them to try. There's a place in rural Slovenia I want to try.

PS: living in Eastern Europe has its perks - if I was a single guy I'd romp around in Prague as an appetizer.

Basically, I'm a rational hedonist. I encourage everyone to enjoy their pleasures. It can be a wonderful life for us, who get to sit here debating the qualities of a computer monitor while the vast majority of this planet is far less fortunate than we are, irrespective of our monthly earnings.


----------



## boredgunner

Quote:


> Originally Posted by *Aristotelian*
> 
> I have yet to see an OLED monitor announced - Dell recalled their announcement. So, this one still sits at king of the hill for a gamer - or am I wrong?


Monitors are usually announced less than a year in advance. I'm not expecting OLED consumer monitors until 2019, but that's just a guess.


----------



## mouacyk

2x1080 Ti + 1440p 165Hz G-Sync and ULMB = $2050


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> Monitors are usually announced less than a year in advance. I'm not expecting OLED consumer monitors until 2019, but that's just a guess.


Well, the Dell OLED was first announced at CES 2016. Nothing for 2017, so we kind of went back 2 years. Even if we see one in 2018, by the time we can afford one it will be 2020.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well, the Dell OLED was first announced at CES 2016. Nothing for 2017, so we kind of went back 2 years. Even if we see one in 2018, by the time we can afford one it will be 2020.


It was officially canned in 2016.


----------



## Tonza

Tried to browse pages back, but where has this $2,000 price tag been posted? This "gaming" stuff is getting very expensive, and seeing how hilariously bad the quality control on these "gaming" monitors usually is, I would not expect anything less than a 100% perfect panel for this price (which will not be the case, for sure).


----------



## boredgunner

Quote:


> Originally Posted by *Tonza*
> 
> Tried to browse pages back, but where has this 2000 dollar price tag been posted?. Starts to be very expensive "gaming" stuff, also seeing how hilarously bad quality control these "gaming" monitors usually have, i would not expect nothing else than 100% perfect panel for this price (which will not be the case for sure).


It has been mentioned in video overviews, and ASUS listed $2000 in a press release a while ago I believe. I hope the quality control is not horrible, but I'm not optimistic so I'm expecting the worst. New technology = new problems.


----------



## meowth2

i'm done with buying monitors until oled comes. no way i'm buying ips crap


----------



## l88bastar

Quote:


> Originally Posted by *meowth2*
> 
> i'm done with buying monitors until oled comes. no way i'm buying ips crap


When is OLED coming?

What other 4k 144hz options are coming out?

Oh right, NEVER & NONE


----------



## boredgunner

Quote:


> Originally Posted by *l88bastar*
> 
> When is OLED coming?
> 
> What other 4k 144hz options are coming out?
> 
> Oh right, NEVER & NONE


Never say never. Since OLED costs less to make than FALD LCD (and maybe LCD in general; if not yet, then soon), that strongly suggests OLED will eventually phase out LCD. Less costly and better in pretty much every way.

There are other 4K 144 Hz monitors on the horizon: the Acer XB272-HDR and a 32" AOC variant.


----------



## BoredErica

Quote:


> Originally Posted by *l88bastar*
> 
> When is OLED coming?
> 
> What other 4k 144hz options are coming out?
> 
> Oh right, NEVER & NONE


Your crystal ball is OP... maybe you should give me the next lottery numbers too.

...Dude, just say 'not in the foreseeable future'.


----------



## ZealotKi11er

Quote:


> Originally Posted by *meowth2*
> 
> i'm done with buying monitors until oled comes. no way i'm buying ips crap


It does not have to be OLED. They already have QLED from Samsung, which is very good. OLED's problem is that it was never designed for gaming and PCs. It will have to saturate the TV market before it hits PCs in volume. I'm personally not interested in a 27" screen even if the panel is OLED and the price is $1K.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It does not have to be OLED. They already have QLED from Samsung which are very good. OLED problem is that it was never designed for gaming and PC. It will have to saturate TV markets before it hits PC in masses. I personally not interested on 27" screen even if the panel is OLED and price $1K.


That QLED is just quantum-dot VA, not much different from their latest high refresh rate monitors (higher contrast is all). In fact, their 2016 model TVs are better than their 2017 lineup. Either way, they only use an edge-mounted backlight (mounted along the bottom of the screen), so the local dimming is pure crap; this monitor should actually look better.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> That QLED is just quantum dot VA, not much different than their latest high refresh rate monitors (higher contrast is all). In fact, their 2016 model TVs is better than their 2017 lineup. Either way they only use an edge mounted backlight (mounted on the bottom of the screen) so the local dimming is pure crap, so this monitor should actually look better.


How did they achieve HDR?


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How did they achieve HDR?


You do not have to meet HDR10's suggested requirements to have an HDR display. They achieved HDR by making their TVs able to process HDR content.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> You do not have to meet HDR10's suggested requirements to have an HDR display. They achieved HDR by making their TVs able to process HDR content.


This is going to be a nightmare. How long before we get legit monitors with full HDR support?


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is going to be a nightmare. How long before we get legid Monitors with full HDR support..


We're going to need HDMI 2.1 if we want it at high refresh rates. Without compression, DisplayPort 1.4 can only do 4K + 120 Hz + 8-bit color depth, and only SDR. It can probably do that with 10-bit color and HDR when using DSC compression (which this monitor uses anyway). However, I think HDR video games are still only created with the sRGB color space, so the games are a limiting factor too.
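For a rough sense of why those limits fall where they do, here's a back-of-the-envelope sketch. It counts raw pixel data only, ignoring blanking intervals and protocol overhead, so real-world requirements are somewhat higher; DP 1.4's HBR3 link carries 32.4 Gbit/s raw, about 25.92 Gbit/s of payload after 8b/10b encoding.

```python
# Rough bandwidth check against DisplayPort 1.4 (HBR3).
# Ignores blanking/overhead, so treat results as lower bounds.
DP14_EFFECTIVE_GBPS = 25.92  # ~32.4 Gbit/s raw minus 8b/10b encoding cost

def video_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed RGB pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

modes = [
    ("4K 120 Hz 8-bit",  video_gbps(3840, 2160, 120, 8)),   # ~23.9 Gbit/s
    ("4K 144 Hz 8-bit",  video_gbps(3840, 2160, 144, 8)),   # ~28.7 Gbit/s
    ("4K 144 Hz 10-bit", video_gbps(3840, 2160, 144, 10)),  # ~35.8 Gbit/s
]

for label, rate in modes:
    verdict = "fits" if rate <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4"
    print(f"{label}: {rate:.1f} Gbit/s ({verdict})")
```

Only the 4K 120 Hz 8-bit mode squeaks under the effective link rate, which is why 144 Hz with 10-bit HDR needs compression (DSC) or chroma subsampling.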


----------



## EniGma1987

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is going to be a nightmare. How long before we get legid Monitors with full HDR support..


Technically:

Quote:


> A TV, monitor or projector may be referred to as a HDR Compatible Display if it meets the following minimum attributes:
> 
> Includes at least one interface that supports HDR signaling as defined in CEA-861-F, as extended by CEA-861.3.
> Receives and processes static HDR metadata compliant with CEA-861.3 for uncompressed video.
> Receives and processes HDR10 Media Profile* from IP, HDMI or other video delivery sources. Additionally, other media profiles may be supported.
> Applies an appropriate Electro-Optical Transfer Function (EOTF), before rendering the image.
> 
> * Note: HDR10 Media Profile is defined as:
> 
> EOTF: SMPTE ST 2084
> Color Sub-sampling: 4:2:0 (for compressed video sources)
> Bit Depth: 10 bit
> Color Primaries: ITU-R BT.2020
> Metadata: SMPTE ST 2086, MaxFALL, MaxCLL


The minimums for the other specs are 4K, wide color support, and 10-bit color. They can be higher, but those are the minimums for both the HDR10 and Dolby Vision specs. My quoted text above is the signal requirement for HDR10, which is much lower than for Dolby Vision HDR.
The minimum requirements to be listed as HDR10 compatible don't include any of the brightness (nit) and contrast levels necessary to actually display HDR content properly or at its full range; the TV really only needs to accept and process the signals. I believe that to truly meet the Dolby Vision HDR requirements a TV needs to be OLED, as that is currently the only way to get the required black levels along with the required white levels and brightness to meet the target spec.
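Put another way, the quoted checklist is about signal processing, not panel performance. A minimal sketch of that check — the field names are my own illustrative shorthand for the quoted attributes, not any real API:

```python
# HDR10 Media Profile minimums, per the checklist quoted above.
# Field names are illustrative shorthand, not an actual standard's API.
HDR10_PROFILE = {
    "eotf": "SMPTE ST 2084",
    "min_bit_depth": 10,
    "color_primaries": "ITU-R BT.2020",
    "required_metadata": {"SMPTE ST 2086", "MaxFALL", "MaxCLL"},
}

def meets_hdr10_media_profile(stream):
    """True if a stream description satisfies the HDR10 signal minimums.
    Note: says nothing about peak brightness or contrast of the display."""
    return (stream.get("eotf") == HDR10_PROFILE["eotf"]
            and stream.get("bit_depth", 0) >= HDR10_PROFILE["min_bit_depth"]
            and stream.get("color_primaries") == HDR10_PROFILE["color_primaries"]
            and HDR10_PROFILE["required_metadata"] <= stream.get("metadata", set()))

sample = {"eotf": "SMPTE ST 2084", "bit_depth": 10,
          "color_primaries": "ITU-R BT.2020",
          "metadata": {"SMPTE ST 2086", "MaxFALL", "MaxCLL"}}
print(meets_hdr10_media_profile(sample))  # True
```

Notice that nothing in the check involves nits or contrast, which is exactly the loophole being described.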


----------



## ToTheSun!

Quote:


> Originally Posted by *boredgunner*
> 
> Never say never. Since OLED costs less to make than FALD LCD (and maybe LCD in general, if not yet then soon)


I've been seeing that said on OCN for the past couple of days. How are you guys arriving at that conclusion? Are you basing it on actual data? It looks to me like most of the premium on FALD displays derives not from production costs but from product segmentation.


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> I've been seeing that said on OCN for the past couple of days. How are you guys arriving at that conclusion? Are you basing that on actual data? It looks to me most of the premium for FALD displays does not derive from production costs, but from product segmentation.


Well, OLED should have lower material costs since it's less complex than LCD (especially FALD LCD), and OLED prices continuously go down while its market share increases (primarily in the smartphone industry, though). I think that's all we have to go on.


----------



## EniGma1987

Quote:


> Originally Posted by *ToTheSun!*
> 
> I've been seeing that said on OCN for the past couple of days. How are you guys arriving at that conclusion? Are you basing that on actual data? It looks to me most of the premium for FALD displays does not derive from production costs, but from product segmentation.


It depends on how dense the full-array backlight is and whether local dimming is used. I believe most people are mistaking full-array costs for local-dimming costs. A manufacturer can use just 32 big LEDs for the backlight with no local dimming and it will be cheap. But as soon as you start adding meaningful local dimming, costs rise drastically. To have 256+ zones you need at least 256 LEDs, and on top of that a controller chip (or chips) with 256 drive channels, which can get pricey. The top displays have 384 zones now, which means chips connected to the main processor to drive 384 completely different and separate channels, and 384 LEDs. That's a lot of money on the BOM. But a simple full-array backlight and nothing else? Pretty cheap most of the time.
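The zone-driving idea can be sketched in a few lines. This is a toy model (the zone grid and max-luminance drive policy are assumptions; real controller firmware is proprietary), but it shows why zone count matters:

```python
# Toy zone-based local dimming: each backlight zone is driven to the
# brightest pixel it contains, which is why fewer, larger zones produce
# halos around small highlights.

def zone_backlight(frame, zones_x, zones_y):
    """frame: 2D list of pixel luminances in 0.0-1.0.
    Returns a zones_y x zones_x grid of backlight drive levels."""
    h, w = len(frame), len(frame[0])
    grid = [[0.0] * zones_x for _ in range(zones_y)]
    for y in range(h):
        for x in range(w):
            zy = min(y * zones_y // h, zones_y - 1)
            zx = min(x * zones_x // w, zones_x - 1)
            grid[zy][zx] = max(grid[zy][zx], frame[y][x])
    return grid

# One bright pixel on an otherwise black frame: only its zone lights up,
# so the other 15 zones can stay fully dark.
frame = [[0.0] * 8 for _ in range(8)]
frame[0][0] = 1.0
levels = zone_backlight(frame, 4, 4)
```

Every extra zone is another LED plus another drive channel on the controller, which is where the BOM cost scales.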


----------



## ToTheSun!

Quote:


> Originally Posted by *EniGma1987*
> 
> The top displays have 384 zones now, which means chips connected to the main processor to drive 384 completely different and separate channels, and 384 LEDs. That's a lot of money on the BOM.


I suppose. I just don't know if that's still more expensive than 8+ million individual OLEDs.


----------



## t1337dude

Quote:


> Originally Posted by *boredgunner*
> 
> Never say never. Since OLED costs less to make than FALD LCD (and maybe LCD in general, if not yet then soon), that strongly suggests OLED will actually phase out LCD. Less costly and better in pretty much every way.
> 
> There are other 4k 144 Hz monitors on the horizon, the Acer XB272-HDR and an AOC 32" variant.


I love people's obsession with the difference between a monitor and a TV. Monitors are obviously more suitable for the common PC situation but in my mind that's more for desk-work. As far as gaming goes - I just can't figure out the affinity towards hunching over a desk, looking at a small screen. I'd rather look at a big beautiful 65" OLED screen to play just about any game than some dinky 27" monitor or some awkward ultra-wide setup. If I bring guests over, they're always more impressed with a couch-TV setup. 144Hz + GSync looks nice...but compared to a giant screen and infinite contrast, it's a relatively weak draw.

People restricting themselves to monitors are going to be ages behind in IQ for quite some time.


----------



## boredgunner

Quote:


> Originally Posted by *t1337dude*
> 
> I love people's obsession with the difference between a monitor and a TV. Monitors are obviously more suitable for the common PC situation but in my mind that's more for desk-work. As far as gaming goes - I just can't figure out the affinity towards hunching over a desk, looking at a small screen. I'd rather look at a big beautiful 65" OLED screen to play just about any game than some dinky 27" monitor or some awkward ultra-wide setup. If I bring guests over, they're always more impressed with a couch-TV setup. 144Hz + GSync looks nice...but compared to a giant screen and infinite contrast, it's a relatively weak draw.
> 
> People restricting themselves to monitors are going to be ages behind in IQ for quite some time


The primary factors for me preferring a monitor are input lag (TVs almost always have too much), PPI (I'm going to need at least 5k resolution before I have no issues with PPI at 50" and above), variable refresh rate and high refresh rate. I can't stand tearing and high refresh rate is a huge bonus for me (although not as important as image quality, generally speaking).

But you're right that monitors will remain far behind in picture quality until OLED finally surfaces in the consumer monitor industry. Hopefully OLED monitors do away with matte coatings...


----------



## MattEnth

Is there any timeframe for release right now?


----------



## aberrero

Quote:


> Originally Posted by *boredgunner*
> 
> It was officially canned in 2016.


It was actually released today.


----------



## l88bastar

Quote:


> Originally Posted by *aberrero*
> 
> It was actually released today.


Indeed...now all the picky pants who have been complaining about IPS & the smallish 27" and wanted the "cheaper to produce" OLED tech can opt out of this puny 27" FALD and go with the more economical $3,500 Dell 30" 60Hz OLED


----------



## cskippy

Oh man, I thought you meant the Asus! I went looking all around for 10 minutes on the internet and was like *** is this guy smoking...

Dell....hahahahaha yeah...I'll pass.


----------



## aberrero

Quote:


> Originally Posted by *l88bastar*
> 
> Indeed...now all the picky pants who have been complaining about IPS & the smallish 27" and wanted the "cheaper to produce" OLED tech can opt out of this puny 27" FALD and go with the more economical $3,500 Dell 30" 60Hz OLED


I bought my 55" OLED for $1400 though. Prices will come down. It just needs a bit more time.


----------



## Aristotelian

Quote:


> Originally Posted by *l88bastar*
> 
> Indeed...now all the picky pants who have been complaining about IPS & the smallish 27" and wanted the "cheaper to produce" OLED tech can opt out of this puny 27" FALD and go with the more economical $3,500 Dell 30" 60Hz OLED


This, exactly this. At 60Hz the Dell is aimed at productivity tasks, whereas I'm hopeful this Asus will be a quality gaming monitor, because I do enjoy a game or two....


----------



## l88bastar

Quote:


> Originally Posted by *aberrero*
> 
> I bought my 55" OLED for $1400 though. Prices will come down. It just needs a bit more time.


Ja Ja....I have two 55" OLED C6 and they are amazing


----------



## aberrero

Quote:


> Originally Posted by *l88bastar*
> 
> Ja Ja....I have two 55" OLED C6 and they are amazing


Honestly, I think the 3D on OLED is even more impressive than 4k. Have you watched Titanic or Avatar in 3D? It's just mind bending. Looking forward to getting Rogue One.


----------



## animeowns

Quote:


> Originally Posted by *aberrero*
> 
> It was actually released today.


You mean the Asus PG27UQ? Where is it, so I can purchase it now?


----------



## boredgunner

Quote:


> Originally Posted by *animeowns*
> 
> you mean the Asus PG27UQ where is it so I can purchase it now ?


He means the Dell UP3017Q. We have a thread on it on the Monitors section. 30" 4k 60 Hz OLED monitor.


----------



## animeowns

Quote:


> Originally Posted by *boredgunner*
> 
> He means the Dell UP3017Q. We have a thread on it on the Monitors section. 30" 4k 60 Hz OLED monitor.


Oh, it's not 120Hz? Not worth the price then.


----------



## Clukos

Quote:


> Originally Posted by *animeowns*
> 
> oh its not 120hz not worth the price


That monitor is not aimed at gamers... It's an absolute professional monitor: it covers 100% of Adobe RGB, 97.8% of DCI-P3, and 85.8% of the Rec. 2020 color space. It probably costs less to manufacture than the Asus too. Plus you can get it today, not sometime in Q3.


----------



## Rmerwede

Serious question: what is supposed to power games at 4K @ 144Hz?

GPUs used to be held back by the lack of decent displays; now it is the opposite.


----------



## boredgunner

Quote:


> Originally Posted by *Rmerwede*
> 
> Serious question: What is supposed to power games to 4K @ 144hz?
> 
> GPUs used to be held back by the lack of decent displays, now it is the opposite.


GTX 1080 Ti SLI should yield you at least 100 FPS average in modern AAA games that support SLI. It's not 144 FPS, but with G-SYNC you'll get a tear free experience at high frame rates nonetheless, plus full array local dimming with presumably > 20,000:1 contrast per dimming zone and HDR when applicable.

And of course, many older games will run at 144 FPS on just one GTX 1080 Ti or even just a GTX 1080. Lastly, this monitor's release date might not be far off from Volta.


----------



## animeowns

Quote:


> Originally Posted by *boredgunner*
> 
> GTX 1080 Ti SLI should yield you at least 100 FPS average in modern AAA games that support SLI. It's not 144 FPS, but with G-SYNC you'll get a tear free experience at high frame rates nonetheless, plus full array local dimming with presumably > 20,000:1 contrast per dimming zone and HDR when applicable.
> 
> And of course, many older games will run at 144 FPS on just one GTX 1080 Ti or even just a GTX 1080. Lastly, this monitor's release date might not be far off from Volta.


A 1080 Ti in 3-way or 4-way SLI can get you 80+ FPS at 8K in BF1


----------



## Astreon

Quote:


> Originally Posted by *boredgunner*
> 
> GTX 1080 Ti SLI should yield you at least 100 FPS average in modern AAA games that support SLI. It's not 144 FPS, but with G-SYNC you'll get a tear free experience at high frame rates nonetheless, plus full array local dimming with presumably > 20,000:1 contrast per dimming zone and HDR when applicable.
> 
> And of course, many older games will run at 144 FPS on just one GTX 1080 Ti or even just a GTX 1080. Lastly, this monitor's release date might not be far off from Volta.


Also, if the monitor scales without interpolation, you can play lossless 1080p at 144Hz, although 1080p at 27 inches isn't very pretty.


----------



## FearlessBelgian

Is the 20,000:1 contrast only in HDR, or in SDR too?

This contrast figure is rather strange, because even full-LED televisions with IPS panels didn't manage to exceed a 1000:1 ratio...

Is this some unannounced new technology, or is it marketing bull****?


----------



## Asmodian

That is due to the local dimming; if you were to measure the contrast without dimming, it would probably be around 1000:1, as you would expect from a decent IPS panel.


----------



## guttheslayer

So this confirms the PG27UQ supports Display Stream Compression (DSC), right?

Since 10-bit color at 4K 144Hz HDR is only achievable over DP 1.4 with DSC?
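The bandwidth arithmetic behind that claim can be sketched quickly. The figures below are the commonly cited DisplayPort HBR3 numbers, not anything taken from the monitor's spec sheet:

```python
# Back-of-the-envelope DP 1.4 bandwidth check.

def video_gbps(w, h, hz, bits_per_channel, channels=3):
    """Uncompressed pixel data rate in Gbit/s (blanking overhead ignored)."""
    return w * h * hz * bits_per_channel * channels / 1e9

# HBR3 is 32.4 Gbit/s raw; 8b/10b encoding leaves ~25.92 Gbit/s of payload.
DP14_PAYLOAD_GBPS = 25.92

rate_10bit_144 = video_gbps(3840, 2160, 144, 10)  # ~35.8 Gbit/s: needs compression
rate_8bit_120 = video_gbps(3840, 2160, 120, 8)    # ~23.9 Gbit/s: fits uncompressed
```

So 10-bit 4K at 144Hz overshoots the link by roughly 10 Gbit/s, while dropping to 8-bit 120Hz squeezes in uncompressed, which is why the lower mode is interesting for comparisons.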


----------



## boredgunner

Quote:


> Originally Posted by *guttheslayer*
> 
> So this confirm PG27UQ support Display Stream Compression (DSC) right?
> 
> Since 10 bits colour 4K 144Hz HDR is only achievable with DSC in DP 1.4?


Correct. I wonder just how lossless this "lossless compression" really is, and I wonder if it shuts off if you switch to 8-bit 120 Hz and below, so that we can do direct comparisons.


----------



## bee144

Did anyone see Dell's 27" 4K HDR monitor? $1,999, and it doesn't have dynamic zone backlighting or G-SYNC; only 60Hz, only 97% DCI-P3, no quantum dot.

Makes you wonder about the pricing.


----------



## CallsignVega

97% DCI-P3 is actually really good. I am about 90% certain that this is the same 27" 4K IPS AUO panel used in the upcoming 144 Hz 4K FALD displays. To hit 1000 nits I am sure it has FALD, it just isn't advertised. Not to mention the price implies that it has FALD. The color capabilities also imply quantum dot, again not advertised. Dell doesn't advertise every single technical detail in their press releases.


----------



## djfunz

Quote:


> Originally Posted by *bee144*
> 
> Did anyone see Dell's 27" 4K HDR monitor? 1,999 USD and doesn't have dynamic zone backlight, g-sync, only 60 hz, only 97% DCI-P3, no quantum dot.
> 
> Makes you wonder about the pricing.


I was wondering about this too. It could just be an early-adopter price since there's no competition yet. The Acer and ASUS monitors could still be months away from release. Nobody knows for sure.


----------



## Jbravo33

60Hz for $2K? I'm hoping it's more than that.


----------



## cskippy

Just get the Asus or Acer 144Hz HDR Gsync instead.


----------



## ih2try

My ideal monitor would be 27", WQHD, HDR, G-SYNC, 165Hz, IPS, and cost $699.
If it's 4K it should be at least 32", or even 40"!


----------



## Kommanche

Quote:


> Originally Posted by *ih2try*
> 
> my ideal monitor would be a 27", WQHD, HDR, G-Sync, 165Hz, IPS and cost $699.
> *if it's 4k it should be at least 32", or even 40*" !


Disagree! Moving from 1440p 27" to 4K at 28" the difference in IQ is massive, and there's definitely scope for more pixels at the same screen size.


----------



## animeowns

Quote:


> Originally Posted by *Kommanche*
> 
> Disagree! Moving from 1440p 27" to 4K at 28" the difference in IQ is massive, and there's definitely scope for more pixels at the same screen size.


Have to agree with you there, but moving to 5K I did not see as much of a difference since the screen is so small. The perfect size for a 4K-and-up screen would be 40 inches or better, but keep the input lag low. That would be the ultimate screen: an OLED at 144Hz, 4K/5K/8K, with G-SYNC, HDR, and quantum dot.


----------



## SightUp

I am confused about something. People are saying this monitor will not have backlight bleed. How so? It's an IPS monitor, right? Don't all IPS monitors have backlight bleed by default?


----------



## t1337dude

Quote:


> Originally Posted by *Kommanche*
> 
> Disagree! Moving from 1440p 27" to 4K at 28" the difference in IQ is massive, and there's definitely scope for more pixels at the same screen size.


I really think size > PPI in terms of immersion. If high PPI were more immersive, nobody would buy big-screen TVs.

PPI is nice for IQ, but in the grand scheme of things it's more the icing on the cake than the cake itself.

I understand it's a preference, but I almost get the sense that that preference is dictated by a lack of space or money rather than a genuine belief that more PPI offers a better "experience" than simply having a larger screen.


----------



## l88bastar

Quote:


> Originally Posted by *t1337dude*
> 
> I really think size > PPI in terms of immersion. If high PPI was more immersive, nobody would buy big screen TVs
> 
> 
> 
> 
> 
> 
> 
> PPI is nice for IQ but it's more icing on the cake in the grand scope of things, rather than the cake itself.
> 
> I understand it's a preference but I almost get a sense that preference is dictated by a lack of space or money rather than some genuine belief that more PPI offers a better "experience" than simply having a larger screen.


I have a 55" OLED and the immersion is incredible sitting about 4' away. I prefer big displays, but at this point I will take whatever 4K 120+ is out there. Display manufacturers have to play it safe and go for the biggest market segment, and 27" is the safest and most tried-and-true modern display size for gamers: not too big and not too small.

Hopefully as the market gets saturated with 27"s we will see them mix it up with 32"s, and maybe a really good high-refresh OLED to boot.


----------



## Kommanche

Quote:


> Originally Posted by *l88bastar*
> 
> I have a 55" OLED and the immersion is incredible sitting about 4' away. I prefer big displays, but at this point I will take whatever 4k 120+ is out there. Display manufactures have to play it safe and go for the biggest market segment and 27" is the safest and most tried and true modern display size for gamers as its not to big and not to small.
> 
> Hopefully as the market gets saturated with 27"s we will see them mix it up with 32"s and maybe a real good high refresh OLED ta boot.


To be honest with you, I think OLED will probably be skipped in favour of inferior, but still HDR-capable, LCD panels. The fact that Dell scrapped their OLED monitor, and that Samsung isn't manufacturing OLED for anything other than mobile, says a lot.


----------



## profundido

Quote:


> Originally Posted by *t1337dude*
> 
> I really think size > PPI in terms of immersion. If high PPI was more immersive, nobody would buy big screen TVs
> 
> 
> 
> 
> 
> 
> 
> PPI is nice for IQ but it's more icing on the cake in the grand scope of things, rather than the cake itself.
> 
> I understand it's a preference but I almost get a sense that preference is dictated by a lack of space or money rather than some genuine belief that more PPI offers a better "experience" than simply having a larger screen.


I used to share your opinion on this, saying that PPI doesn't matter that much, but after experiencing it first hand in practice I now realize Kommanche is right. PPI DOES matter, up to its saturation point, which varies from human to human according to your eyesight and your viewing distance. To clarify, I'll give you simple real-world numbers and first-hand experiences:

First of all, I play about the highest-res content available nowadays (without that, of course, you wouldn't notice the difference in the first place). That means Rise of the Tomb Raider, The Witcher 3, ESO, ... all at 4K with 4K textures.

First I played on a 1440p G-SYNC monitor and I thought it was really beautiful, the best there was at that time. Then I went up to the brand-new Asus PG27AQ, which is 4K@60Hz at 27", and I immediately noticed a whole new level of immersion that I had never experienced before, because the detail had become so realistic that I could no longer distinguish any pixels or imperfections at my normal viewing distance of around 50cm, whereas at 1440p @ 27" I could still differentiate pixels and imperfections. This experience made me realize that I had just passed my personal PPI saturation point for my sitting distance, for 27", and for my personal eyesight. I remember that close-ups of Lara's face in-game looked as realistic and detailed as if I were seeing her in real life rather than on a computer screen. Talk about immersion...

Then I thought bigger would be better, so I moved on to a 32" 4K monitor, and immediately I noticed a loss in immersion and realism, as once more I could see imperfections and pixels because the PPI had gone down at the same viewing distance. In addition I had to start actively moving my head/neck, which was never necessary with 27". After 2 hours it was bad enough for me to return the monitor and go back to my 4K 27".

So when I first saw the announcement of the new Asus/Acer 4K 27" 120+Hz monitors last year, my first reaction was: why 27", why not 32"? But after my practical experience and tests I now understand what the engineers already knew when designing those monitors, and it makes perfect sense: ultra-realism by reaching the PPI saturation point per viewing distance, but at higher frame rates and with better color reproduction. If they succeed, I don't think 2D can get any better, tbh.
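The "saturation point" idea above can be roughly quantified as angular resolution, i.e. the angle one pixel subtends at the eye. A sketch, using the sizes and the 50 cm distance discussed in this post (the ~1 arcminute figure is a common rule-of-thumb proxy for 20/20 acuity, not a hard threshold):

```python
import math

def ppi(diagonal_in, w_px, h_px):
    """Pixels per inch for a flat panel."""
    return math.hypot(w_px, h_px) / diagonal_in

def arcmin_per_pixel(ppi_val, distance_cm):
    """Angle one pixel subtends at the given viewing distance, in arcminutes."""
    pitch_in = 1.0 / ppi_val
    distance_in = distance_cm / 2.54
    return math.degrees(math.atan(pitch_in / distance_in)) * 60

ppi_27_4k = ppi(27, 3840, 2160)  # ~163 PPI
ppi_32_4k = ppi(32, 3840, 2160)  # ~138 PPI
# At 50 cm, each pixel of the 32" panel subtends a noticeably larger angle,
# which matches the "I could see pixels again" experience described above.
```

The same panel resolution therefore sits on different sides of the acuity threshold depending on size and seating distance.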


----------



## Astreon

Size is the new fad, I guess. People trade the ability to see the whole screen for "immersion". Well, if not seeing 30-40% of the screen is immersive, then count me out. To me it's annoying and tiresome: it makes me turn my head more, and it feels overwhelming.

27 inch is perfect for about an 80 cm distance, going especially by this:

https://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance

Movies are more immersive than games, and the "biggest" recommendation is a distance equal to 1.2 times the diagonal, which gives "pitifully small" screens by the standards of the bigger=better majority here. Just food for thought:
Quote:


> THX recommends that the "best seat-to-screen distance" is one where the view angle approximates 40 degrees,[25] (the actual angle is 40.04 degrees).[3] Their recommendation was originally presented at the 2006 CES show, and was stated as being the theoretical maximum horizontal view angle, based on average human vision.[26] In the opinion of THX, the location where the display is viewed at a 40-degree view angle provides the most "immersive cinematic experience",[25] all else being equal. For consumer application of their recommendations, THX recommends dividing the diagonal screen measurement by .84 to calculate the optimum viewing distance, for a 1080p resolution. This equates to multiplying the diagonal measurement by about 1.2.[25]


What exactly is wrong with that reasoning that makes you bigger=better guys go for such extremely oversized screens in the first place?

No offense, just curious. Yes, I know it's for 1080p, so take it with a grain of salt.
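The quoted THX rule works out as simple arithmetic; a sketch assuming a 16:9 screen (the divide-by-0.84 rule is stated for 1080p):

```python
import math

def thx_distance(diagonal_in):
    """Seat-to-screen distance (inches) per the THX divide-by-0.84 rule."""
    return diagonal_in / 0.84

def view_angle_deg(diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal view angle of a 16:9 screen at the given distance."""
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    return 2 * math.degrees(math.atan(width / 2 / distance_in))

d27 = thx_distance(27)  # ~32 inches for a 27" monitor
d55 = thx_distance(55)  # ~65 inches for a 55" TV
```

Sitting at the computed distance, the 27" screen does fill roughly the 40-degree angle THX calls most immersive, which is the basis of the "27 inch at 80 cm is plenty" argument.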


----------



## ToTheSun!

Quote:


> Originally Posted by *Kommanche*
> 
> The fact the Dell scrapped their OLED monitor


Except they didn't. It's available now.


----------



## animeowns

Quote:


> Originally Posted by *l88bastar*
> 
> I have a 55" OLED and the immersion is incredible sitting about 4' away. I prefer big displays, but at this point I will take whatever 4k 120+ is out there. Display manufactures have to play it safe and go for the biggest market segment and 27" is the safest and most tried and true modern display size for gamers as its not to big and not to small.
> 
> Hopefully as the market gets saturated with 27"s we will see them mix it up with 32"s and maybe a real good high refresh OLED ta boot.


Agree with you there. I am still shocked at how the LG OLED has none of the backlight bleed or picture quality issues of most IPS screens I have used in the past. The 4K 144Hz Asus is what I am looking at for my second display. I compared the 5K UP2715K to the LG OLED and the OLED wins hands down; I would like to see how 8K looks in person though.


----------



## sblantipodi

Quote:


> Originally Posted by *Kommanche*
> 
> To be honest with you, I think OLED will probably be skipped in favour of inferior, but still HDR capable, LCD panels. The fact the Dell scrapped their OLED monitor, and Samsung aren't manufacturing OLED for anything other than mobile says a lot.


Who wants to spend a lot of money on a display that changes its characteristics within a couple of years, has burn-in problems, and ages faster than what new technology can offer?

OLED has proven to be a stupid tech. Samsung keeps trying to create OLED displays without burn-in problems, but hasn't succeeded yet.

IPS HDR is the future.


----------



## boredgunner

Quote:


> Originally Posted by *sblantipodi*
> 
> who want to spend a lot of money for a display that changes its characteristic in a couple of years,
> that have burn in problems and that ages faster than what new technology can offer.
> 
> OLED is proved to be a stupid tech,
> Samsung still continue to create OLES displays without burn in problems, but it doesn't succeded yet.
> 
> IPS HDR is the feature.


Recent OLED panels like LG's 2016 models (ignoring 2017 because it's too new to judge) have not been shown to have any of those problems. OLED is proven to be superior tech. I suggest you pull your head out of the dirt.


----------



## Astreon

Either way, AUO sells their best panels to Acer, not to Asus, so I'd probably go for X27 instead of PG27UQ.


----------



## sblantipodi

Quote:


> Originally Posted by *boredgunner*
> 
> Recent OLED panels like LG's 2016 models (ignoring 2017 because it's too new to judge) have not been shown to have any of those problems. OLED is proven to be superior tech. I suggest you pull your head out of the dirt.


Sure, they are too young; wait until 2018 and you'll see some 2016 panels showing burn-in problems.


----------



## mmms

After seeing the Sony Z9D with 600 dimming zones at 65" and the Sony X940E with 250 dimming zones at 75", I think 384 dimming zones will be enough for this small 27" monitor, even if the panel is IPS and not VA.
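Normalizing those zone counts by screen area makes the comparison clearer; a quick sketch assuming 16:9 panels (the zone counts are the ones mentioned above):

```python
import math

def screen_area_sq_in(diagonal_in, aspect=16 / 9):
    """Area of a flat 16:9 screen from its diagonal, in square inches."""
    height = diagonal_in / math.hypot(aspect, 1)
    return height * height * aspect

def zones_per_sq_in(zones, diagonal_in):
    return zones / screen_area_sq_in(diagonal_in)

density_27 = zones_per_sq_in(384, 27)  # PG27UQ: ~1.23 zones/sq in
density_65 = zones_per_sq_in(600, 65)  # Sony Z9D: ~0.33
density_75 = zones_per_sq_in(250, 75)  # Sony X940E: ~0.10
```

So the 27" monitor's zones are several times denser than either TV's, which supports the point that 384 zones is a lot at this size.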


----------



## Astreon

How's the haloing effect with such a large number of dimming zones?


----------



## mmms

Quote:


> Originally Posted by *Astreon*
> 
> how's the haloing effect for such a large number of dimming zones?


Don't worry about the haloing effect, because FALD has developed a lot since then. As for the best number of zones: more is better.
You can see some pictures from the Sony XBR75X940E, with almost 250 dimming zones, here:

[images]

So I think 384 dimming zones will be enough for a 27" IPS panel, in addition to giving us great black levels.


----------



## juano

Slight update (or continued confirmation) on release date from nVidia today. "The Acer Predator X27 (left) and the ASUS ROG Swift PG27UQ (right) are targeted for availability later this summer."


----------



## FearlessBelgian

Quote:


> Originally Posted by *mmms*
> 
> Don't worry about the haloing effect, because FALD has developed a lot since then. As for the best number of zones: more is better.
> You can see some pictures from the Sony XBR75X940E, with almost 250 dimming zones, here:
> 
> [images]
> 
> So I think 384 dimming zones will be enough for a 27" IPS panel, in addition to giving us great black levels.


Look at that:

[image]

In SDR:

[image]

No halo, you said?


----------



## Astreon

Quote:


> Originally Posted by *FearlessBelgian*
> 
> Look at that :


This is absolutely DREADFUL.


----------



## ILoveHighDPI

Ouch. HDR should be left to OLED.
If LCD can "try" to do HDR on a per-pixel basis like OLED, even if the contrast ratio isn't amazing, that would probably be good enough for me.
Normal LCD backlight bleed with the brightness turned up isn't nearly that bad.


----------



## boredgunner

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Ouch. HDR should be left to OLED.
> If LCD can "try" do to HDR on a per-pixel basis like OLED, even if the contrast ratio isn't amazing, that would probably be good enough for me.
> Normal LCD backlight bleed with the brightness turned up isn't nearly that bad.


It'd result in amazing contrast if they could do such a thing. Panasonic had prototype IPS technology that would do just that, but it probably won't take off which is fine by me since OLED and similar (e.g. microLED) are far better anyway.


----------



## ToTheSun!

Quote:


> Originally Posted by *boredgunner*
> 
> It'd result in amazing contrast if they could do such a thing. Panasonic had prototype IPS technology that would do just that, but it probably won't take off which is fine by me since OLED and similar (e.g. microLED) are far better anyway.


I don't know, man. Competing technologies are always good. Plus, I'm POORtuguese. If they came out with a 4K120 super-IPS monitor with 10K+ contrast for half the price of future 4K120 OLED, I'd totally be on board! At least no APL limitations.


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> I don't know, man. Competing technologies is always good. Plus, I'm POORtuguese. If they came out with a 4K120 super-IPS monitor with 10K+ contrast for half the price of future 4K120 OLED, I'd totally be on board! At least no APL limitations.


I do hope for competing technologies, but mostly OLED and different microLED implementations including Samsung's prototype QLED (not their 2017 "QLED" crap).


----------



## Egzi

No news about the 27 1440p version release date yet?


----------



## boredgunner

Quote:


> Originally Posted by *Egzi*
> 
> No news about the 27 1440p version release date yet?


I am guessing AU Optronics is done with 1440p, save for competitive gaming purposes (I expect 1440p 240 Hz down the line). The closest thing is the Samsung CHG70 which has both 27" and 31.5" 1440p models, 144 Hz refresh rate, quantum dot, HDR, blur reduction, and there will be FreeSync 2 and G-SYNC versions. However it uses a VA panel and I don't believe they have full array local dimming.

1440p was just an intermediary step between 1080p and 4k, since 1440p is something exclusive to PC monitors rather than an industry display standard.


----------



## animeowns

Quote:


> Originally Posted by *juano*
> 
> Slight update (or continued confirmation) on release date from nVidia today. "The Acer Predator X27 (left) and the ASUS ROG Swift PG27UQ (right) are targeted for availability later this summer."


With how long it's taking, I might just buy the Dell 8K panel and call it a day.


----------



## Egzi

Quote:


> Originally Posted by *boredgunner*
> 
> I am guessing AU Optronics is done with 1440p, save for competitive gaming purposes (I expect 1440p 240 Hz down the line). The closest thing is the Samsung CHG70 which has both 27" and 31.5" 1440p models, 144 Hz refresh rate, quantum dot, HDR, blur reduction, and there will be FreeSync 2 and G-SYNC versions. However it uses a VA panel and I don't believe they have full array local dimming.
> 
> 1440p was just an intermediary step between 1080p and 4k, since 1440p is something exclusive to PC monitors rather than an industry display standard.


I was thinking about this one http://www.144hzmonitors.com/reviews/asus-rog-pg27vq-and-xg27vq-preview-the-latest-asus-curved-gaming-monitors/

So I hope it's coming soon. Is it true that it's gonna be TN? I read somewhere it was gonna be IPS.


----------



## mmms

Quote:


> Originally Posted by *boredgunner*
> 
> The closest thing is the Samsung CHG70 which has both 27" and 31.5" 1440p models, 144 Hz refresh rate, quantum dot, HDR, blur reduction, and there will be FreeSync 2 and G-SYNC versions. However it uses a VA panel and I don't believe they have full array local dimming.


No one who tried HDR at 600 nits was happy with it in the reviews I've seen. So $600 is a good price for these specs, but it isn't true HDR.
If you really want HDR at 600 nits you need very high contrast, such as OLED. In my opinion, to notice HDR on an LED-backlit LCD you need at least 1000 nits.


----------



## Egzi

Is the 2000$ price point confirmed for this monitor?


----------



## nodicaL

Quote:


> Originally Posted by *Egzi*
> 
> Is the 2000$ price point confirmed for this monitor?


There have been ASUS reps citing the $2,000 USD price on video.


----------



## l88bastar

Quote:


> Originally Posted by *nodicaL*
> 
> There have been ASUS reps citing the $2000 USD on video.


It's really $4,000 'cause you have to buy 2


----------



## ToTheSun!

Quote:


> Originally Posted by *l88bastar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nodicaL*
> 
> There have been ASUS reps citing the $2000 USD on video.
> 
> 
> 
> Its really $4,000 cause you have to buy 2

What are we, animals?

We have to buy 3 for surround.


----------



## mouacyk

Quote:


> Originally Posted by *ToTheSun!*
> 
> What are we, animals?
> 
> We have to buy 3 for surround.


My towel's in. Gonna have to do more laundry now.


----------



## Lass3

Quote:


> Originally Posted by *ToTheSun!*
> 
> Except they didn't. It's available now.


Yes, but at 60 Hz instead of the promised 120 Hz. I'd never pay $3,500 for a 60 Hz PC monitor.


----------



## ToTheSun!

Quote:


> Originally Posted by *Lass3*
> 
> Yes but in 60 Hz instead of the promised 120 Hz. I'd never pay $3500 for a PC monitor with 60 Hz..


Would you pay $3500 for a PC monitor with 120 Hz?


----------



## Asmodian

Quote:


> Originally Posted by *ToTheSun!*
> 
> Would you pay $3500 for a PC monitor with 120 Hz?


For a 30" 4K 120Hz OLED? Instantly.


----------



## Egzi

So guys, will the PG27UQ use a glossy panel or a matte one?


----------



## boredgunner

Quote:


> Originally Posted by *Egzi*
> 
> So guys, will the 27uq use a glossy panel or mate?


Was matte when demoed, will likely remain that way. Glossy panels are never used in mainstream gaming monitors anymore.


----------



## thebski

Quote:


> Originally Posted by *boredgunner*
> 
> Was matte when demoed, will likely remain that way. Glossy panels are never used in mainstream gaming monitors anymore.


That's too bad, too. Glossy looks so much better. It would be nice if they would offer them both ways.


----------



## djfunz

Quote:


> Originally Posted by *Asmodian*
> 
> For a 30" 4K 120Hz OLED? Instantly.


Same


----------



## boredgunner

Quote:


> Originally Posted by *thebski*
> 
> That's too bad, too. Glossy looks so much better. It would be nice if they would offer them both ways.


For $2k we should get the best of both worlds: AR-treated glass. Otherwise I agree with you; glossy is appealing for high-end entertainment displays since it looks much better, and I figure many of us are willing to adjust our lighting to suit a glossy display (as I do, just using a bias light when gaming).


----------



## Egzi

Quote:


> Originally Posted by *boredgunner*
> 
> Was matte when demoed, will likely remain that way. Glossy panels are never used in mainstream gaming monitors anymore.


Ain't the Asus PG279Q glossy? I know their TN panels are all matte; had to sell my last one. It used a way too strong AG coating.


----------



## boredgunner

Quote:


> Originally Posted by *Egzi*
> 
> Ain't the Asus PG279Q glossy? I know their TN panels are all matte; had to sell my last one. It used a way too strong AG coating.


It is "light AG" like almost all recent IPS monitors. Glossy if looked at a steep horizontal angle, matte when viewed directly.


----------



## jologskyblues

I hate seeing my reflection on glossy screens so I prefer matte screens.


----------



## Lass3

Quote:


> Originally Posted by *ToTheSun!*
> 
> Would you pay $3500 for a PC monitor with 120 Hz?


No, I'd probably wait till it's below 3000 and the first batches are gone. QC on first batches is often bad. Rushed.

And it needs to be proper OLED or mLED before I'm willing to pay that amount, obviously.

I'm not going 60 Hz ever again on PC. It has always been a compromise. I want at least 100 Hz.


----------



## sblantipodi

Quote:


> Originally Posted by *Asmodian*
> 
> For a 30" 4K 120Hz OLED? Instantly.


I'm sorry, but coming from someone with a mainstream rig and a cheap Acer monitor... that's not very credible.


----------



## bigboy678

Quote:


> Originally Posted by *boredgunner*
> 
> I am guessing AU Optronics is done with 1440p, save for competitive gaming purposes (I expect 1440p 240 Hz down the line). The closest thing is the Samsung CHG70 which has both 27" and 31.5" 1440p models, 144 Hz refresh rate, quantum dot, HDR, blur reduction, and there will be FreeSync 2 and G-SYNC versions. However it uses a VA panel and I don't believe they have full array local dimming.
> 
> 1440p was just an intermediary step between 1080p and 4k, since 1440p is something exclusive to PC monitors rather than an industry display standard.


You mention that the Samsung CHG70 has a GSYNC version but the only one i have been able to see are the freesync 2 ones. Are they releasing a GSYNC version later on?


----------



## boredgunner

Quote:


> Originally Posted by *bigboy678*
> 
> You mention that the Samsung CHG70 has a GSYNC version but the only one i have been able to see are the freesync 2 ones. Are they releasing a GSYNC version later on?


It might just be a rumor for now about the CHG70, based on Samsung announcing G-SYNC versions of some of their other gaming monitors.


----------



## pez

Quote:


> Originally Posted by *sblantipodi*
> 
> I'm sorry but said from someone with a mainstream rig and a cheap Acer monitor...
> ... Is not very credible ..


TIL i7s and Titan GPUs are mainstream. It's not like people buy $4k OLED TVs only to run them off devices that cost pennies on the dollar in comparison.


----------



## Egzi

Any release date news on the 1440p PG27VQ? Does anyone know if it will use a light matte coating?


----------



## l88bastar

Quote:


> Originally Posted by *Egzi*
> 
> Any release date news on the 1440p PG27VQ? Does anyone know if it will use a light matte coating?


Please don't foul this thread with 1440p.... that is really gross, sir!


----------



## Jbravo33

Quote:


> Originally Posted by *Egzi*
> 
> Any release date news on the 1440p PG27VQ? Does anyone know if it will use a light matte coating?


I google pretty much every day like a loser; still haven't found anything.


----------



## animeowns

Quote:


> Originally Posted by *jologskyblues*
> 
> I hate seeing my reflection on glossy screens so I prefer matte screens.


According to the ASUS rep I talked with, the PG27UQ should be released sometime between August and September 2017.


----------



## profundido

You know, I've never anticipated a hardware upgrade more than this monitor. My hopes are so high that I'm almost certain to be disappointed.

Here I am, still impatiently sitting on the beautiful oversized rig I built last year, which currently drives the predecessor 4K monitor from Asus (PG27AQ) without flinching through any game at max frames, completely underused resource-wise with all fans at an inaudible 600 RPM. I literally cannot wait to lift that 60 Hz barrier and finally see my system actually being used for once! PLEASE, Asus: make it fast and, even more, make it good!

Let the fans whistle and the water rumble with 4K 144Hz!


----------



## profundido

Quote:


> Originally Posted by *Jbravo33*
> 
> I google pretty much every day like a loser; still haven't found anything.


Thx, at least now I know I'm not the only -feeling silly- one...=P


----------



## animeowns

Quote:


> Originally Posted by *profundido*
> 
> Thx, at least now I know I'm not the only -feeling silly- one...=P


I got tired of waiting and ended up buying the Dell 8K monitor; it gives your video cards a real workout. I will still get the Asus 4K 144Hz if it ever releases.


----------



## StreekG

Lol 4K at 144 is going to take some serious horsepower.


----------



## boredgunner

Quote:


> Originally Posted by *StreekG*
> 
> Lol 4K at 144 is going to take some serious horsepower.


Use G-SYNC in the meantime and enjoy your 50-70 FPS in modern poorly optimized rushed AAA games without any tearing. And for older masterpieces, 4k 100+ FPS will be no problem even for one GTX 1080 like mine, and with ULMB it will be visual bliss.


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> Use G-SYNC in the meantime and enjoy your 50-70 FPS in modern poorly optimized rushed AAA games without any tearing. And for older masterpieces, 4k 100+ FPS will be no problem even for one GTX 1080 like mine, and with ULMB it will be visual bliss.


What older Masterpieces you playin playa?


----------



## istudy92

Quote:


> Originally Posted by *l88bastar*
> 
> What older Masterpieces you playin playa?


Minecraft is it not obvious?


----------



## boredgunner

Quote:


> Originally Posted by *istudy92*
> 
> Minecraft is it not obvious?


Even more obvious since most are listed in my signature.









As long as you plan to keep the monitor for more than one GPU upgrade, and as long as the monitor isn't crap, it is justifiable.


----------



## degenn

Bleh... 27" 4K panels are puny, what a waste. Patiently waiting for at least a 32" model.... hope I don't have to wait too long.


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> Even more obvious since most are listed in my signature.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As long as you plan to keep the monitor for more than one GPU upgrade, and as long as the monitor isn't crap, it is justifiable.


Wuaaahhh no Halo CE mon?

Quote:


> Originally Posted by *degenn*
> 
> Bleh... 27" 4K panels are puny, what a waste. Patiently waiting for at least a 32" model.... hope I don't have to wait too long.


Now ya know dem displaymakers be irie mon....when da las time dey give us wat we wan??? Nevahyamind!


----------



## III-Method-III

I hope you don't talk like that in real life to, you know, people with ears.


----------



## Lass3

Man.. Those bezels


----------



## boredgunner

Quote:


> Originally Posted by *degenn*
> 
> Bleh... 27" 4K panels are puny, what a waste. Patiently waiting for at least a 32" model.... hope I don't have to wait too long.


I would definitely like 32" or even 40" as well, but I don't have the patience.


----------



## Jbravo33

Quote:


> Originally Posted by *III-Method-III*
> 
> I hope you dont talk like that in real life to, you know, people with ears.


A Jamaican accent bothers your ears? I can only imagine what it does to your eyes... where's the thumbs-down button.

Anyways, still no info on this... sigh.


----------



## st0necold

Guys, is this out yet? Every time they announce it, it seems like I need to take out a life insurance policy in case I pass away from old age, so my future child could inherit it. Three years in a row 144Hz 4K is coming out, and still nothing.

Keep announcing stuff; try making it.


----------



## l88bastar

Every time I see this thread show up in the news box, I think today's the day........ nope.


----------



## Neo_Morpheus

Yeah, go for the bigger 4K screens, 32"+, and not the extra-wide ones.


----------



## Asmodian

I am not sure; it would be great if we could stop seeing pixels on our desktops, the way mobile has gone.

I plan to get this display; 4K at 32" or above still has visible pixels at my viewing distance.

Scaling is still bad enough that I would probably get a 32" or 40" option if it were available, but I am still interested in this one. It should be ideal for gaming, and hopefully everything I use will soon support scaling properly.


----------



## KGPrime

Wouldn't go above 27" for 4k personally or below 1440p at 24" nor 1080p above 22".


----------



## lonsor

4K, 10-bit, IPS, HDR, 240Hz or bust!


----------



## Asmodian

Quote:


> Originally Posted by *lonsor*
> 
> 4K, 10-bit, IPS, HDR, 240Hz or bust!


LOL.









But to be honest, 240 Hz is not THAT important and it is tricky to do at 4K, even with DP 1.4.









(600 Hz is required to be able to play any video standard without judder, 300 Hz isn't divisible by 24 fps and 240 Hz isn't divisible by 25 fps, so I am not sure settling for something below that is a good idea)
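The divisibility argument and the DP 1.4 ceiling can both be sanity-checked in a few lines. A rough sketch (Python 3.9+ for `math.lcm`; the ~25.9 Gbit/s DP 1.4 payload figure and a flat 24 bits per pixel are simplifying assumptions that ignore blanking overhead):

```python
import math

# A judder-free refresh rate must be an integer multiple of every
# common film/video frame rate.
common_rates = [24, 25, 30, 50, 60]
print(math.lcm(*common_rates))  # 600

def gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Uncompressed video bandwidth in Gbit/s (ignores blanking)."""
    return width * height * hz * bpp / 1e9

DP14_PAYLOAD = 25.92  # approx. usable Gbit/s over four HBR3 lanes
for hz in (144, 240):
    need = gbps(3840, 2160, hz)
    print(f"4K {hz} Hz: {need:.1f} Gbit/s (DP 1.4 carries ~{DP14_PAYLOAD})")
```

By this rough count even 4K 144 Hz at full 24-bit RGB slightly exceeds DP 1.4's payload, which is consistent with these panels falling back to chroma subsampling at the top of their refresh range; 240 Hz would need nearly double the link.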


----------



## boredgunner

Quote:


> Originally Posted by *lonsor*
> 
> 4K, 10-bit, IPS *OLED*, HDR, 240Hz or bust!


Fixed that for you. But while we're at it, why not 5k or 8k, 12-bit color depth, true QLED, and 600 Hz?
Quote:


> Originally Posted by *Asmodian*
> 
> LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But to be honest, 240 Hz is not THAT important and it is tricky to do at 4K, even with DP 1.4.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (600 Hz is required to be able to play any video standard without judder, 300 Hz isn't divisible by 24 fps and 240 Hz isn't divisible by 25 fps, so I am not sure settling for something below that is a good idea)


Yeah but this is just a monitor. Although extremely high refresh rate is better for other reasons like less input lag and less eye fatigue when using strobing. Speaking of which, it is confirmed that the Acer counterpart to this monitor has ULMB, although we don't know anything about it yet.

Also what content is 25 FPS?


----------



## Asmodian

Quote:


> Originally Posted by *boredgunner*
> 
> Also what content is 25 FPS?


It is the standard broadcast frame rate for historically PAL regions (Europe, China, Australia, eastern and southern Africa, Argentina, and Greenland).


----------



## Baasha

Quote:


> Originally Posted by *st0necold*
> 
> Guys is this out yet? Every time they announce it *it seems like I need to take out a life insurance policy incase I pass away from old age* so my future child could inherit it. 3 years in a row 144hz 4k is coming out and still nothing.
> 
> keep announcing stuff try making it.


LOOOOOOOOOOOOOL

Remember the release of the Asus RoG Swift PG278Q?

I wouldn't be surprised if this release cycle was worse. It was so difficult to get those monitors but I was able to cop 3 for some 144Hz G-Sync Surround madness!









If this monitor makes it to market this year, I'd be amazed.


----------



## animeowns

Quote:


> Originally Posted by *III-Method-III*
> 
> I hope you dont talk like that in real life to, you know, people with ears.


He's a giraffe that can actually talk to humans and type on a keyboard


----------



## Egzi

Quote:


> Originally Posted by *Baasha*
> 
> LOOOOOOOOOOOOOL
> 
> Remember the release of the Asus RoG Swift PG278Q?
> 
> I wouldn't be surprised if this release cycle was worse. It was so difficult to get those monitors but I was able to cop 3 for some 144Hz G-Sync Surround madness!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If this monitor makes it to market this year, I'd be amazed.


Acer Delays Predator X27 4k HDR GSync monitor to next year http://www.guru3d.com/news-story/acer-delays-predator-x27-4k-hdr-gsync-monitor-to-next-year.html

Maybe the same with Asus?


----------



## boredgunner

Surprise, surprise. I'm guessing this will apply to ASUS as well, since the delay presumably stems from AUO and both monitors are nearly identical.


----------



## TheWizardMan

Guess now I'll need to buy a 2K G-Sync monitor. Been waiting for this instead. Stupid.


----------



## l88bastar

Quote:


> Originally Posted by *TheWizardMan*
> 
> Guess now I'll need to buy a 2K gsync monitor. Been waiting for this instead. Stupid.


I love my C7 OLED....but was missing Gsync & high refresh badly.....so I picked up a PG278QR two weeks ago. I have tried literally every gaming display (from OCd catleaps to 165hz IPS to all of the ultrawides) and while this 8QR has its TN drawbacks....dang it is the absolute best gaming display I have ever owned. I actually like it better than my FW900....then again my FW900 is really old and on its last legs









Looks like I will have to be using it much longer than planned


----------



## TheWizardMan

Quote:


> Originally Posted by *l88bastar*
> 
> I love my C7 OLED....but was missing Gsync & high refresh badly.....so I picked up a PG278QR two weeks ago. I have tried literally every gaming display (from OCd catleaps to 165hz IPS to all of the ultrawides) and while this 8QR has its TN drawbacks....dang it is the absolute best gaming display I have ever owned. I actually like it better than my FW900....then again my FW900 is really old and on its last legs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like I will have to be using it much longer than planned


Was just about to pull the trigger on this panel. Did you have any dead pixels or backlight bleed issues?


----------



## besthijacker

Vaporware everywhere.


----------



## bee144

Typical NVIDIA vaporware. Lmao, delayed until March 2018 at the latest. That's assuming there aren't further delays.


----------



## Egzi

Quote:


> Originally Posted by *l88bastar*
> 
> I love my C7 OLED....but was missing Gsync & high refresh badly.....so I picked up a PG278QR two weeks ago. I have tried literally every gaming display (from OCd catleaps to 165hz IPS to all of the ultrawides) and while this 8QR has its TN drawbacks....dang it is the absolute best gaming display I have ever owned. I actually like it better than my FW900....then again my FW900 is really old and on its last legs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like I will have to be using it much longer than planned


I really liked the PG278QR, if only it didn't use that dreadful aggressive coating. If it had a light matte coating,
I would never visit these forums again until OLED monitors were out, hehe


----------



## djfunz

I have to admit, I gave up waiting and just pulled the trigger on a local used X34 for $700. These monitors are taking forever to release.


----------



## ToTheSun!

Quote:


> Originally Posted by *Egzi*
> 
> I really liked the PG278QR, if only it didn't use that dreadful aggressive coating. If it had a light matte coating,
> I would never visit these forums again until OLED monitors were out, hehe


Why don't you go for the S2716DG, then? Same panel with lighter coating.


----------



## l88bastar

Quote:


> Originally Posted by *Egzi*
> 
> I really liked the PG278QR, if only it didn't use that dreadful aggressive coating. If it had a light matte coating,
> I would never visit these forums again until OLED monitors were out, hehe


Yep....but I knew what I was getting into when I bought it. I get my glossy fix from my C7 OLED and my Gsync Blur Free Goodness from the 8QR.....I wish I could smash them together in the Hadron collider









Well....I guess I am done with this thread for another 6-9 months


----------



## st0necold

Quote:


> Originally Posted by *Baasha*
> 
> LOOOOOOOOOOOOOL
> 
> Remember the release of the Asus RoG Swift PG278Q?
> 
> I wouldn't be surprised if this release cycle was worse. It was so difficult to get those monitors but I was able to cop 3 for some 144Hz G-Sync Surround madness!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *If this monitor makes it to market this year, I'd be amazed.
> 
> 
> 
> 
> 
> 
> 
> *


I remember man I hope this isn't the case!


----------



## TheWizardMan

Quote:


> Originally Posted by *ToTheSun!*
> 
> Why don't you go for the S2716DG, then? Same panel with lighter coating.


Just bought the dell.


----------



## pez

Quote:


> Originally Posted by *djfunz*
> 
> I have to admit, I gave up waiting and just pulled the trigger on a local used X34 for $700. These monitors are taking forever to release.


I love the x34 even at MSRP. That's a pretty great deal.


----------



## Egzi

Quote:


> Originally Posted by *ToTheSun!*
> 
> Why don't you go for the S2716DG, then? Same panel with lighter coating.


Not a bad idea, actually.


----------



## l88bastar

Quote:


> Originally Posted by *Egzi*
> 
> Not a bad idea, actually.


Yea but only 144hz....the 165hz motion clarity on the PG278QR brings it to near CRT levels and outweighs its heavier AG IMHO


----------



## boredgunner

Quote:


> Originally Posted by *l88bastar*
> 
> Yea but only 144hz....the 165hz motion clarity on the PG278QR brings it to near CRT levels and outweighs its heavier AG IMHO


The only way an LCD can get close to CRT motion clarity is with strobing, and strobing on both of those only works up to 120 Hz, right?


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> The only way an LCD can get close to CRT motion clarity is with strobing, and strobing on both of those only work up to 120 Hz right?


Yea but I am not a big fan of strobing on TN panels. Gsync 165 with FPS capped @160 is fantastic for FPS and provides "near CRT" clarity. I prefer it over my FW900 nowadays.

Everything else gets played on the OLED 4k


----------



## Recipe7

I've been putting away 250 USD every month for this monitor, but I'm still wary of spending 2000 USD on a monitor. All these delays are letting me save even more, so that when it finally comes out it won't be so hard to fork over the money. They're doing this intentionally, just for me.


----------



## l88bastar

Quote:


> Originally Posted by *Recipe7*
> 
> I've been putting away 250USD every month for this monitor, but I'm still weary of spending 2000USD on a monitor. All these delays are allowing me to save even more money so that when this monitor finally comes out, it won't be so hard to fork over the money for it. They are doing this intentionally to me,


https://postimg.org/image/5j2tx32yt/


----------



## nodicaL

Quote:


> Originally Posted by *l88bastar*
> 
> Yea but I am not a big fan of strobing on TN panels. Gsync 165 with FPS capped @160 is fantastic for FPS and provides "near CRT" clarity. I prefer it over my FW900 nowadays.
> 
> Everything else gets played on the OLED 4k


There's a big difference between 240hz G-Sync and 144hz ULMB. I've tried ULMB and don't like how much stuttering there is between frames if that makes sense. It's almost too clear and you notice the lack of frames.


----------



## ToTheSun!

Quote:


> Originally Posted by *nodicaL*
> 
> There's a big difference between 240hz G-Sync and 144hz ULMB. I've tried ULMB and don't like how much stuttering there is between frames if that makes sense. It's almost too clear and you notice the lack of frames.


Response time is the most natural form of frame interpolation, I suppose.


----------



## hteng

Now I won't feel as bad, since I just bit the bullet and bought a PG279Q. I'm very satisfied with it despite some backlight bleeding. Hopefully this new monitor won't exhibit the same issues.


----------



## TheWizardMan

Quote:


> Originally Posted by *hteng*
> 
> now i won't feel as bad since i just bit the bullet and bought a PG279Q, am very satisfied with it despite it having some backlight bleeding. Hopefully this new monitor won't exhibit the same issues.


I bought the Dell S2716DG and it's good. I'm moving from a 60Hz 4K monitor. I'm pretty happy, though I miss that poppy 4K picture. Still, the Dell will do until this monitor is released.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Recipe7*
> 
> I've been putting away 250USD every month for this monitor, but I'm still weary of spending 2000USD on a monitor. All these delays are allowing me to save even more money so that when this monitor finally comes out, it won't be so hard to fork over the money for it. They are doing this intentionally to me,


Samsung and LG both finished multi billion dollar OLED factories this year.
HDMI 2.1 supports up to 8K 120hz, with Variable Refresh Rate support.

I still say anyone getting a monitor this year is going to regret it (except when I bought the Monoprice 144hz 1440p monitor for $250, that was too good to pass up).


----------



## Drome

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Samsung and LG both finished multi billion dollar OLED factories this year.
> HDMI 2.1 supports up to 8K 120hz, with Variable Refresh Rate support.
> 
> I still say anyone getting a monitor this year is going to regret it (except when I bought the Monoprice 144hz 1440p monitor for $250, that was too good to pass up).


It's a nightmare to even get a decent VA or IPS monitor atm. It's gonna be years until we see good OLED monitors at prices normal people can afford.


----------



## ToTheSun!

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Recipe7*
> 
> I've been putting away 250USD every month for this monitor, but I'm still weary of spending 2000USD on a monitor. All these delays are allowing me to save even more money so that when this monitor finally comes out, it won't be so hard to fork over the money for it. They are doing this intentionally to me,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Samsung and LG both finished multi billion dollar OLED factories this year.
> HDMI 2.1 supports up to 8K 120hz, with Variable Refresh Rate support.
> 
> I still say anyone getting a monitor this year is going to regret it (except when I bought the Monoprice 144hz 1440p monitor for $250, that was too good to pass up).
Click to expand...

Not to mention 2018 is going to be a tick year for OLED, as LG is implementing (according to them, on their newest panels) new material to produce better and brighter blues with longer lifetimes. On top of Game VRR and 4K120, 2018 is supposed to bring higher peak brightness to OLED.


----------



## profundido

Quote:


> Originally Posted by *boredgunner*
> 
> Use G-SYNC in the meantime and enjoy your 50-70 FPS in modern poorly optimized rushed AAA games without any tearing. And for older masterpieces, 4k 100+ FPS will be no problem even for one GTX 1080 like mine, and with ULMB it will be visual bliss.


Actually, surprisingly, CPU single-thread performance is the big bottleneck there. I play several of those at 4K and the GPUs indeed laugh at it, but the poor coding from when everything was programmed in one thread created a bottleneck we'll never get rid of.


----------



## Recipe7

Quote:


> Originally Posted by *l88bastar*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> https://postimg.org/image/5j2tx32yt/


Looks like you are buying 2, possibly 2 and a half


----------



## Recipe7

Quote:


> Originally Posted by *l88bastar*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> https://postimg.org/image/5j2tx32yt/


Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Samsung and LG both finished multi billion dollar OLED factories this year.
> HDMI 2.1 supports up to 8K 120hz, with Variable Refresh Rate support.
> 
> I still say anyone getting a monitor this year is going to regret it (except when I bought the Monoprice 144hz 1440p monitor for $250, that was too good to pass up).


By the way you're phrasing it, it seems my 250 will have to keep going just to pay for those OLED monitors, and I may still come up short.


----------



## CallsignVega

Quote:


> Originally Posted by *ToTheSun!*
> 
> Not to mention 2018 is going to be a tick year for OLED, as LG is implementing (according to them, on their newest panels) new material to produce better and brighter blues with longer lifetimes. On top of Game VRR and 4K120, 2018 is supposed to bring higher peak brightness to OLED.


LG only produces white OLEDs, so there are no "blue lifespan" issues. Unless you are talking about TADF, which is totally different from current LG OLEDs.


----------



## ToTheSun!

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> Not to mention 2018 is going to be a tick year for OLED, as LG is implementing (according to them, on their newest panels) new material to produce better and brighter blues with longer lifetimes. On top of Game VRR and 4K120, 2018 is supposed to bring higher peak brightness to OLED.
> 
> 
> 
> LG only produces white OLED's, there are no "blue lifespan" issues. Unless you are talking about TADF, something that is totally different than current LG OLED's.
Click to expand...

Yes, that's what I meant.

I should have clarified that the "longer lifetimes" part was independent from the "blue" part.


----------



## jezzer

Too bad this monitor has been delayed to somewhere in 2018.


----------



## TheWizardMan

Quote:


> Originally Posted by *jezzer*
> 
> To bad this monitor has been delayed to somewhere in 2018


It's very upsetting.


----------



## Baasha

Asus and their releases.









There needs to be some industry standard for release timeframes: announce, then release within 3 months. This "announce a year-plus out, then disappear for 8+ months with no new info" routine is highly aggravating.

They could've made a killing by releasing this monitor this month: with Threadripper out and Skylake-X around the corner, tons of people will be building new rigs (cough cough) and looking for new panels.

Sigh... in the mean time, I guess I'm stuck with my 8K and 4K OLED monitors.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Recipe7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *l88bastar*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> https://postimg.org/image/5j2tx32yt/
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ILoveHighDPI*
> 
> Samsung and LG both finished multi billion dollar OLED factories this year.
> HDMI 2.1 supports up to 8K 120hz, with Variable Refresh Rate support.
> 
> I still say anyone getting a monitor this year is going to regret it (except when I bought the Monoprice 144hz 1440p monitor for $250, that was too good to pass up).
> 
> Click to expand...
> 
> By the way you are phrasing it, it seems my 250 will have to keep going to just pay for those oled monitors, and I still may even be short
Click to expand...

We should be seeing OLEDs selling for less, though I suppose at the introduction of HDMI 2.1 there's no guarantee that the specific combination of OLED + 4K + 120Hz won't cost an arm and a leg.
I'm sure the manufacturers are aware this is the "ultimate" display for most people.

If we're lucky the Japanese will be trying to make 120hz ubiquitous before 2020 when they're supposed to be broadcasting the Tokyo Olympics in "Super Hi-Vision" 8K 120hz (at least locally in Japan).
We'll see how far that makes it outside Japan, but at least I would imagine 4K 120hz should see some sort of broad push internationally.


----------



## d5aqoep

I ordered LG 27UD68P and called it a day. It has good colors and suits my needs. I don’t want to beta test ASUS crap any more. Let 4K @ 120hz technology become mature first.


----------



## ToTheSun!

Quote:


> Originally Posted by *d5aqoep*
> 
> I ordered LG 27UD68P and called it a day.


Not even remotely comparable, though.

Also, there's a new version of that monitor already.


----------



## sblantipodi

These monitors are getting old before they're even ready to be sold. HDR10 is already an old standard; long live HDR10+.
https://news.samsung.com/global/how-hdr10-creates-like-real-images


----------



## profundido

Quote:


> Originally Posted by *d5aqoep*
> 
> I ordered LG 27UD68P and called it a day. It has good colors and suits my needs. I don't want to beta test ASUS crap any more. Let 4K @ 120hz technology become mature first.


Same, I ordered myself a Viewsonic xg2703-gs and put my mind to rest until maybe....if.....when...Asus decides to release that monitor.....somewhere in mid-late 2018
















It doesn't have 4K but ticks all the other boxes for me. Also extremely happy with their QC (as opposed to the horror stories of Asus/Acer). No bleeding, lines or errors. Couldn't be happier with my 1440p @165hz experience !


----------



## TheWizardMan

Quote:


> Originally Posted by *profundido*
> 
> Same, I ordered myself a Viewsonic xg2703-gs and put my mind to rest until maybe....if.....when...Asus decides to release that monitor.....somewhere in mid-late 2018
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't have 4K but ticks all the other boxes for me. Also extremely happy with their QC (as opposed to the horror stories of Asus/Acer). No bleeding, lines or errors. Couldn't be happier with my 1440p @165hz experience !


Yep. Got myself a Dell 2716DG for $450 from Best Buy. May not be 4K, but my bank account is much happier.


----------



## KGPrime

This thing sounds good on paper, but after 15 years of LCD fail I'm pretty sure it will disappoint in some way, especially for the money. So a Dell S2417DG is on its way.

I'm done fretting about the LCD monitor game, which is funny, as I already knew this was probably going to happen anyway; I still held out and hadn't bought any LCD since my last FW900 died, using this free 10-year-old 20" 1600x900 TN CCFL (with a glossy AR-coated screen, though).

But it's been over a year, and I figured it would end up being the Dell anyway since it was already released, yet I continued to hold out. So many times I almost pulled the trigger on the Acer XB270HU or the PG279Q, knowing I would not be able to tolerate the IPS glow even if I did get a "good one" (and would probably get a bad sample anyway). So this morning before work, all groggy and half awake, I saw the Dell at 379 for a Labor Day sale with a code, hit the buy button, jammed my CC number in, and ran out the door as fast as possible, lol.

Anyway, for sure I'll be googling every damn review when these do finally arrive, but I don't expect much, at least for the money. So TN master race it is, lol. From FW900s to a TN being the best choice for most of my needs: hilarious. I may just pick up the 24" 1440p Dell IPS sibling for cross-referencing color when doing graphics work (if even that; I've already done thousands of hours of color and lighting on this TN just fine, and it looks how I expect when I see it on an IPS) and deal with them for a year or two, or three...


----------



## l88bastar

I bought Cirthix's 4k 120hz kit for my 39" Seiki. For 390 bucks I got a 39" 120hz 4k VA display...can't beat that


----------



## PostalTwinkie

Quote:


> Originally Posted by *sblantipodi*
> 
> Those monitors are getting old before they are ready to be sold, HDR10 is an old standard, long life to HDR10+
> https://news.samsung.com/global/how-hdr10-creates-like-real-images


HDR10+ support can be provided via firmware update. So long as the display has an update port on it and the manufacturer puts the update out, it should be field-upgradeable.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *sblantipodi*
> 
> Those monitors are getting old before they are ready to be sold, HDR10 is an old standard, long life to HDR10+
> https://news.samsung.com/global/how-hdr10-creates-like-real-images


Apparently HDMI 2.1 is already planned to have a dynamic HDR implementation: http://www.flatpanelshd.com/news.php?subaction=showfull&id=1457513362

The only reason Samsung's tech would matter is if it's fully compatible with the new HDMI 2.1 spec and they're effectively announcing support for that.

The fact that "HDR" was pushed out in such an unfinished state really makes the entire display industry look bad. I guess that's what you get when standards are designed by committee?


----------



## Asmodian

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> The fact that "HDR" was pushed out in such an unfinished state really makes the entire display industry look bad. I guess that's what you get when standards are designed by committee?


This is what you get when the standard is watered down enough to allow displays not really capable of HDR to claim HDR support. The committee wasn't the problem; it was everyone involved wanting a standard that allowed TVs that weren't actually capable of HDR to claim HDR support. A standard designed by a committee committed to building a robust HDR ecosystem for the future, instead of short-term TV sales, would have been fine.


----------



## steelbom

Nothing could even come close to running 144Hz at 4K on high settings in any AAA game though, no? Even a Ti... I guess it sets a high bar for GPU makers to meet in the future, which is always good.


----------



## Tobiman

Asus demoed the ROG XG35, a 34-inch ultrawide FreeSync monitor. Not so sure about the specs, but I'm guessing it's an IPS at 100Hz.


----------



## chrisnyc75

Quote:


> Originally Posted by *Asmodian*
> 
> A standard designed by a committee committed to building a robust HDR ecosystem for the future, instead of short-term TV sales, would have been fine.


I believe they call that Dolby Vision


----------



## sblantipodi

Quote:


> Originally Posted by *PostalTwinkie*
> 
> HDR10+ support can be provided via firmware update. So long as the display has an update port on it and the manufacturer puts the update out, it should be field-upgradeable.


History teaches that manufacturers prefer to create a new model, from PG27UQ to P27UQX or something similar, from HDR10 to HDR10+.
Then you have spent $2,000 on a monitor that is worth half as much a few months later because the new model is out.

This market is becoming stupid, too stupid even for me, and I am a stupid enthusiast.


----------



## Lass3

HDR10+ is going to win. Open standard. Dolby Vision requires a proprietary chip, and TVs that use this chip support HDR10 anyway, which means they can be updated to HDR10+.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *steelbom*
> 
> Nothing could even come close to getting 144Hz at 4k on high settings in any AAA game though, no? Even a Ti... I guess it sets a high bar for GPU makers to meet in the future which is always good.


As time goes on, my average favorite game gets older.
Deus Ex Human Revolution is still better than Mankind Divided, and I can play DE:HR at 4K 120fps on the 980Ti without any issues (or 8K 60fps).

One of the first things that became apparent after playing DOOM 2016 at 4K was that resolution scaling works _much_ better the higher your native resolution is. Any amount of scaling at 1080p looks ridiculously bad, but scaling back 4K by 20% isn't horrible, I played DOOM for over a hundred hours like that, maintaining framerate above 100fps at effectively "3K" resolution, with low settings, which doesn't detract from the game in this case (of course textures don't need to be reduced for high FPS).
Once 8K monitors become common you won't bat an eye at the thought of running 30% lower than native resolution, but hopefully downscaling the entire screen won't be necessary once the industry develops some robust multi-resolution functionality.
As fanatical as I am for the coming of 8K it still makes sense that not everything needs to be rendered at maximum quality. Most people would probably never notice if the image has low resolution "vignetting" on the edges.
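For what it's worth, the scaling arithmetic above is easy to check; a quick sketch, assuming the 20% cut is applied per axis the way most in-game resolution-scale sliders do it (the helper names are made up for illustration, not from any game or driver):

```python
# Effective render resolution at a given scale factor, and how the
# pixel count compares across native resolutions.

def scaled_resolution(width, height, scale):
    """Apply a linear (per-axis) resolution scale factor."""
    return round(width * scale), round(height * scale)

def megapixels(width, height):
    """Total pixel count in megapixels."""
    return width * height / 1e6

# 4K scaled back 20% per axis lands near "3K":
w, h = scaled_resolution(3840, 2160, 0.8)
print(w, h, f"{megapixels(w, h):.1f} MP")   # 3072 1728 5.3 MP

# The same 20% cut at 1080p drops below 900p, which is why scaling
# is far more visible at low native resolutions:
print(scaled_resolution(1920, 1080, 0.8))   # (1536, 864)
```

So "4K minus 20%" still shades over 5 megapixels per frame, while the identical cut at 1080p leaves barely 1.3, which matches the observation that scaling artifacts are much less objectionable at high native resolutions.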


----------



## steelbom

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> As time goes on, my average favorite game gets older.
> Deus Ex Human Revolution is still better than Mankind Divided, and I can play DE:HR at 4K 120fps on the 980Ti without any issues (or 8K 60fps).
> 
> One of the first things that became apparent after playing DOOM 2016 at 4K was that resolution scaling works _much_ better the higher your native resolution is. Any amount of scaling at 1080p looks ridiculously bad, but scaling back 4K by 20% isn't horrible, I played DOOM for over a hundred hours like that, maintaining framerate above 100fps at effectively "3K" resolution, with low settings, which doesn't detract from the game in this case (of course textures don't need to be reduced for high FPS).
> Once 8K monitors become common you won't bat an eye at the thought of running 30% lower than native resolution, but hopefully downscaling the entire screen won't be necessary once the industry develops some robust multi-resolution functionality.
> As fanatical as I am for the coming of 8K it still makes sense that not everything needs to be rendered at maximum quality. Most people would probably never notice if the image has low resolution "vignetting" on the edges.


Yeah, I guess that's true: it's viable for older games, and some that aren't too intensive. I play Ark and it's brutal... highest settings get me 20 FPS at 1080p on my RX 480, and I want 50+ FPS at 3K. Hoping Vega 56 or 64 will get me there.

I won't end up with a high-PPI display until they make 6880x2880 ultrawides, but I look forward to that.


----------



## Nautilus

I look for several things in a monitor: HDR, 4K, 144Hz, min. 32", USB 3.0 hub, Display port+HDMI, IPS, no backlight bleed. This monitor doesn't check all the boxes.


----------



## trippinonprozac

Quote:


> Originally Posted by *Nautilus*
> 
> I look for several things in a monitor: HDR, 4K, 144Hz, min. 32", USB 3.0 hub, Display port+HDMI, IPS, no backlight bleed. This monitor doesn't check all the boxes.


lol no monitor checks all of those boxes.....


----------



## Asmodian

This is a high-DPI monitor, a new phenomenon for the desktop. I am very interested in a display with pixels you cannot see individually, but I would buy the 32" version instead if it existed.









Because this monitor got delayed I ended up picking up a 55" OLED TV (LG C7P) on the spur of the moment (oops), and I am now using it as a monitor. All in all it is a great display (with only ~20ms of input lag, too), but the pixel quality isn't that great when viewed from only a few feet away (the pixels are not perfectly regular); of course, it also has about the same DPI as a 27" 1080p monitor. I just sit back a little, since you don't need to be right up next to a 55" monitor, and enjoy the unbelievable black level and color saturation. However, for anything but watching video, the low refresh rate (60Hz) is painful after being on 120+ Hz for so long (my first high-refresh-rate LCD was an Overlord, then an Asus VG278H, an Asus PG278Q, and most recently an Acer XB270HU bprz). It does accept 120Hz at 1080p, but you would need to sit very far away to find a 1080p 55" display acceptable.

I was very excited by this monitor and planned to buy one as soon as possible, but now I am wondering when a 4K 120+ Hz OLED display with G-Sync will be available. I am not sure even local dimming will be enough for me to be able to go back to IPS. My Acer XB270HU isn't one with bad IPS glow, and I used to be very happy with it, but it looks absolutely terrible next to OLED.


----------



## profundido

Quote:


> Originally Posted by *Asmodian*
> 
> This is a high DPI monitor, a new phenomenon for desktop. I am very interested in a display with pixels you cannot see individually but I would buy the 32" version instead if it existed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Because this monitor got delayed I ended up picking up a 55" OLED TV (LG C7P) at the spur of the moment (oops) and I am now using it as a monitor. All in all it is great display (with only ~20ms of input lag too) but the pixel quality isn't that great when viewed from only a few feet away (they are not perfectly regular), of course it also has about the same DPI as a 27" 1080p monitor. I just sit back a little, you don't need to be right up next to a 55" monitor, and enjoy the unbelievable black level and color saturation. However, for anything but watching video, the low refresh rate (60 Hz) is painful after being on 120+ Hz for so long (my first high refresh rate LCD was an Overlord, then an Asus VG278H, an Asus PG278Q, and most recently a Acer XB270HU bprz). It does accept 120Hz at 1080p but you would need to sit very far away to find a 1080p 55" display acceptable.
> 
> I was very excited by this monitor and planned to buy one as soon as possible, but now I am wondering when a 4K 120+ Hz OLED display with G-Sync will be available. I am not sure even local dimming will be enough for me to be able to go back to IPS. My Acer XB270HU isn't one with bad IPS glow, and I used to be very happy with it, but it looks absolutely terrible next to OLED.


A bit of first-hand experience I want to share with you: just like 100/120+ FPS is truly a thing to behold (after which you don't want to go back anymore!), so is 160+ ppi. 4K @ 27" is such an experience; it suddenly turns AAA graphics "photorealistic", as in "feels like real life". I personally own both features in two separate monitors. On the 4K 27" IPS monitor I've instinctively let go of the mouse several times just to stare and gaze at some high-res scenes and textures in Rise of the Tomb Raider, The Witcher 3, or The Elder Scrolls Online. I also had a 4K 32" monitor (Asus ProArt) for a day, and the loss of ppi there immediately killed the experience for me. It felt like I was back on 1440p @ 27": I felt I could "see" pixels again, whereas at 27" I no longer can (just like in real life).

I truly believe that, IF ASUS GETS ITS QC RIGHT THIS TIME, this monitor, which brings those two features together (144fps + ~163ppi), will truly be an amazing experience when paired with a powerful machine. I personally cannot wait until I get to lay hands on it.
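The pixel density figures in this thread can be reproduced with the standard diagonal-PPI formula; a minimal sketch, not tied to any vendor tool (4K at 27" works out to about 163 ppi):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h, d) in {
    '4K @ 27"':    (3840, 2160, 27),
    '4K @ 32"':    (3840, 2160, 32),
    '1440p @ 27"': (2560, 1440, 27),
}.items():
    print(f'{name}: {ppi(w, h, d):.0f} ppi')
# 4K @ 27": 163 ppi
# 4K @ 32": 138 ppi
# 1440p @ 27": 109 ppi
```

The drop from 163 to 138 ppi when moving the same panel resolution from 27" to 32" is the loss being described above, and 1440p @ 27" sits at roughly the density of 4K @ 40".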


----------



## Jbravo33

Quote:


> Originally Posted by *profundido*
> 
> A bit of first hand experience I want to share with you: Just like 100/120+fps is truly a thing to behold (after which you don't wanna go back anymore!), so is 160+ppi. 4K @ 27" is such an experience which suddenly changes AAA graphics into 'photorealistic' as in 'feels like in real life". I personally own both features in 2 separate monitors. on the 4K 27" IPS monitor I've instinctively let go of the mouse several times to simply stare and gaze at some high res scenes and textures in "Rise of the Tomb raider" or "The witcher 3" or "The Elder scrolls online". I also had a 4K on 32" monitor (Asus Pro Art)for a day and the loss of ppi there immediately killed the experience for me. It felt like I was back on 1440p @ 27". I felt I could "see" pixels again whereas on 27" I can no longer (just like in real life)
> 
> I truly believe that, IF ASUS GETS ITS QC RIGHT THIS TIME, this monitor, which brings those two features together (144fps + ~163ppi), will truly be an amazing experience when paired with a powerful machine. I personally cannot wait until I get to lay hands on it.


I gave 4K @ 27" a shot a few months ago, but I didn't keep it long enough, as I thought it was hard to see things (Samsung U28). I recently bought a 4K monitor, this time to keep, since these won't be out for a while, and I'm using it to review performance. Picked up an Acer S7 as it was the cheapest (Micro Center had it for $339) and IPS. I love it and now have a different opinion about 4K at 27": I think it's great. Although 32" would be ideal, I don't mind 27", and I look forward to when these actually drop. In half of the games I play, AA is not needed with a monitor this size, and FPS goes higher. The only negative I have is that with these two Xp's I'm getting way too much FPS and tearing, so I wish these would come already lol. Destiny 2 was getting over 100 on ultra, pretty amazing.


----------



## mmms

https://www.inet.se/produkt/2210502/acer-27-predator-x27-4k-144hz-hdr-g-sync-quantum-dot


----------



## FearlessBelgian

Quote:


> Originally Posted by *mmms*
> 
> https://www.inet.se/produkt/2210502/acer-27-predator-x27-4k-144hz-hdr-g-sync-quantum-dot


2500€ or 3000$...

Not for me...


----------



## MightEMatt

Quote:


> Originally Posted by *mmms*
> 
> https://www.inet.se/produkt/2210502/acer-27-predator-x27-4k-144hz-hdr-g-sync-quantum-dot


Jesus, I hope that price doesn't stick. I plan on getting one, but not if it's going to cost me 3 grand.


----------



## mmms

Quote:


> Originally Posted by *FearlessBelgian*
> 
> 2500€ or 3000$...
> 
> Not for me...


And me lol


----------



## l88bastar

Quote:


> Originally Posted by *mmms*
> 
> And me lol


Three large..... is that like the new magic number now? Three grand for the Titan V and now this...... no thanks Jeff.....


----------



## Clukos

3k for an IPS monitor, let that sink in.


----------



## guttheslayer

Quote:


> Originally Posted by *Clukos*
> 
> 3k for an IPS monitor, let that sink in.


In the meantime the Dell OLED is retailing for the same price lol.

Now I feel that the "Gaming" brand of the PC industry is only for the filthy rich.


----------



## CallsignVega

Quote:


> Originally Posted by *Clukos*
> 
> 3k for an IPS monitor, let that sink in.


And a small one at that lol. Notice the date, all the way out till April.


----------



## keikei

Quote:


> Originally Posted by *mmms*
> 
> https://www.inet.se/produkt/2210502/acer-27-predator-x27-4k-144hz-hdr-g-sync-quantum-dot


Anyone catch a release date? A high price is to be expected, so unless you've got the $$, wait and the price will drop... eventually. I'm just glad these high-res / high-Hz monitors are coming out soon.


----------



## Vipu

Well, if OLED can't get to high refresh rates, I guess we gotta get used to IPS?


----------



## rvectors

Who'd have thought you could flog last-century tech for 3 grand. If at least they bonded it to a lovely glass panel, I might even line up with the rest of the mugs, but sadly it'll be the image-busting grainy muck that took over the PC monitor market.


----------



## boredgunner

Quote:


> Originally Posted by *Vipu*
> 
> Well, if OLED can't get to high refresh rates, I guess we gotta get used to IPS?


I have read from various sources that OLED is capable of higher refresh rates than LCD, so I don't think that is a problem for OLED.


----------



## mmms

Ugh....

Based on this, how much do we reckon the new Acer/Asus 1440p ultrawides with the same technology will cost? (I think they are 200Hz, though.)

These prices are ridiculous


----------



## d3v0

No wonder XB271HU prices have been dropping (got mine for $599). The X27 is sort of what all gamers have been waiting for.


----------



## animeowns

Quote:


> Originally Posted by *FearlessBelgian*
> 
> 2500€ or 3000$...
> 
> Not for me...


If it costs that much, I can see myself buying the ASUS 4K 144Hz as long as it does not have any QC issues and has a 3-year warranty or more. I will most likely keep the panel beyond 3 years and extend the warranty, at least until playable 8K 120Hz displays are out on the market. I went from 4K to 5K to 8K, and the difference from 4K to 5K is decent but not that noticeable, and from 5K to 8K even less so. The pixel density is already so high at 4K that there really is no point in going any higher unless you can get G-Sync with a high refresh rate at 5K or 8K, and the graphics horsepower to back it.


----------



## d3v0

I mean, I just upscale my stuff to 4k on my XB271HU and I foresee no reason ever to need a 4k monitor.


----------



## keikei

Quote:


> Originally Posted by *d3v0*
> 
> I mean, I just upscale my stuff to 4k on my XB271HU and I foresee no reason ever to need a 4k monitor.


Are upscaled 4K and native 4K the same thing?


----------



## animeowns

Quote:


> Originally Posted by *keikei*
> 
> Are upscaled 4K and native 4K the same thing?


No, there is a visual difference that upscaling can't match. Having played around with 4K, 5K, and 8K native displays, the difference is there, but 5K and 8K won't be worth it anytime soon unless we get G-Sync with low input lag.

A lot of people have been holding off on buying 4K because it has been locked at 60Hz. There is no doubt that these 4K 144Hz and 3440x1440 @ 200Hz panels will sell.


----------



## Shaded War

I wouldn't put too much thought into that price. It's probably just a placeholder price that could go down. Also, people in Europe and the surrounding area pay significantly higher prices for electronics than people in North America do.

Quote:


> Originally Posted by *d3v0*
> 
> I mean, I just upscale my stuff to 4k on my XB271HU and I foresee no reason ever to need a 4k monitor.


It's not the same thing. You get better anti-aliasing and some other minor improvements, but you're not getting the real experience of having more pixels. That's like saying there's no reason to upgrade from 720p to 1440p because you can just upscale 4x to have the same number of pixels rendered.

There's nothing wrong with 1440p, but some people are ready to move on.
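The pixel-count arithmetic behind that 720p-to-1440p analogy checks out; a quick sketch (the function name is made up for illustration):

```python
def rendered_pixels(width, height, ss_factor=1):
    """Total pixels shaded per frame, given an area supersampling factor."""
    return width * height * ss_factor

# 720p with 4x (area) supersampling shades exactly as many pixels
# per frame as native 1440p...
print(rendered_pixels(1280, 720, 4))   # 3686400
print(rendered_pixels(2560, 1440))     # 3686400

# ...but the panel still only displays 1280x720 distinct pixels, which
# is why downsampling improves anti-aliasing without adding real detail.
```

Same GPU cost, very different image: the supersampled frame gets averaged back down to the panel's native grid, so the extra samples buy smoothness, not resolution.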


----------



## d3v0

Quote:


> Originally Posted by *Shaded War*
> 
> I wouldn't put too much thought into that price. It's probably just a place holder price that could go down. Also, People in Europe and the surrounding area pay significantly higher prices than people in North America do for electronics.
> It's not the same thing. You get better anti-aliasing and some other minor improvements, but you're not getting the real experience of having more pixels. That's like saying there's no reason to upgrade from 720p to 1440p because you can just upscale 4x to have the same number of pixels rendered.
> 
> There's nothing wrong with 1440p, but some people are ready to move on.


Oh, you make perfectly valid points and I do not disagree. For those of us who are maybe space-limited to a 27" monitor, though, 1440p is a sweet spot, and 4K is extremely hard to read at 27". For my usage, merely upscaling for that bump in graphics (Dark Souls 3/Witcher 3 with DSR, or Destiny 2 with the in-game supersampling) is a great way to get a graphics boost when your system can max out 1440p games.


----------



## FearlessBelgian

Quote:


> Originally Posted by *d3v0*
> 
> no wonder XB271HU prices had been dropping (got mine for $599). The X27 is sort of what all gamers had been waiting for.


No, because it is IPS... We want OLED...

Who wants a 1000:1 contrast ratio for non-HDR content in 2017 (99% of content is non-HDR today, and easily for the next 2-3 years)?


----------



## superstition222

Quote:


> Originally Posted by *FearlessBelgian*
> 
> No, because it is IPS... We want OLED...
> 
> Who wants a 1000:1 contrast ratio for non-HDR content in 2017 (99% of content is non-HDR today, and easily for the next 2-3 years)?


As long as none of your games will cause burn-in from their UI elements.

It's going to be very interesting to see if OLED can manage 8K without serious pixel lifespan problems in smaller sizes, too.


----------



## boredgunner

Quote:


> Originally Posted by *superstition222*
> 
> As long as none of your games will cause burn-in from their UI elements.
> 
> It's going to be very interesting to see if OLED can manage 8K without serious pixel lifespan problems in smaller sizes, too.


The countermeasures to image retention are effective for real world use for gamers. I know someone who has had an LG C6 for over a year, uses it primarily as a computer monitor for his gaming PC, and he games a ton. Never had any retention. Similar results can be found all over avsforum for 2016 LG OLED owners.


----------



## superstition222

Quote:


> Originally Posted by *boredgunner*
> 
> The countermeasures to image retention are effective for real world use for gamers. I know someone who has had an LG C6 for over a year, uses it primarily as a computer monitor for his gaming PC, and he games a ton. Never had any retention. Similar results can be found all over avsforum for 2016 LG OLED owners.


Even if that's true, retention isn't exactly the same thing as pixel intensity loss, although it can be related. OLED has a fundamental problem with blue subpixels. The workaround has been to introduce white subpixels to produce most of the brightness that blue subpixels used to provide. But as pixels shrink, it is questionable how much the white-subpixel addition can stop the dimming issue.

Pixels are being shrunk while, simultaneously, new standards are being created and pushed that increase the brightness demands of each pixel.

One person is also anecdotal. He may not be playing games that cause burn-in or retention problems. I have played games that have a very static UI, and I wonder about those.

Also, how well the workarounds (like whole-screen dimming with lots of white patches) will help without impeding the quality of the user experience is also a question mark. Constantly changing brightness when using a monitor for desktop-type work can be annoying, as can ashen grey being displayed instead of white.


----------



## boredgunner

Quote:


> Originally Posted by *superstition222*
> 
> Even if that's true, retention isn't exactly the same thing as pixel intensity loss, although it can be related. OLED has a fundamental problem with blue subpixels. The workaround has been to introduce white subpixels to produce most of the brightness that blue subpixels used to provide. But as pixels shrink, it is questionable how much the white-subpixel addition can stop the dimming issue.
> 
> Pixels are being shrunk while, simultaneously, new standards are being created and pushed that increase the brightness demands of each pixel.
> 
> One person is also anecdotal. He may not be playing games that will cause burn-in or retention problems. I have played games that have very static UI and I wonder about those.
> 
> Also, how well the work-arounds (like whole screen dimming with lots of white patches) will help, without impeding the user experience quality, is also a question mark. Constantly changing brightness when using a monitor for desktop-type work can be annoying - as can ashen grey being displayed instead of white.


The friend I mention plays strategy games that have more static HUD elements than any action game (Total War and XCOM franchises). And also isometric 2.5D RPGs. And basically every mainstream title as well.

I'm not too concerned about the issues you mention though since we also have microLED in the future, and I don't plan to ever go beyond 5k resolution for a gaming monitor. At 5k I want to stop upgrading resolution (unneeded for gaming on a 40-43" screen or below in my opinion) and just increase refresh rate from there to insanely high levels.


----------



## superstition222

Quote:


> Originally Posted by *boredgunner*
> 
> The friend I mention plays strategy games that have more static HUD elements than any action game (Total War and XCOM franchises). And also isometric 2.5D RPGs. And basically every mainstream title as well.
> 
> I'm not too concerned about the issues you mention though since we also have microLED in the future, and I don't plan to ever go beyond 5k resolution for a gaming monitor. At 5k I want to stop upgrading resolution (unneeded for gaming on a 40-43" screen or below in my opinion) and just increase refresh rate from there to insanely high levels.


LED is backlighting. I assume that will be used with some type of LCD like quantum dot.

Those games may have been designed to avoid burn-in. But, there is the issue of HDR in particular requiring very high levels of brightness from OLED pixels. Retention and burn-in are only one aspect of the issue. The other issue is creeping loss of dynamic range, color range, max brightness, etc.

I have read plenty of reports about retention issues with LG OLED TVs, although the problem may have been more of an issue for 2016 sets vs. 2017. But, those are large sizes. Cram 4K into a 27" panel and you may have to use smaller pixels.

Like it or not, 8K propaganda has been coming from industry sources for years now. It's on its way, and it will be established as the new must-have thing, like 4K was. Very high brightness for HDR implementations is also on its way. Advertisers in particular can't wait to sear people's retinas with blasts of bright flashes.


----------



## l88bastar

8k is awesome.....because it means 4k120 should be available









I don't know how people say 1440p is a sweet spot; I always see the jaggies with it, and AA only makes the picture blurrier.

4K is where my eyes become less annoyed with pixel jaggies, and it's the sweet spot for me... but I prefer 32", not 27".


----------



## aznsniper911

Quote:


> Originally Posted by *boredgunner*
> 
> The countermeasures to image retention are effective for real world use for gamers. I know someone who has had an LG C6 for over a year, uses it primarily as a computer monitor for his gaming PC, and he games a ton. Never had any retention. Similar results can be found all over avsforum for 2016 LG OLED owners.


Read the 2017 LG threads on AVS; there are a lot more burn-in and image retention issues. Looks like they tweaked the countermeasures for the 2017 sets, causing this.


----------



## Vipu

Quote:


> Originally Posted by *boredgunner*
> 
> I have read from various sources that OLED is capable of higher refresh rates than LCD, so I don't think that is a problem for OLED.


Why can't we have a single high-refresh OLED panel then, if it's possible?


----------



## subtec

Quote:


> Originally Posted by *superstition222*
> 
> LED is backlighting. I assume that will be used with some type of LCD like quantum dot.


You assume wrong. _Micro_LED is a still in-development technology that's one LED per subpixel, just like OLED. You can think of it as inorganic LED to the organic LED of OLED. Like OLED, it's totally independent of LCD.


----------



## boredgunner

Quote:


> Originally Posted by *Vipu*
> 
> Why can't we have a single high-refresh OLED panel then, if it's possible?


- We do have OLED monitors, just not in the consumer market barring that one now discontinued Dell.
- Too expensive for consumers still.
- The monitor market is easy to milk. People will pay up to $2,000 for IPS monitors, so releasing OLED into such a market is not very appealing to manufacturers.


----------



## l88bastar

Quote:


> Originally Posted by *boredgunner*
> 
> - We do have OLED monitors, just not in the consumer market barring that one now discontinued Dell.
> - Too expensive for consumers still.
> - The monitor market is easy to milk. People will pay up to $2,000 for IPS monitors, so releasing OLED into such a market is not very appealing to manufacturers.


But don't they have eyes? Don't their eyes bleed from craptastic IPS, VA & TN tech like ours? Don't they have children? Don't their children have eyes? Think of the children!


----------



## ToTheSun!

Quote:


> Originally Posted by *l88bastar*
> 
> Don't they have children? Don't their children have eyes? Think of the children!


Checkmate, display industry.


----------



## mmms

Quote:


> Originally Posted by *subtec*
> 
> You assume wrong. _Micro_LED is a still in-development technology that's one LED per subpixel, just like OLED. You can think of it as inorganic LED to the organic LED of OLED. Like OLED, it's totally independent of LCD.


Yes, for me, I love OLED's perfect black levels and the PQ is excellent, but burn-in and image retention make me skip it and wait for MicroLED, whether for TVs or PC monitors.
I don't want to be happy with OLED for the first two years and then begin to face problems with it. I can't force everyone in my family to run a particular thing, or for a certain period of time, just to maintain an OLED and avoid its problems. So I doubt OLED would last five years without any problems, and the best solution is waiting for MicroLED.


----------



## tekjunkie28

I had a 27" 4K IPS color-calibrated monitor. Great screen, but that's where the good things stopped. 4K actually looks worse, or is completely unnoticeable, in games, and that's if the game even supports it. Scaling on Windows is hit and miss, and if an app can't scale it's completely useless. No way I can read that, and I asked a friend who has better-than-normal vision; he can't use it either. 1440p is the sweet spot for 27". I wouldn't use anything under 32" for 4K.

Interesting fact, though: I didn't have any issues maintaining 60+ FPS in any game as long as AA was off or lowered. AA isn't really needed at that resolution except in certain titles. System specs are an EVGA 1070 and a 4670K overclocked to 4.4GHz.

Sent from my SM-N950U using Tapatalk


----------



## boredgunner

Quote:


> Originally Posted by *tekjunkie28*
> 
> I had a 27" 4K IPS color-calibrated monitor. Great screen, but that's where the good things stopped. 4K actually looks worse, or is completely unnoticeable, in games, and that's if the game even supports it. Scaling on Windows is hit and miss, and if an app can't scale it's completely useless. No way I can read that, and I asked a friend who has better-than-normal vision; he can't use it either. 1440p is the sweet spot for 27". I wouldn't use anything under 32" for 4K.
> 
> Interesting fact, though: I didn't have any issues maintaining 60+ FPS in any game as long as AA was off or lowered. AA isn't really needed at that resolution except in certain titles. System specs are an EVGA 1070 and a 4670K overclocked to 4.4GHz.
> 
> Sent from my SM-N950U using Tapatalk


4K looks worse? I suppose what you really mean is that the poor HUD scaling in a lot of games is not worth the visual fidelity improvement, because in terms of graphics, higher resolution is always better (until your eyes simply can't notice more, but 27" 4K isn't at that point yet unless you sit really far away). I have used 27" 4K as well. I prefer a larger screen, but 1440p is only a good resolution for VR headsets, maybe. It is clearly a low resolution on a 27" monitor, probably on any monitor.


----------



## tekjunkie28

Quote:


> Originally Posted by *boredgunner*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tekjunkie28*
> 
> I had a 27" 4k ips color calibrated monitor. Great screen but that's where the good things stopped. 4k actually looks worse if completely unnoticeable in games and that's if the game even supports it. The scaling on windows is hit and miss and if it can't scale its completely useless. No way I can read that and I asked my friend who actually had better then normal vision and he can't use it. 1440p is the sweet spot for 27". I wouldn't use anything under 32" that was 4k.
> 
> Interesting fact though is I didn't have any issues of maintaining 60+ fps in any game as long and AA was off or lowered. AA isn't really needed at that resolution unless at certain titles. System specs are evga 1070 and a 4670k overclocked to 4.4.
> 
> Sent from my SM-N950U using Tapatalk
> 
> 
> 
> 4K looks worse? I suppose what you really mean is that the poor HUD scaling in a lot of games isn't worth the visual fidelity improvement, because in terms of graphics, higher resolution is always better (until your eyes simply can't resolve more, and 27" 4K isn't at that point yet unless you sit really far away). I have used 27" 4K as well. I prefer a larger screen, but 1440p is only a good resolution for VR headsets, maybe. It is clearly a low resolution on a 27" monitor, probably on any monitor.
Click to expand...

Yes it looked worse. HUD scaling was part of it but I can work around it. Mostly everything was just too small aside from text (text actually looked better in most cases).

It's like the games just stopped looking better after 1440p. I saw no difference between 1440p and 4K in 95% of my games, and I'm assuming that's because games aren't "graphically optimized" for 4K yet. It's kinda hard to explain though, so mileage is gonna vary.

Sent from my SM-N950U using Tapatalk


----------



## profundido

Quote:


> Originally Posted by *tekjunkie28*
> 
> I had a 27" 4K IPS color-calibrated monitor. Great screen, but that's where the good things stopped. 4K actually looks worse, if not completely unnoticeable, in games, and that's if the game even supports it. Scaling on Windows is hit and miss, and when a program can't scale it's completely useless. No way I can read that, and I asked a friend who has better-than-normal vision and he can't use it either. 1440p is the sweet spot for 27". I wouldn't use anything under 32" that was 4K.
> 
> Interesting fact though: I didn't have any issues maintaining 60+ fps in any game as long as AA was off or lowered. AA isn't really needed at that resolution except in certain titles. System specs are an EVGA 1070 and a 4670K overclocked to 4.4 GHz.
> 
> Sent from my SM-N950U using Tapatalk


I have a 27" 4K IPS too (as well as a 27" 1440p) and a lot of user experience with both. I agree that 1440p 144Hz is still the sweet spot, both performance-wise and scaling-wise. We don't have the horsepower yet in our current and next-gen video cards to drive the newest, hottest titles at 4K@144Hz, unfortunately. It will take until Ampere in 2019-2020 at the soonest for that to change.

I don't agree with (modern!) games not looking better. It will of course only show in games that actually use 4K textures to start with, but when they do... boy, do they look amazing! (Rise of the Tomb Raider and Elder Scrolls Online come to mind.)

Scaling has improved massively in the latest Windows 10 release compared to the first. It works perfectly with native Windows components that are supported: Windows Explorer, Word, Excel, Internet Explorer... But as soon as you open an RDP or Citrix session or any other non-native Windows component (MMC consoles such as Event Viewer), it's all unoptimized scaling and your fonts look like garbage or doubled. In the latest versions of Windows 10 you can force "no DPI scaling" per program, so that it always runs unscaled (at native res) while the rest of the OS is still scaled. But for most programs that require active reading, 27" at 4K is not workable even if you can see it perfectly; ~163 ppi is just too much strain on the eyes for continued use. It works perfectly, though, for monitoring tools like RealTemp, Afterburner, ...

That being said, purely for gaming I will not hesitate to buy this monitor when it comes out, IF and only IF it has low input lag.
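The pixel-density figures thrown around in this thread are easy to verify. A quick sketch (the sizes and resolutions are just the ones people are discussing, rounded to the nearest ppi):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))  # 1440p @ 27" -> 109
print(round(ppi(3840, 2160, 27)))  # 4K @ 27"    -> 163
print(round(ppi(3840, 2160, 32)))  # 4K @ 32"    -> 138
print(round(ppi(5120, 2880, 27)))  # 5K @ 27"    -> 218
```

So the PG27UQ lands at ~163 ppi, and the 32" version people keep asking for would be ~138 ppi.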


----------



## tekjunkie28

Quote:


> Originally Posted by *profundido*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tekjunkie28*
> 
> I had a 27" 4K IPS color-calibrated monitor. Great screen, but that's where the good things stopped. 4K actually looks worse, if not completely unnoticeable, in games, and that's if the game even supports it. Scaling on Windows is hit and miss, and when a program can't scale it's completely useless. No way I can read that, and I asked a friend who has better-than-normal vision and he can't use it either. 1440p is the sweet spot for 27". I wouldn't use anything under 32" that was 4K.
> 
> Interesting fact though: I didn't have any issues maintaining 60+ fps in any game as long as AA was off or lowered. AA isn't really needed at that resolution except in certain titles. System specs are an EVGA 1070 and a 4670K overclocked to 4.4 GHz.
> 
> Sent from my SM-N950U using Tapatalk
> 
> 
> 
> I have a 27" 4K IPS too (as well as a 27" 1440p) and a lot of user experience with both. I agree that 1440p 144Hz is still the sweet spot, both performance-wise and scaling-wise. We don't have the horsepower yet in our current and next-gen video cards to drive the newest, hottest titles at 4K@144Hz, unfortunately. It will take until Ampere in 2019-2020 at the soonest for that to change.
> 
> I don't agree with (modern!) games not looking better. It will of course only show in games that actually use 4K textures to start with, but when they do... boy, do they look amazing! (Rise of the Tomb Raider and Elder Scrolls Online come to mind.)
> 
> Scaling has improved massively in the latest Windows 10 release compared to the first. It works perfectly with native Windows components that are supported: Windows Explorer, Word, Excel, Internet Explorer... But as soon as you open an RDP or Citrix session or any other non-native Windows component (MMC consoles such as Event Viewer), it's all unoptimized scaling and your fonts look like garbage or doubled. In the latest versions of Windows 10 you can force "no DPI scaling" per program, so that it always runs unscaled (at native res) while the rest of the OS is still scaled. But for most programs that require active reading, 27" at 4K is not workable even if you can see it perfectly; ~163 ppi is just too much strain on the eyes for continued use. It works perfectly, though, for monitoring tools like RealTemp, Afterburner, ...
> 
> That being said, purely for gaming I will not hesitate to buy this monitor when it comes out, IF and only IF it has low input lag.
Click to expand...

Well, the most modern game I have is either Endless Space 2 or Civ 6.
I play
Stellaris
Eu4
Endless space 2
Ck2
Cities skylines
Farm simulator

I do play shooters every now and then. I have a lot of other games, but those are the ones I typically play right now. The 144Hz can really be felt across all games. It's really nice in FPS, but also when scrolling and zooming in RTS games. I bet 8K would look good in GTA5, but I normally play that on PS4 just because my friends are lame and don't play on PC lol.

BTW, not to derail the thread, but is WoW just badly coded? I can't seem to break 100 fps in it. I don't play it anymore, but I used it for testing purposes and was surprised by that.

Sent from my SM-N950U using Tapatalk


----------



## sblantipodi

Quote:


> Originally Posted by *Vipu*
> 
> Well if OLED cant get to high refreshes guess we gotta get used to IPS?


OLED is simply not ready for PC monitors.
Phones have just proved it.

They have improved the burn-in problem, but these displays are still affected by it.
OLED may be fine for phones: a few hours of use per day and fast replacement cycles (who keeps a smartphone for more than 2 years now?).
That's not true for PC monitors. PC users tend to keep a monitor for more than 2 years, use it for many hours a day, and stare at the same Windows taskbar the whole time.
That causes burn-in, and OLED monitors lose quality far faster than IPS ones do.

I don't want OLED for my PC, but please stop talking about OLED here; this is another thread.


----------



## FearlessBelgian

I really think Micro-led are the future, not OLED.

Oled have a big problem with burn-in issues and this is inherent to the technology (Organic diodes lose power over time and do not age in the same way...)

Oled are really not for video games, really not...


----------



## Leopardi

Quote:


> Originally Posted by *sblantipodi*
> 
> OLED is simply not ready for PC monitors.
> Phones has just proved it.


Phones aren't even comparable to the LG OLEDs of past years when it comes to burn-in.


----------



## CallsignVega

Quote:


> Originally Posted by *profundido*
> 
> I have a 27" 4K IPS too (as well as a 27" 1440p) and a lot of user experience with both. I agree that 1440p 144Hz is still the sweet spot, both performance-wise and scaling-wise. We don't have the horsepower yet in our current and next-gen video cards to drive the newest, hottest titles at 4K@144Hz, unfortunately. It will take until Ampere in 2019-2020 at the soonest for that to change.
> 
> I don't agree with (modern!) games not looking better. It will of course only show in games that actually use 4K textures to start with, but when they do... boy, do they look amazing! (Rise of the Tomb Raider and Elder Scrolls Online come to mind.)
> 
> Scaling has improved massively in the latest Windows 10 release compared to the first. It works perfectly with native Windows components that are supported: Windows Explorer, Word, Excel, Internet Explorer... But as soon as you open an RDP or Citrix session or any other non-native Windows component (MMC consoles such as Event Viewer), it's all unoptimized scaling and your fonts look like garbage or doubled. In the latest versions of Windows 10 you can force "no DPI scaling" per program, so that it always runs unscaled (at native res) while the rest of the OS is still scaled. But for most programs that require active reading, 27" at 4K is not workable even if you can see it perfectly; ~163 ppi is just too much strain on the eyes for continued use. It works perfectly, though, for monitoring tools like RealTemp, Afterburner, ...
> 
> That being said, purely for gaming I will not hesitate to buy this monitor when it comes out, IF and only IF it has low input lag.


All G-Sync monitors have low input lag.


----------



## boredgunner

Quote:


> Originally Posted by *tekjunkie28*
> 
> I'm assuming that's because games aren't "graphically optimized" for 4k yet.


There's no such thing. Pixels are pixels. The only thing game developers have to adjust for is HUD scaling.

The only factors here are the user's eyesight and their distance from the monitor.
Quote:


> Originally Posted by *sblantipodi*
> 
> OLED is simply not ready for PC monitors.
> Phones have just proved it.
> 
> They have improved the burn-in problem, but these displays are still affected by it.
> OLED may be fine for phones: a few hours of use per day and fast replacement cycles (who keeps a smartphone for more than 2 years now?).
> That's not true for PC monitors. PC users tend to keep a monitor for more than 2 years, use it for many hours a day, and stare at the same Windows taskbar the whole time.
> That causes burn-in, and OLED monitors lose quality far faster than IPS ones do.
> 
> I don't want OLED for my PC, but please stop talking about OLED here; this is another thread.


Basically a strawman argument, which Leopardi pointed out already. OLED in phones (RGB OLED), with its complete lack of anti-retention features AND its planned-obsolescence release model, hardly has anything to do with LG white OLED TVs, which have multiple highly effective, proven features to prevent retention and burn-in. But you will use this strawman argument forever.
Quote:


> Originally Posted by *FearlessBelgian*
> 
> I really think Micro-led are the future, not OLED.
> 
> Oled have a big problem with burn-in issues and this is inherent to the technology (Organic diodes lose power over time and do not age in the same way...)
> 
> Oled are really not for video games, really not...


microLED probably (hopefully) is the future, but you are also exaggerating the issue; check out LG's 2016 and 2017 models and see what it actually takes to cause retention (people run tests just for this). Extreme use causes retention, and burn-in would never occur for a gamer.

But I'll tell you what's not for video games: LCD. It's also not good for content creation or any color critical work. Good for nothing.


----------



## superstition222

Quote:


> Originally Posted by *boredgunner*
> 
> There's no such thing.


Graphical optimization can mean using low-res models and just upscaling them.

Also burn-in and retention are just one problem. Another is the aging of blue subpixels, causing a loss of color range and/or intensity.
Quote:


> Originally Posted by *subtec*
> 
> You assume wrong. _Micro_LED is a still in-development technology that's one LED per subpixel, just like OLED. You can think of it as inorganic LED to the organic LED of OLED. Like OLED, it's totally independent of LCD.


That's nice. Let me know when it's commercially available. Currently, LED means backlights.

I've been waiting for years for CPUs built on GaInAs nanowires.


----------



## subtec

Quote:


> Originally Posted by *superstition222*
> 
> That's nice. Let me know when it's commercially available. Currently, LED means backlights.


Well, the discussion was about future technologies. Also, OLED is an LED technology (hence the name), and is current, yet doesn't involve backlighting.


----------



## superstition222

Quote:


> Originally Posted by *subtec*
> 
> Well, the discussion was about future technologies. Also, OLED is an LED technology (hence the name), and is current, yet doesn't involve backlighting.


Yes for _O_LED that's true.

OLED isn't just future tech. Its issues with blue-subpixel lifespan have been well known for many years; that's why the white subpixel was introduced. Complaints about retention have come up in particular around 2016 LG televisions.

An inorganic LED without an LCD layer sounds exciting. Let's hope it pans out.


----------



## profundido

Quote:


> Originally Posted by *CallsignVega*
> 
> All G-Sync monitors have low input lag.


I was going to say "not true", but then I realized that without details we're just comparing subjective standards of low input lag that exist only in our minds and differ between us. Comparing 3 of my screens, all of which have G-Sync: one feels super fast, one medium but still fast, and the last feels so slow I can no longer play games on it. In detail:

My *Asus PG278Q* G-Sync screen feels super fast, like no input delay at all. As fast as I think and flick the mouse to look in a different direction, the screen shows it! This keeps amazing me, as it gives me the sensation that my mind, arm, mouse, computer and screen are operating as one unit, not separate parts.

My *Asus PG27AQ* G-Sync screen feels slow and sluggish now. It feels like I move the mouse and the screen just follows... later! Being used to the PG278Q, I expect the result of my action to appear at the exact moment of the action, but it doesn't. I have to wait through this delay and it frustrates me constantly. It feels like a night-and-day difference in fast (looking-around) movement.

My *ViewSonic XG2703-GS* G-Sync screen comes close to the PG278Q, but still not quite the same. It feels fast enough not to frustrate, and is playable with all the beautiful color benefits.

As a reference standard, and to put things into perspective, both the official vendor-published input lag values and the "total input lag" measured by independent reviewers are widely available on the net. I realize that no IPS/VA screen such as the last two I mentioned can ever match the low input lag of the fast TN screen I mentioned first, so that cannot be hoped for in the upcoming *Asus PG27UQ*, but the ViewSonic proves to me every day that the best of both worlds IS possible, and that's exactly what I'm hoping for in the upcoming *Asus PG27UQ*. If its input lag is going to be anything like the PG27AQ's, I won't be buying it for sure.

I know you're very knowledgeable about monitors, so maybe you have an idea of what we can expect from this upcoming monitor?


----------



## CallsignVega

You are comparing a 4K @ 60 Hz monitor against 144 Hz 1440p monitors. Of course the monitor with the much lower refresh rate isn't going to feel as snappy; frames aren't updated as often, which increases input lag.

Still, measured 5.9 ms input lag is basically nothing for 60 Hz:

https://pcmonitors.info/reviews/asus-pg27aq/


----------



## profundido

Quote:


> Originally Posted by *CallsignVega*
> 
> You are comparing a 4K @ 60 Hz monitor against 144 Hz 1440p monitors. Of course the monitor with the much lower refresh rate isn't going to feel as snappy; frames aren't updated as often, which increases input lag.
> 
> Still, measured 5.9 ms input lag is basically nothing for 60 Hz:
> 
> https://pcmonitors.info/reviews/asus-pg27aq/


I used to think it was just the 60 Hz too, until I decided to set both 144Hz screens to 60Hz and compare them. That's when I realized the delay I felt was not due to the maximum refresh rate.


----------



## CallsignVega

It was objectively measured by a quality review site to have low input lag, that is all I can say.


----------



## toncij

This display has been cancelled?


----------



## profundido

Quote:


> Originally Posted by *toncij*
> 
> This display has been cancelled?


Really?? Got a link to the official announcement? I must have missed that completely.


----------



## profundido

Quote:


> Originally Posted by *tekjunkie28*
> 
> Well the most modern game I have is either endless space 2 or. Civ 6.
> I play
> Stellaris
> Eu4
> Endless space 2
> Ck2
> Cities skylines
> Farm simulator
> 
> I do play shooters every now and then. I have a lot of there games but that's the ones I typically play right now. The 144hz can really be felt across all games. It's really nice is FPS but also when scrolling and zooming in RTS games. I bet 8k would look good on gta5 put I normally play that on ps4 just because my friends are lame and don't play on pc lol.
> 
> BTW not to derail the thread but is wow just badly coded? I can't seem to break 100 fps in it. I don't play it anymore but used it for testing purposes and was surprised by that.
> 
> Sent from my SM-N950U using Tapatalk


I love the 144Hz for scrolling in StarCraft 2 as well. You're right, much smoother.

I downloaded WoW just for testing recently as well, since I reworked my machine with the 8700K CPU.

As usual in MMOs, where the inter-character graphics work typically has to run on a single thread because of the dependencies, the fps in towns and crowded areas is bottlenecked by your CPU's maximum single-core overclock, not the graphics card. Hence in those areas you won't get even close to 100 fps, while you'll see almost no load on the graphics card during that time.

Outside towns, or wherever you're alone, it's pure load on the graphics card up to the maximum frequency supported by your display. But remember there's a default fps limit (a slider in the "advanced options" tab of the graphics settings) that you need to raise first to be able to test anything higher than 100 fps. Hope that answers your question.
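The bottleneck argument above can be sketched with a toy model: CPU and GPU work overlap, so each frame costs roughly max(single-thread CPU time, GPU time), and once the CPU's per-frame cost dominates, a faster GPU changes nothing. The millisecond figures below are made up purely for illustration:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy frame-rate model: frame time is whichever of the
    (overlapping) CPU and GPU workloads takes longer."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Crowded town: heavy single-threaded simulation, light GPU load.
print(round(fps(cpu_ms=12.0, gpu_ms=4.0)))   # -> 83 (GPU mostly idle)

# Empty zone: simulation is cheap, the GPU is the limit.
print(round(fps(cpu_ms=3.0, gpu_ms=7.0)))    # -> 143

# A faster GPU doesn't help the town case at all.
print(round(fps(cpu_ms=12.0, gpu_ms=2.0)))   # -> 83
```

That's why the fps cap in cities follows single-core clock speed rather than graphics settings.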


----------



## sblantipodi

Quote:


> Originally Posted by *Leopardi*
> 
> Phones aren't even comparable to the LG OLED's of past years when it comes to burn in.


OLED isn't ready for PC monitors, and the burn-in problems are still there.
As I said, this is a thread about an IPS monitor; OLED is simply not ready.


----------



## FearlessBelgian

Quote:


> Originally Posted by *sblantipodi*
> 
> OLED isn't ready for PC monitors, and the burn-in problems are still there.
> As I said, this is a thread about an IPS monitor; OLED is simply not ready.


Not sure if OLED will be ready someday...


----------



## kot0005

65incher..where r the freaking monitors that were announced last year?!? hello ?!?


----------



## MistaSparkul

Quote:


> Originally Posted by *kot0005*
> 
> 65incher..where r the freaking monitors that were announced last year?!? hello ?!?


They'll come out THIS year, while these newly announced ones will come out NEXT year.


----------



## toncij

Quote:


> Originally Posted by *MistaSparkul*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kot0005*
> 
> 65incher..where r the freaking monitors that were announced last year?!? hello ?!?
> 
> 
> 
> They'll come out THIS year while these newly announced ones will come out NEXT year
Click to expand...

It would not be the first time Asus just gave up on a display. It happened with a 32-incher last year. Dell did the same with an OLED... Asus is just selling hype, but can't actually produce the products it promised.


----------



## Kommando Kodiak

sick post


----------



## FearlessBelgian

Q3? LOL...

$3,000 and Q3... What a joke...


----------



## Kommando Kodiak

I'm gonna go ahead and go with April, since that's what was cited in the Swedish preorder. Q3 is July, August, September for reference btw


----------



## sblantipodi

Quote:


> Originally Posted by *FearlessBelgian*
> 
> Not sure if OLED will be ready someday...


Same here; OLED is only good for non-durable displays.


----------



## FearlessBelgian

Quote:


> Originally Posted by *Kommando Kodiak*
> 
> I'm gonna go ahead and go with April since thats what was cited in the swedish preorder. Q3 is april may june for reference btw


Q3 is not April.

Q3 is July, August, September.


----------



## toncij

Quote:


> Originally Posted by *Kommando Kodiak*
> 
> http://edgeup.asus.com/2017/rog-pg27uq-intro/ "With a high resolution, high refresh rate, and phenomenal picture, the ROG Swift PG27UQ is versatile enough for gaming, entertainment, and content creation. It's scheduled to arrive in North America in Q3."


Yes, Q3 2017. It's Q2 2018 today; that's 2+ quarters, or 7 months, late with no sign of it showing up.


----------



## Kommando Kodiak

kk ill fix it , i blame the cold


----------



## Kommando Kodiak

Quote:


> Originally Posted by *toncij*
> 
> Yes, Q3 2017. It's Q2 2018 today; that's 2+ quarters, or 7 months, late with no sign of it showing up.


I swear it said Jan 4, 2018. I double-checked even before I posted. Guys, I'm sorry; 100% blaming this cold.


----------



## Excession

Quote:


> Originally Posted by *kot0005*
> 
> 65incher..where r the freaking monitors that were announced last year?!? hello ?!?


Maybe now the people moaning about 27" being too small for 4K will stop cluttering up the threads for high-DPI displays...


----------



## MistaSparkul

Quote:


> Originally Posted by *Excession*
> 
> Maybe now the people moaning about 27" being too small for 4K will stop cluttering up the threads for high-DPI displays...


Dude, it's not that we don't want high-DPI displays. We want a better balance of size and dpi. Most of us agree that a 32-inch version would be perfect in terms of size/dpi balance. I don't think 140 ppi is too little. Do you?


----------



## kot0005

Quote:


> Originally Posted by *Kommando Kodiak*
> 
> http://edgeup.asus.com/2017/rog-pg27uq-intro/ "With a high resolution, high refresh rate, and phenomenal picture, the ROG Swift PG27UQ is versatile enough for gaming, entertainment, and content creation. It's scheduled to arrive in North America in Q3."


dude..the article was posted in January 2017 lol..

Are these monitors even at CES ??


----------



## Excession

Quote:


> Originally Posted by *MistaSparkul*
> 
> I don't think 140 ppi is too little. Do you?


Yes. The only mass-market displays with satisfactory PPI are phones. Since monitors are so much lower, I see no reason to settle for small incremental improvements when we need drastic ones. 27" is already a compromise.

I'll probably go for 32" at 8K, though.


----------



## MistaSparkul

Quote:


> Originally Posted by *Excession*
> 
> Yes. The only mass-market displays which have satisfactory PPIs are found on phones. Since monitors are so much lower, I see no reason to settle for small incremental improvements when we need drastic ones. 27" is already a compromise.
> 
> I'll probably go for 32" at 8K, though.


8K at 32 inches is still nowhere near that satisfactory phone ppi. And besides, 600 ppi on a PC monitor? Unless you're planning to use it 2 inches away from your face, that's kinda pointless.


----------



## Malinkadink

Quote:


> Originally Posted by *MistaSparkul*
> 
> 8K at 32 inches is still nowhere near that satisfactory phone ppi. And besides, 600 ppi on a PC monitor? Unless you're planning to use it 2 inches away from your face, that's kinda pointless.


Yeah, I'm on 122 ppi at a 2-foot distance and I'm pretty content with this, but I do want a little more, as jaggies can still be seen even with some AA. I think I'd be good at 150 ppi, and could even go as high as 200, but beyond that, at normal viewing distances relative to display size, I couldn't be bothered anymore and would rather they push the Hz over more resolution.
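Whether a given ppi is "enough" really depends on viewing distance too; the combined measure is pixels per degree of visual angle. A rough sketch (the distances are just the ones mentioned in this thread, and any "good enough" threshold is a rule of thumb, not a hard number):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle: one degree
    spans 2 * d * tan(0.5 deg) inches at viewing distance d."""
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

print(round(pixels_per_degree(122, 24)))  # 122 ppi at 2 ft -> 51 px/deg
print(round(pixels_per_degree(163, 24)))  # 27" 4K at 2 ft  -> 68 px/deg
```

Which is why the same panel can look razor sharp on a desk and merely okay when you lean in.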


----------



## ryan92084

Quote:


> Originally Posted by *Malinkadink*
> 
> Yeah, I'm on 122 ppi at a 2-foot distance and I'm pretty content with this, but I do want a little more, as jaggies can still be seen even with some AA. I think I'd be good at 150 ppi, and could even go as high as 200, but beyond that, at normal viewing distances relative to display size, I couldn't be bothered anymore and would rather they push the Hz over more resolution.


Yeah, this Surface Book is 267 ppi, and even at lap distance it is sometimes a bit much. 150-200 ppi is probably a reasonable maximum for a monitor outside of specific use cases.


----------



## Pokiehat

I have a Surface Pro 4, which is 267 ppi too. Text is amazing on it at all sizes. I used to think Windows had inherently bad font rendering until I saw the Surface screen. The ppi makes text legible at arm's reach even with no scaling.

Maybe it's a bit much, but the trend in desktop monitors is more toward making the diagonal larger while sticking to standard resolutions and aspect ratios (i.e. 1080p, 1440p, 4K at 16:9). That leaves gaps where the ppi gets so low it's retch-worthy, i.e. 27" 1080p.

Mobile devices tend to have screens that are comfortable to hold or view within arm's reach, so the form factor is way more flexible and more grounded in ergonomics. The Surface Pro display does have backlight bleed, but mine is fairly minimal. If it were 24" or bigger and available as a desktop display at even half the ppi, it would be the best 60Hz monitor I have ever owned.


----------



## toncij

220 PPI here - 5K on 27". And it's fine. Now give me 144Hz here and I'm happy.


----------



## sblantipodi

Quote:


> Originally Posted by *toncij*
> 
> 220 PPI here - 5K on 27". And it's fine. Now give me 144Hz here and I'm happy.


What's the point of 5K on a 27-inch?
Are there any advantages?


----------



## CallsignVega

I had the 5K 27" Dell. It was a bit of wasted PPI at normal sitting distance. ASUS just announced a 32" 5K monitor, a size more befitting, I think.


----------



## MistaSparkul

Quote:


> Originally Posted by *CallsignVega*
> 
> I had the 5K 27" Dell. Was a bit of wasted PPI at normal sitting distance. ASUS just announced a 32" 5K monitor, a size more befitting I think.


Link? I'm definitely interested in a 32 inch 5k


----------



## CallsignVega

Quote:


> Originally Posted by *MistaSparkul*
> 
> Link? I'm definitely interested in a 32 inch 5k


Whoops, never mind; the guy in the video said the size wrong. It's the same old 27" 5K panel.


----------



## Sancus

Did Asus/Acer/Nvidia forget about these monitors? They're only talking about their silly 65" thing at CES. Are they going to slip again, or just cancel altogether? Pretty disappointing.


----------



## ESRCJ

Quote:


> Originally Posted by *Sancus*
> 
> Did Asus/Acer/Nvidia forget about these monitors? They're only talking about their silly 65" thing at CES. Are they going to slip again, or just cancel altogether? Pretty disappointing.


It's extremely disappointing. I've been stuck on 1440p since 2012 (16:9 from 2012-2016, 21:9 from 2016-present) and I'm ready for 4K with high refresh rates. First they tease us a year ago, then we get that terrible delay announced in August for Q1 2018, and now here we are at CES 2018 with absolutely no mention of those displays... I have tech blue balls at this point.


----------



## toncij

They've probably failed to get good yields on those panels, so they'll silently cancel them like Dell did with the OLED.


----------



## Kommando Kodiak

These guys https://pcgamesn.com/nvidia-gsync-hdr-release-date said they saw them at the Asus booth, so I asked several personalities at CES to confirm it and got completely ignored. Maybe they've got some announcement planned for today instead, or something; who knows.


----------



## kot0005

Quote:


> Originally Posted by *Kommando Kodiak*
> 
> These guys https://pcgamesn.com/nvidia-gsync-hdr-release-date said they saw them at the Asus booth or w/e, So I asked several personalities at ces to confirm it and got completely ignored, maybe theyve got some announcement planned for today instead or something who knows.


Probably clickbaiting... we already knew they were supposed to be out in Q1 and Q2; that's not even news.


----------



## CallsignVega

Quote:


> Originally Posted by *kot0005*
> 
> prob click baiting..we already know they were supposed to be out in Q1 and Q2 not even new news.


Well, if true, it is good to have recent confirmation. These types of displays have crazy long development timelines, or they get silently cancelled.


----------



## Nammi

The Acer one is now listed on 2 sites here in Sweden. Not looking good price-wise though...
https://www.prisjakt.nu/produkt.php?p=4642799


----------



## kot0005

Quote:


> Originally Posted by *Nammi*
> 
> The acer one is now listed on 2 sites here in Sweden. Not looking good price wise though...
> https://www.prisjakt.nu/produkt.php?p=4642799


Acer has 2 versions of the 27-inch. The X27 will be expensive because it has eye-tracking tech and other stuff.

Just buy 1 Ethereum. By the time the monitor is out, you can probably buy 3.


----------



## Nammi

Quote:


> Originally Posted by *kot0005*
> 
> Acer has 2 versions of the 27-inch. The X27 will be expensive because it has eye-tracking tech and other stuff.
> 
> Just buy 1 Ethereum. By the time the monitor is out, you can probably buy 3.


Ooh, I wasn't aware of the other version. It's time I did my eyes a favor after using strobed displays for years... 25k SEK is ~$3k; while not out of reach, I'd certainly prefer the one stripped of "features".


----------



## Kommando Kodiak

Remember, everything in Europe will be more expensive because of higher taxes (for businesses and individuals, like environment/disposal fees), plus the conversion rates and the VAT tacked on as well. Stateside, a maximum price of $2K is my guess, but I'm really holding out for $1,500.

clarification: Oh yeah, and the X27 will be more expensive than the Asus because of the eye tracker and flaps. I myself am hoping the Asus will be $1,500.


----------



## toncij

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nammi*
> 
> The acer one is now listed on 2 sites here in Sweden. Not looking good price wise though...
> https://www.prisjakt.nu/produkt.php?p=4642799
> 
> 
> 
> acer has 2 versions of the 27inch. X27 will be expensive because it has eye tracking tech and other stuff.
> 
> Just buy 1 Ethereum. By the time the monitor is out, you can probably buy 3.
Click to expand...

If Eth doesn't drop like BTC did... then you can buy 75% of one.


----------



## Mrip541

Monitors that offer anything new have universally been vaporware for 2 years. I'll believe it when I see it.


----------



## Kommando Kodiak

Quote:


> Originally Posted by *Glerox*
> 
> For those wondering if Asus cancelled their PG27UQ because no one speaks about it at CES 2018, I just spotted it in a video of Asus's booth (in the background on the right)
> 
> 
> 
> There is still hope to see this monitor one day


----------



## wrath663

The YouTube channel OC3D actually checked up on the monitors, they are at the show but no new info.

Starts around the 12 min mark


----------



## CallsignVega

Quote:


> Originally Posted by *wrath663*
> 
> The YouTube channel OC3D actually checked up on the monitors, they are at the show but no new info.
> 
> Starts around the 12 min mark


35" VA, so it sounds like the same panel all the other 35" VAs are using. Kinda significant smear on that panel; I think the 200 Hz is a gimmick, like it was on the Acer Z35.


----------



## Kommando Kodiak

You guys wanna talk planned setups involving this monitor? I wanna gush, but only if you guys care to hear it; I don't want it to be a bother.


----------



## animeowns

Quote:


> Originally Posted by *l88bastar*
> 
> 8k is awesome.....because it means 4k120 should be available
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know how people say 1440p is a sweet spot, I always see the jaggies with it and AA only makes the picture blurrier.
> 
> 4k is where my eyes become less annoyed with pixel jaggies and is the sweet spot for me....but I prefer a 32" not 27"


Not exactly. The Dell 8K panel I used had 2 DisplayPort 1.4 ports and it never went over 60Hz, and you can't even set a custom resolution when using lower resolutions like 4K on it.


----------



## animeowns

Quote:


> Originally Posted by *CallsignVega*
> 
> 35" VA, so sounds like the same panel all the other 35" VA are using. Kinda significant smear on that panel, I think the 200 Hz is a gimmick like 200 Hz was on the Acer Z35.


It's not a gimmick; there is a difference in motion smoothness going from 144 to 200Hz. But I'd say it's best to wait and see if it has any QC issues before buying. Me personally, I'm waiting on the PG65 or the Asus PG27UQ to replace my 240Hz.


----------



## Sancus

Quote:


> Originally Posted by *animeowns*
> 
> its not a gimmick there is a difference in the smoothness motion going from 144 to 200hz but I would say best to wait to see if it has any QC issues before making a buy me personally I'm waiting on the pg65 or the asus pg27uq to replace my 240hz


This isn't true, the Z35 actually LOOKED WORSE at 200hz than 144hz, because of distracting overshoot and ghosting. When your pixels don't transition fast enough for the refresh rate, you end up forcing pixels to transition a 2nd time before they've even finished the first one, and it makes a huge mess.

That said, I think 3440x1440 @ 144hz with HDR is still great. If they can make marginal improvements and get the PG35VQ panel to acceptable at 144hz instead of the 100hz of the older 35" 21:9 screens, I think that would still make it a great display.
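To put rough numbers on the transition-vs-refresh problem described above, here's a back-of-envelope sketch. The ~10 ms figure for a slow VA dark transition is an illustrative assumption, not a measurement of any specific panel:

```python
# Refresh interval vs. pixel response: if a transition outlasts the frame,
# the panel is forced to start a second transition before finishing the first.

def frame_time_ms(refresh_hz):
    """Duration of one refresh interval in milliseconds."""
    return 1000.0 / refresh_hz

va_response_ms = 10.0  # assumed worst-case dark VA transition, illustrative only

for hz in (100, 144, 200):
    ft = frame_time_ms(hz)
    fits = "fits" if va_response_ms <= ft else "spans multiple refreshes"
    print(f"{hz:3d} Hz -> {ft:5.2f} ms per frame; a {va_response_ms:.0f} ms transition {fits}")
```

Under that assumption the transition just fits at 100 Hz (10 ms per frame) but spans multiple refreshes at 144 and 200 Hz, which is exactly the overshoot/ghosting mess being described.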


----------



## Malinkadink

Quote:


> Originally Posted by *Sancus*
> 
> This isn't true, the Z35 actually LOOKED WORSE at 200hz than 144hz, because of distracting overshoot and ghosting. When your pixels don't transition fast enough for the refresh rate, you end up forcing pixels to transition a 2nd time before they've even finished the first one, and it makes a huge mess.
> 
> That said, I think 3440x1440 @ 144hz with HDR is still great. If they can make marginal improvements and get the PG35VQ panel to acceptable at 144hz instead of the 100hz of the older 35" 21:9 screens, I think that would still make it a great display.


Why settle for 3440x1440 when you can squeeze a bit more pixels out and still get 21:9. DP 1.4 is capable of running 3840x1600 144hz and you can increase the monitor size from 34 to 38 inches too and still keep the same 110~ pixel density.
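A quick sanity check of the numbers in the post above. This ignores blanking overhead, which adds some headroom pressure in practice, but the raw payload does fit within DP 1.4's 25.92 Gbit/s HBR3 data rate:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 34" 3440x1440 and 38" 3840x1600 have nearly identical pixel density:
print(round(ppi(3440, 1440, 34), 1))  # 109.7
print(round(ppi(3840, 1600, 38), 1))  # 109.5

# Uncompressed payload for 3840x1600 @ 144 Hz, 8 bpc RGB, no blanking:
gbit_s = 3840 * 1600 * 144 * 24 / 1e9
print(round(gbit_s, 1))  # 21.2 -- under DP 1.4's 25.92 Gbit/s HBR3 payload
```

So the "~110 PPI at 38 inches" claim checks out almost exactly.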


----------



## CallsignVega

Quote:


> Originally Posted by *Malinkadink*
> 
> Why settle for 3440x1440 when you can squeeze a bit more pixels out and still get 21:9. DP 1.4 is capable of running 3840x1600 144hz and you can increase the monitor size from 34 to 38 inches too and still keep the same 110~ pixel density.


Totally agree. 3440x1440 @ 34/35 inch is "meh".


----------



## animeowns

Quote:


> Originally Posted by *Sancus*
> 
> This isn't true, the Z35 actually LOOKED WORSE at 200hz than 144hz, because of distracting overshoot and ghosting. When your pixels don't transition fast enough for the refresh rate, you end up forcing pixels to transition a 2nd time before they've even finished the first one, and it makes a huge mess.
> 
> That said, I think 3440x1440 @ 144hz with HDR is still great. If they can make marginal improvements and get the PG35VQ panel to acceptable at 144hz instead of the 100hz of the older 35" 21:9 screens, I think that would still make it a great display.


Oh no, I meant just the refresh rate in general, not that one display. At least on my 240Hz display I can notice a difference compared to the PG279Q when testing 144Hz vs 200Hz; on the 1920x1080 240Hz panels you can run 200Hz just fine.


----------



## Sancus

Quote:


> Originally Posted by *animeowns*
> 
> Oh no, I meant just the refresh rate in general, not that one display


Right but Vega was saying 200hz is a gimmick on these VA panel monitors specifically, because of their response time issues. In general, yes, 200hz+ is great, but not with slow VA panels and we've yet to see an Ultrawide VA panel fast enough for 144hz let alone 200hz.


----------



## Zenairis

As I've stated in some other threads Acer literally just updated the X27 page yesterday. I'd imagine it's closer to release now.


----------



## HiBillyMaysHere

Any ideas or guesses on price?


----------



## FearlessBelgian

Quote:


> Originally Posted by *HiBillyMaysHere*
> 
> Any ideas or guesses on price?


Minimum 2000€.


----------



## Zenairis

I'll warn you before you look at this: you don't want to see the price. This is the X27, its brother monitor by Acer.

https://www.komplett.no/product/978567/gaming/skjerm-tilbehoer/skjermer/acer-27-predator-4k-led-g-sync-x27#

Current sources are pointing to an early April release.

I'll just put it this way: it's looking to be a $2500~3000 USD monitor.


----------



## FearlessBelgian

Quote:


> Originally Posted by *Zenairis*
> 
> I'll warn you before you look at this you don't want to see the price and this is the X27 it's brother monitor by Acer.
> 
> https://www.komplett.no/product/978567/gaming/skjerm-tilbehoer/skjermer/acer-27-predator-4k-led-g-sync-x27#
> 
> Current sources are pointing to an early April release.
> 
> I'll just put it this way it's looking to be a $2500~3000 USD Monitor.


Ouch...

2584€...


----------



## guttheslayer

Quote:


> Originally Posted by *CallsignVega*
> 
> Totally agree. 3440x1440 @ 34/35 inch is "meh".


I'd rather they keep the screen size and go for denser PPI with 3840x1600.

Then again one can only hope.


----------



## CallsignVega

Quote:


> Originally Posted by *guttheslayer*
> 
> I rather they keep the size screen and go for denser PPI for 3840x1600.
> 
> Then again one can only hope.


Yup, that would be ideal. 3440x1440 34" is just a widened 27" 1440p monitor.

https://www.elgiganten.se/product/datorer-tillbehor/bildskarm/ACPREDX27/acer-predator-x27-27-4k-uhd-bildskarm-gaming

Ya, kinda strange for these retailers to list something not releasing for 4 months eh.


----------



## Malinkadink

Quote:


> Originally Posted by *CallsignVega*
> 
> Yup, that would be ideal. 3440x1440 34" is just a widened 27" 1440p monitor.
> 
> https://www.elgiganten.se/product/datorer-tillbehor/bildskarm/ACPREDX27/acer-predator-x27-27-4k-uhd-bildskarm-gaming
> 
> Ya, kinda strange for these retailers to list something not releasing for 4 months eh.


I don't think its strange at all, they're testing the waters so to speak to see just how much interest there is in these monitors at the price they're currently suggesting. If they're content with the amount of preorders they receive then they know people are willing to pay what they're asking, otherwise they'll adjust the price accordingly closer to release.


----------



## animeowns

Quote:


> Originally Posted by *CallsignVega*
> 
> Yup, that would be ideal. 3440x1440 34" is just a widened 27" 1440p monitor.
> 
> https://www.elgiganten.se/product/datorer-tillbehor/bildskarm/ACPREDX27/acer-predator-x27-27-4k-uhd-bildskarm-gaming
> 
> Ya, kinda strange for these retailers to list something not releasing for 4 months eh.


So that's like $3000 after converting from Swedish krona to USD?


----------



## mmms




----------



## CallsignVega

Quote:


> Originally Posted by *mmms*


YES! Looks like it is using a semi-gloss coating instead of that matte AR crap Asus loves.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Zenairis*
> 
> I'll warn you before you look at this you don't want to see the price and this is the X27 it's brother monitor by Acer.
> 
> https://www.komplett.no/product/978567/gaming/skjerm-tilbehoer/skjermer/acer-27-predator-4k-led-g-sync-x27#
> 
> Current sources are pointing to an early April release.
> 
> I'll just put it this way it's looking to be a $2500~3000 USD Monitor.


That must be a placeholder price. It'll be expensive, but upwards of $3000 for an IPS monitor... that would mean OLED monitors would be $10,000 for the same spec; very few people could afford them.

Edit: the most expensive high-refresh-rate monitor on Newegg is the Samsung C49HG90 at $1500 USD, so these ones will probably come in around $1500 - $2200 ish. Anything more would just be stupid; we are talking about AU Optronics here, and they have a pretty crappy rep.


----------



## Malinkadink

Quote:


> Originally Posted by *CallsignVega*
> 
> YES! Looks like it is using a semi-gloss coating instead of that matte AR crap Asus loves.


No, it's matte, probably the same type of matte used on the 1440p 144Hz AHVA monitors. At that angle it will appear semi-glossy; even my Dell S2417DG looks semi-glossy at that kind of angle.


----------



## CallsignVega

Quote:


> Originally Posted by *Malinkadink*
> 
> No its matte, probably same type of matte used on the 1440p 144hz AHVA monitors. At that angle it will appear semi glossy, even my Dell S2417DG looks semi gloss at that kind of angle.


No, what I'm referring to is that it doesn't have that garbage super-matte sparkle crap like on old IPS panels and modern TN panels. The Dell S2417DG, which I've had, doesn't have it.


----------



## MistaSparkul

Quote:


> Originally Posted by *CallsignVega*
> 
> No what I am referring to is it doesn't have that garbage super matte sparkle crap like on old IPS panels and modern TN panels. The Dell S2417DG which I've had doesn't have it.


Wouldn't that still make it matte and not semi glossy? The only real semi glossy monitor I can think of is the Eizo FG2421, the Dell S2417DG is still matte, just not "as matte" as the PG278QR.


----------



## CallsignVega

There are dozens of grades of AR film, so they cannot all fit into three categories. And there are no objective tests of this monitor to say it has the same AR film as a Dell S2417DG. The main point is that the sparkle matte garbage AR film would not have the side-view reflectivity seen in the video.


----------



## Pokiehat

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> these ones will probably come in around $1500 - $2200 ish, anything more would just be stupid, we are talking about AU Optronics here, they have a pretty crappy rep.


AUO doesn't have a crappy rep, though. They are a global top-3 TFT LCD manufacturer. They dominate small-form-factor consumer TFT LCD displays, a market the South Korean giants LG and Samsung seem unable or unwilling to compete in. Let that sink in for a moment, because AUO was a late entrant to large-scale TFT LCD manufacturing. It's incredible they weren't acquired or simply crushed by the competition. Remember, the LG 32GK850G-B has a high-refresh-rate VA panel manufactured by AUO, and it has none of the problems associated with M270. Arguably LG's best gaming monitor has a panel designed and manufactured by AUO.

It's the M270 panel that seems to have problems. I think this panel is just difficult to manufacture, with low yield rates. It is not sold in large quantities to big OEM service providers, but rather in small quantities to gaming-brand customers, so the economies of scale are not there.

These gaming-brand customers in turn cater to a demographic of overwhelmingly young people who do not use the product in a professional capacity and are extremely averse to spending money. It is unlike professional demographics; those guys have no problem dropping $10K+ per 23" Sony reference monitor. So you combine all of this with Nvidia's G-Sync tax, and it feels like M270 price per unit has to soar or quality standards must be relaxed. Over 2 years it seems like both are happening.


----------



## Morkai

Quote:


> Originally Posted by *Pokiehat*
> 
> Its incredible they weren't acquired or simply crushed by the competition.


This can be explained by the fact that AUO was formed by merging Acer Display and Unipac Optoelectronics Corporation, meaning Acer was a major owner and probably wanted to take the path they did rather than sell. Acer Display was formed in the '90s, so they were not exactly a new player on the display market, but more or less veterans, backed by Acer and others.
(The name is Acer Unipac Optoelectronics, although they only use the abbreviation.)

Back in those days Acer was known mainly for making cheap junk, which might be why they let AUO be disassociated from the Acer name.

These days they at least have a reputation for by far the best LCD response times, even though the quality otherwise isn't top notch; it's a tradeoff I'd take every time.


----------



## ToTheSun!

Quote:


> Originally Posted by *MistaSparkul*
> 
> Wouldn't that still make it matte and not semi glossy? The only real semi glossy monitor I can think of is the Eizo FG2421, the Dell S2417DG is still matte, just not "as matte" as the PG278QR.


The coating on the FG2421 was supreme. I still prefer TV style semi-glossy, but that thing was much better than any other matte monitor around today.


----------



## MistaSparkul

Quote:


> Originally Posted by *ToTheSun!*
> 
> The coating on the FG2421 was supreme. I still prefer TV style semi-glossy, but that thing was much better than any other matte monitor around today.


Yeah I wish display makers would either stop using such heavy coatings or at least give us a choice for a glossy option.


----------



## Morkai

Quote:


> Originally Posted by *MistaSparkul*
> 
> Yeah I wish display makers would either stop using such heavy coatings or at least give us a choice for a glossy option.


Yeah, the matte dominance is so weird. I completely get it for offices or public spaces with daylight leaking in, but for home consumer products?!
Logically, they could make all sub-$300 monitors matte, as maybe in that range the target demographic might not be able to afford curtains (??),
and all premium monitors glossy, since if you can afford them, you can most likely also afford curtains.

Matte vs glossy is one of the most serious first-world problems, really.


----------



## boredgunner

Quote:


> Originally Posted by *Morkai*
> 
> Yeah the matte dominance is so weird. I completely get it for offices or public spaces with daylight leaking in, but for home consumer products?!
> Logically, they could make all sub $300 monitors matte, as maybe in that range the target demographic might not be able to afford curtains (??)
> And all premium monitors glossy since if you can afford them, you can most likely also afford curtains.
> 
> Matte vs glossy is one of the most serious 1st world problems, really.


What makes it even weirder is that higher-end TVs are almost always glossy, so why aren't monitors, which are designed purely for entertainment, glossy too? Makes no sense. Glossy is infinitely better for entertainment use.

Anyway, I'm starting to get curious about the 35" 3440x1440 version of this monitor, just in case the PG27UQ/Predator X27 do end up in the $2500-3000 range (though I doubt they will in the US). I hope motion clarity on the 35" monitors isn't terrible at 120 Hz. I've been eager to replace my XB270HU for two years...


----------



## ESRCJ

The more I think about it, the less likely I see myself getting the PG27UQ. I'm currently using the PG348Q (3440x1440 100Hz) and 16:9 with higher pixel density seems more like a side-grade. If only there were a 5120x2160 100Hz display launching this year. That would be perfect and would be a monitor I could keep for 3-4 years without itching for yet another upgrade.


----------



## Malinkadink

Quote:


> Originally Posted by *gridironcpj*
> 
> The more I think about it, the less likely I see myself getting the PG27UQ. I'm currently using the PG348Q (3440x1440 100Hz) and 16:9 with higher pixel density seems more like a side-grade. If only there were a 5120x2160 100Hz display launching this year. That would be perfect and would be a monitor I could keep for 3-4 years without itching for yet another upgrade.


If it wasn't for HDR i would agree on it being a side grade but the HDR easily makes it an upgrade.


----------



## ESRCJ

Quote:


> Originally Posted by *Malinkadink*
> 
> If it wasn't for HDR i would agree on it being a side grade but the HDR easily makes it an upgrade.


That's true, the HDR is a big selling point and something I would like. Although, I would gladly take 5120x2160 100Hz without HDR over 3840x2160 144Hz with HDR.


----------



## CallsignVega

Yes HDR is a game changer. Just like G-Sync is. Combine G-Sync with 1000 nit HDR, you're on to something. Even if it is a crappy LCD panel.


----------



## kot0005

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes HDR is a game changer. Just like G-Sync is. Combine G-Sync with 1000 nit HDR, you're on to something. Even if it is a crappy LCD panel.


plus Qdot and FALD.


----------



## Hunched

HDR on LCD is nothing but a meme, even FALD HDR1000.


----------



## pez

Quote:


> Originally Posted by *Hunched*
> 
> HDR on LCD is nothing but a meme, even FALD HDR1000.


Sony's LCDs with FALD and HDR say 'Hi'. They are excellent sets, and I believe one of them was among the best performers of 2017 for its price range.


----------



## CallsignVega

Quote:


> Originally Posted by *kot0005*
> 
> plus Qdot and FALD.


Ya but I roll up FALD under HDR. An LCD isn't going to have proper HDR if it doesn't have FALD. As a matter of fact, my Dell UP2718Q arrives today to test out. I love me some HDR.


----------



## kot0005

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya but I roll up FALD under HDR. An LCD isn't going to have proper HDR if it doesn't have FALD. As a matter of fact, my Dell UP2718Q arrives today to test out. I love me some HDR.


post some photos when you get it


----------



## boredgunner

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya but I roll up FALD under HDR. An LCD isn't going to have proper HDR if it doesn't have FALD. As a matter of fact, my Dell UP2718Q arrives today to test out. I love me some HDR.


Eager to read your impressions. FALD will be less effective with IPS, but how much less?


----------



## sblantipodi

Quote:


> Originally Posted by *gridironcpj*
> 
> The more I think about it, the less likely I see myself getting the PG27UQ. I'm currently using the PG348Q (3440x1440 100Hz) and 16:9 with higher pixel density seems more like a side-grade. If only there were a 5120x2160 100Hz display launching this year. That would be perfect and would be a monitor I could keep for 3-4 years without itching for yet another upgrade.


please don't forget to post your impressions OLED man


----------



## Morkai

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya but I roll up FALD under HDR. An LCD isn't going to have proper HDR if it doesn't have FALD. As a matter of fact, my Dell UP2718Q arrives today to test out. I love me some HDR.


I saw a video of it and it looked like the dimming had huge input lag (as it's not the G-Sync HDR controller), but it will be interesting to hear.
It looked like it would be completely useless for anything gaming related... more or less only good for stills or maybe video.


----------



## sblantipodi

Quote:


> Originally Posted by *Morkai*
> 
> I saw a video of it and it looked like the dimming had huuge input lag (as it's not the g-sync hdr controller), but will be interesting to hear.
> It looked like it would be completely useless for anything gaming related.. more or less only good for stills or maybe video..


It could even be the camera; it's difficult to catch that on camera.


----------



## CallsignVega

So I just calibrated my exposure as best I could to match what I am seeing in real life. This is a black image with the Dell FALD:



And a stuck red pixel on an Ultrasharp too.


----------



## MistaSparkul

Quote:


> Originally Posted by *CallsignVega*
> 
> So I just calibrated my exposure as best I could to match what I am seeing in real life. This is a black image with the Dell FALD:
> 
> 
> 
> And a stuck red pixel on an Ultrasharp too.


This is what you're actually seeing in real life? Oh boy these 144hz gsync versions have some serious work to do, if they can even do anything about it in the first place.


----------



## ToTheSun!

Quote:


> Originally Posted by *MistaSparkul*
> 
> This is what you're actually seeing in real life? Oh boy these 144hz gsync versions have some serious work to do, if they can even do anything about it in the first place.


We saw blooming with compressed youtube videos way back in 2017, and it's likely that's how they'll be on release.

Really, our only shot at a good monitor is if someone takes that 22'' OLED (or something bigger in the meantime) and puts some HDMI2.1 on it.


----------



## ryan92084

Quote:


> Originally Posted by *CallsignVega*
> 
> So I just calibrated my exposure as best I could to match what I am seeing in real life. This is a black image with the Dell FALD:
> 
> 
> 
> And a stuck red pixel on an Ultrasharp too.


If that's what you are seeing looking straight on at a usable brightness it's pretty bad. Worse than my ~10 year old FALD Toshiba.


----------



## Malinkadink

Quote:


> Originally Posted by *CallsignVega*
> 
> So I just calibrated my exposure as best I could to match what I am seeing in real life. This is a black image with the Dell FALD:
> 
> 
> 
> And a stuck red pixel on an Ultrasharp too.


If that's 100% brightness, that's bad but expected; if that's 100 nits, then that's really bad. I assume it's not adjusted for exposure, so the camera makes it out worse than it really is, but for 384 zones on a 27-inch display that's really awful.


----------



## boredgunner

LCD should be banned.


----------



## Hunched

Quote:


> Originally Posted by *Hunched*
> 
> HDR on LCD is nothing but a meme, even FALD HDR1000.


----------



## CallsignVega

Quote:


> Originally Posted by *Malinkadink*
> 
> If that's 100% brightness thats bad but expected, if thats 100 nits, then thats really bad. I assume its not adjusted for exposure so the camera makes it out worse than it really is, but for 384 zones on a 27 inch display thats really awful.


I've stated I've adjusted the exposure to get it close to what I'm seeing in real life. It really is as bad as depicted. The FALD only works in HDR mode, which locks out the brightness controls. What you see is what you get. Outside of HDR mode it's just a regular ~350 nit 4K IPS panel, nothing impressive. Granted, bright-scene HDR content looks great, but if you play games or movies with any dark areas it looks atrocious.

LCD HDR = fail. I'll only ever concern myself with HDR using an OLED. 27" 4K looks great on the desktop but isn't very impressive in games. I'll be skipping these new FALD gaming displays.


----------



## Hunched

This is the start of something very bad.
New high-end monitors without FALD/HDR will become less common and harder to find.
And even though you can still use them as regular monitors, you'll still be paying the extra price for it.


----------



## ryan92084

Quote:


> Originally Posted by *CallsignVega*
> 
> I've stated I've adjusted the exposure to get it close to what I'm seeing in real life. It really is as bad as depicted. The FALD only works in HDR mode, which locks out the brightness controls. What you see is what you get. Outside of HDR mode it's just a regular ~350 nit 4K IPS panel, nothing impressive. Granted, bright-scene HDR content looks great, but if you play games or movies with any dark areas it looks atrocious.
> 
> LCD HDR = fail. I'll only ever concern myself with HDR using an OLED. 27" 4K looks great on the desktop but isn't very impressive in games. I'll be skipping these new FALD gaming displays.


Yeah, that's just bad all the way around then.
I guess they don't expect you to do desktop activities with HDR on, so the brightness would be less of an issue. But why lock local dimming to HDR only? Very lame.


----------



## MistaSparkul

Quote:


> Originally Posted by *Hunched*
> 
> This is the start of something very bad.
> New high end monitors without FALD/HDR will become less common and harder to find.
> And even though you can use still them as regular monitors you'll still be paying the extra price for it


Correct me if I'm wrong, but I believe AUO also has a non-FALD version of this panel, just a normal 4K 144Hz IPS panel. That would probably be more appealing, as it would be cheaper, so you wouldn't pay extra for a half-useless FALD implementation.


----------



## Hunched

Quote:


> Originally Posted by *MistaSparkul*
> 
> Correct me if I'm wrong but I do believe AUO also has a non FALD of this panel. Just a normal 4k 144hz IPS panel. That would probably be more appealing as it will be cheaper so you wont pay extra for a half useless FALD implementation.


Idk, I hope AUO and the rest of the panel manufacturers make the cheaper variants available, and that those putting them in monitors and selling them care enough to do so.
But I'm pretty sure Asus, Acer, all of them see HDR as a selling point. Not having HDR, as opposed to having it, isn't favorable ("why would anyone prefer not to have HDR?"); they want to tick all the boxes.

They're going to view selling a high-end monitor without HDR/FALD the same way as selling a high-end monitor without adaptive sync.
They're not going to think anything is unappealing about HDR like we do; this is going to be part of everything now, and it sucks.


----------



## Pokiehat

Quote:


> Originally Posted by *CallsignVega*
> 
> I've stated I've adjusted the exposure to get it close to what I'm seeing in real life. It really is as bad as depicted. The FALD only works in HDR mode, which locks out the brightness controls. What you see is what you get. Outside of HDR mode it's just a regular ~350 nit 4K IPS panel, nothing impressive. Granted, bright-scene HDR content looks great, but if you play games or movies with any dark areas it looks atrocious.
> 
> LCD HDR = fail. I'll only ever concern myself with HDR using an OLED. 27" 4K looks great on the desktop but isn't very impressive in games. I'll be skipping these new FALD gaming displays.


Damn. You just killed this monitor for me. And the OLED dream is still so far away...


----------



## Glerox

It's a shame, but honestly I've been waiting for years for a high-refresh-rate 4K monitor...

My rig is eagerly awaiting a monitor to challenge its dual Titan XPs...

I'll buy the first one I can... worst case scenario, I turn off HDR.


----------



## Hunched

I wonder if we will get a 100hz+ OLED monitor by 2020


----------



## CallsignVega

Honestly, at _27 inches_ the slight _gaming_ visual fidelity increase over 1440p isn't worth the 125% increased GPU load of 4K. If it were 32-40 inches, it would be.
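For reference, the raw pixel arithmetic behind that GPU-load figure:

```python
# 4K UHD vs. 1440p pixel counts: the render workload scales with total pixels.
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels

ratio = uhd / qhd
print(ratio)                        # 2.25
print(f"{(ratio - 1) * 100:.0f}%")  # 125% more pixels to render per frame
```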


----------



## Glerox

Quote:


> Originally Posted by *CallsignVega*
> 
> Honestly, at _27 inches_ the slight _gaming_ visual fidelity increase over 1440p isn't worth the 127% increased GPU load of 4K. If it were 32-40 inches, it would be.


That is a matter of personal taste. I've tried countless monitors of all types, and seeing the pixels at 1440p bothers me in gaming and in productivity tasks.

The 4K PPI at 27 inches is perfect for me, but overkill for many.

If 1440p were fine for me, then I would go for the Alienware AW3418DW and call it a day instead of waiting for the upcoming 200Hz PG35VQ/X35, knowing that FALD looks problematic...


----------



## boredgunner

I look forward to telling younger gamers in 5 years or so that I had to wait 15+ years for a good gaming monitor.


----------



## CallsignVega

Don't get me wrong, 4K 27" does look better than 1440p 27". But certainly not 60 FPS vs 136 FPS better. It's also not the massive difference of 1440p 27" over 1080p 27". With small 4K displays you are in some serious diminishing-returns territory.


----------



## Hunched

Quote:


> Originally Posted by *boredgunner*
> 
> I look forward to telling younger gamers in 5 years or so that I had to wait 15+ years for a good gaming monitor.


Only 5 more years?


----------



## Malinkadink

Quote:


> Originally Posted by *boredgunner*
> 
> I look forward to telling younger gamers in 5 years or so that I had to wait 15+ years for a good gaming monitor.


CRTs are still good if you can find an FW900 that isn't dead lol, and you're okay with a 100-pound behemoth on your desk.


----------



## ArasakOl

CallsignVega,

I also made the mistake of purchasing the Dell UP2718Q. If you haven't already done so, pack it up and return it IMMEDIATELY to get your money back. The monitor is complete garbage no matter what you try to do to make it workable.

(If anyone is interested in my personal first-hand lengthy review of the UP2718Q, please read it here:

https://www.reddit.com/r/6unnn7/my_horrible_experience_with_the_dell_up2718q/ )

With that said, there is SOME hope that Asus has found a way of at least addressing the terrible FALD latency problem. I say this because a few days ago, on the Asus PA32UQ product page (a similar monitor with 384 zones), they stated that the backlight had some [Asus marketing name] 0.1ms response time. (This was specifically referring to the FALD response time.)

However, it now appears that Asus has completely deleted the PA32UQ product page entirely from their website! Literally no reference to it appears to exist, and they had a product page up for it ever since CES 2017. This does not bode well for the PG27UQ.....


----------



## Hunched

Quote:


> Originally Posted by *ArasakOl*
> 
> With that said, there is SOME hope that Asus has found a way of at least addressing the terrible FALD latency problem. I say this because a few days ago on the Asus PA32UQ product page (a similar monitor with 384 zones), they stated that the backlight had some [Asus marketing name] .1ms response time. (This specifically was referring the FALD response time.)


Hahaha... I bet its exorbitant transition time begins after just 0.1ms of receiving the signal.

Asus makes a car: acceleration 0-60 in 0.1ms.
Within just 0.1ms of pushing the pedal, our vehicle will begin its 11.7-second transition from 0-60.
BLAZING fast response! The time from standstill to first movement is faster than you can blink!
It's just all downhill from there.


----------



## ESRCJ

Quote:


> Originally Posted by *ArasakOl*
> 
> CallsignVega,
> 
> I also made the mistake of purchasing the Dell UP2718Q. If you haven't already done so, pack it up and return it IMMEDIATELY to get that money back. The monitor is complete garbage no matter what you try to do to make it workable.
> 
> (If anyone is interested in my personal first hand lengthy review of the UP2718Q, please read it here:
> 
> https://www.reddit.com/r/6unnn7/my_horrible_experience_with_the_dell_up2718q/
> 
> With that said, there is SOME hope that Asus has found a way of at least addressing the terrible FALD latency problem. I say this because a few days ago, on the Asus PA32UQ product page (a similar monitor with 384 zones), they stated that the backlight had some [Asus marketing name] 0.1ms response time. (This was specifically referring to the FALD response time.)
> 
> However, it now appears that Asus has completely deleted the PA32UQ product page entirely from their website! Literally no reference to it appears to exist, and they had a product page up for it ever since CES 2017. This does not bode well for the PG27UQ.....


Well, I guess I'll just be sticking with my PG348Q for the next year. No need for SLI'ed Ampere at 3440x1440, either.


----------



## ArasakOl

Update: looks like the PA32UQ has been removed because it has been renamed the PA32UC. (Odd.....)

Anyway, the Asus website says that the FALD "ASUS LED driving technology achieves 1 microsecond operation for better HDR performance (1000 cd/m², peak)".

https://www.asus.com/Monitors/PA32UC/

Maybe this "ASUS LED driving technology" will be what saves these monitors from the fate of the UP2718Q.....

Another side note - B&H now has it available for pre-order in "Limited Qty" for February shipment.... at $2000......

https://www.bhphotovideo.com/c/product/1380030-REG/asus_pa32uc_32_uhd_hdr_adobe.html

The B&H site says "a static contrast ratio of 1000:1, an ASUS Smart Contrast Ratio of 100,000,000:1"


----------



## Morkai

The FALD controller is the G-Sync module, so Asus has nothing to do with it; it will perform identically across vendors.
In videos it looks like it has zero delay on dimming.

It is a fairly easy thing to implement when you have a G-Sync module accepting packet data along with each frame.
Quickly calculate peak/average/minimum brightness for each dimming zone while rendering each frame (maybe prioritize minimum brightness if the zone is near black, peak if it contains a bright light, or just always use the average; it doesn't matter much), send that along with each frame as packet data, and turn each zone on/off/dim instantaneously.
Without a doubt this is at least an order of magnitude faster than the LCD pixel response, and they can probably do it more cleverly than my example.

The halos should be exactly the same, though, but most gaming content is rarely pure black and white. 384 zones is not many (it's a grid of 24x16 pretty big zones). Big FALD zones like that really show off what horrible tech IPS is in terms of contrast. Maybe the 21:9 VA monitors will look a touch better, but then again the top-of-the-line FALD VA TVs look far from perfect as well.
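The per-zone pass Morkai describes is easy to sketch. A minimal, hypothetical version in Python, assuming a 384-zone (24x16) grid and a frame given as per-pixel luminance in [0, 1] (the actual G-Sync module interface is not public; this just illustrates the idea):

```python
def zone_levels(frame, zones_x=24, zones_y=16):
    """Compute one backlight drive level per dimming zone.

    frame: 2D list of per-pixel luminance values in [0, 1].
    Returns a zones_y x zones_x grid of levels in [0, 1]; here each
    level is simply the peak luminance inside the zone (a real
    controller might blend peak/average and filter over time).
    """
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            peak = 0.0
            # scan only the pixels belonging to this zone
            for y in range(zy * zh, (zy + 1) * zh):
                for x in range(zx * zw, (zx + 1) * zw):
                    if frame[y][x] > peak:
                        peak = frame[y][x]
            row.append(peak)
        levels.append(row)
    return levels
```

Using peak means one bright pixel lights its whole zone (the blooming everyone is complaining about); using average would dim small highlights instead. That trade-off exists no matter how fast the controller is.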


----------



## Glerox

Quote:


> Originally Posted by *ArasakOl*
> 
> CallsignVega,
> 
> I also made the mistake of purchasing the Dell UP2718Q. If you haven't already done so, pack it up and return it IMMEDIATELY to get that money back. The monitor is complete garbage no matter what you try to do to make it workable.
> 
> (If anyone is interested in my personal first hand lengthy review of the UP2718Q, please read it here:
> 
> https://www.reddit.com/r/6unnn7/my_horrible_experience_with_the_dell_up2718q/
> 
> With that said, there is SOME hope that Asus has found a way of at least addressing the terrible FALD latency problem. I say this because a few days ago on the Asus PA32UQ product page (a similar monitor with 384 zones), they stated that the backlight had some [Asus marketing name] .1ms response time. (This specifically was referring the FALD response time.)
> 
> However, it now appears that Asus has completely deleted the PA32UQ product page entirely from their website! Literally no reference to it appears to exist, and they had a product page up for it ever since CES 2017. This does not bode well for the PG27UQ.....


In the tftcentral review, they said that the FALD blooming effect was mostly apparent in HDR "vivid" mode. Have you tried it in HDR "standard" mode?


----------



## CallsignVega

Quote:


> Originally Posted by *ArasakOl*
> 
> CallsignVega,
> 
> I also made the mistake of purchasing the Dell UP2718Q. If you haven't already done so, pack it up and return it IMMEDIATELY to get that money back. The monitor is complete garbage no matter what you try to do to make it workable.
> 
> (If anyone is interested in my personal first hand lengthy review of the UP2718Q, please read it here:
> 
> https://www.reddit.com/r/6unnn7/my_horrible_experience_with_the_dell_up2718q/
> 
> With that said, there is SOME hope that Asus has found a way of at least addressing the terrible FALD latency problem. I say this because a few days ago on the Asus PA32UQ product page (a similar monitor with 384 zones), they stated that the backlight had some [Asus marketing name] .1ms response time. (This specifically was referring the FALD response time.)
> 
> However, it now appears that Asus has completely deleted the PA32UQ product page entirely from their website! Literally no reference to it appears to exist, and they had a product page up for it ever since CES 2017. This does not bode well for the PG27UQ.....


I agree with everything you have said. Good thing I bought this thing from Amazon for easy return. It would be going back anyway, since it has a stuck red pixel.

This is Shadow Warrior 2 - any dark areas with HUD elements etc. have massive blooming, and it looks terrible:



Quote:


> Originally Posted by *Glerox*
> 
> In the tftcentral review, they said that the FALD blooming effect was mostly apparent in HDR "vivid" mode. Have you tried it in HDR "standard" mode?


Yes, the standard mode is *terrible.*

I don't think these FALD displays will be used in FALD mode much; they'll be relegated to normal backlight mode. You need HDR for FALD anyway, at least on the Dell. Not clear on these G-Sync panels.


----------



## Scotty99

I called this the day they were announced: FALD was going to be terrible, as it's hard to do right. They need to get hold of the engineers at Vizio; they are leaders in FALD.


----------



## Scotty99

Quote:


> Originally Posted by *Morkai*
> 
> The fald controller is the gsync module so asus has nothing to do with it, it will perform identically on all vendors.
> In videos it looks like it has 0 delay on dimming.
> 
> It is a fairly easy thing to implement when you have a g-sync module accepting packet data along with each frame.
> Quickly caluclate peak/average/minimum brightness for each dimming zone while rendering each frame (maybe prioritize minimum brightness if near black, peak if zone contains bright light, or just always go average - doesnt matter), send that along with each frame as packet data, turn on/off/dim each zone instantaneously.
> Without a doubt this is at least an order of magnitude faster than the lcd pixel response, and they can probably do it more cleverly than my example.
> 
> The halos should be exactly the same though but most gaming content is rarely black and white. 384 zones are not many (its a grid of 24x16 pretty big zones). Big fald zones like that really show off what a horrible tech IPS is in terms of contrast. Maybe the 21:9 VA monitors will look a touch better, but then again the top of the line fald VA tv's look far from perfect as well.


Vizio's highest-end panel in the 55" size uses IPS, and owners of said TV say they cannot differentiate it from the VA counterpart in the 65" size. FALD can be amazing when done right; panel tech really isn't a limiting factor here.


----------



## Morkai

Quote:


> Originally Posted by *Scotty99*
> 
> Vizio's highest end panel in the 55" size uses IPS, owners of said TV say they cannot differentiate it from the VA counterpart in 65" size. FALD can be amazing when done right, panel tech really isnt a limiting factor here.


Looks like it is 128 zones and just as bad with black on white at least. This video is the VA version, so obviously the IPS version would look much worse.




I don't see any possible way to get around these issues except greatly increasing the number of zones (IPS or VA with, say, 0.5x0.5cm zones or smaller, would probably be really good; that's 10,000 zones or more). The Dell would also look better if it were possible to control the brightness (that Vizio line has the ability to control backlight brightness in HDR mode); let's hope the G-Sync HDR versions do too, so that there's a sliver of hope.
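For scale, the 0.5 cm zone size works out to roughly 8,000 zones on a 27-inch 16:9 panel. A back-of-the-envelope estimate (square zones assumed, bezels ignored):

```python
import math

def zone_count(diag_inches, aspect_w=16, aspect_h=9, zone_cm=0.5):
    """Rough count of square dimming zones for a given panel diagonal."""
    diag_cm = diag_inches * 2.54
    # scale the aspect-ratio triangle up to the physical diagonal
    scale = diag_cm / math.hypot(aspect_w, aspect_h)
    width_cm, height_cm = aspect_w * scale, aspect_h * scale
    return math.floor(width_cm / zone_cm) * math.floor(height_cm / zone_cm)

print(zone_count(27))  # roughly 8,000 zones for a 27" 16:9 panel
```

That is about 20x the 384 zones these monitors actually ship with, which puts the size of the gap in perspective.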









----------



## Sancus

I assume these issues are why the displays have been delayed so long. We've known for a long time that the FALD is only going to work in HDR mode, which is pretty disappointing I have to admit. If we're going to be paying $2K+ for these monitors they should perform better than other monitors in SDR mode and I don't see any reason the FALD can't always be enabled.

TFTCentral benchmarked the FALD latency on the UP2718Q and it was a whopping 600ms for black-to-white and 200ms to turn off. That's insanely bad. DisplayHDR 1000 (which I assume the PG27UQ and PG35VQ MUST meet) requires an 8-frame black-to-white response time from the backlight, which is 67ms at 120Hz and even less at 200Hz. The UP2718Q is a horribly flawed, overpriced early-adopter product, and I don't think it's reasonable to say that the PG27UQ or 35VQ will necessarily have the same flaws.... if they do, Nvidia/AUO/Asus really screwed up, because otherwise why have they kept them in development for a full year past their original slated launch windows?

Needless to say if these monitors suck at release I will return mine or just wait for reviews, but jumping to conclusions based on a different, rushed product from last year and brief CES impression videos isn't the best idea imo.
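The backlight-budget arithmetic above is a one-liner to check (the 8-frame allowance figure is taken from the post, not verified against the VESA spec):

```python
def backlight_budget_ms(frames, refresh_hz):
    """Time allowed for an n-frame backlight transition at a given refresh rate."""
    return frames * 1000.0 / refresh_hz

print(round(backlight_budget_ms(8, 120), 1))  # 66.7 ms at 120 Hz
print(round(backlight_budget_ms(8, 200), 1))  # 40.0 ms at 200 Hz
```

Against those budgets, the UP2718Q's measured 600ms black-to-white is roughly 9x over the 120Hz allowance.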


----------



## ArasakOl

Quote:


> Originally Posted by *Sancus*
> 
> I assume these issues are why the displays have been delayed so long. We've known for a long time that the FALD is only going to work in HDR mode, which is pretty disappointing I have to admit. If we're going to be paying $2K+ for these monitors they should perform better than other monitors in SDR mode and I don't see any reason the FALD can't always be enabled.
> 
> TFTCentral benchmarked the FALD latency on the UP2718Q and it was a whopping 600ms for black-to-white and 200ms to turn off. That's insanely bad. DisplayHDR 1000(which I assume the PG27UQ and PG35VQ MUST meet) requires 8 frames black to white response time from the backlight which is 67ms at 120hz and even less at 200hz. The UP2718Q is a horribly flawed, overpriced early adopter product and I don't think it's totally reasonable to say that the PG27UQ or 35VQ will necessarily have the same flaws.... if they do, Nvidia/AUO/Asus really screwed up, because otherwise why have they kept them in development for a full year past their original slated launch windows?
> 
> Needless to say if these monitors suck at release I will return mine or just wait for reviews, but jumping to conclusions based on a different, rushed product from last year and brief CES impression videos isn't the best idea imo.


My biggest fear is that Linus (anyone who knows Linus knows he is very reliable when it comes to monitors) was at CES 2018 and saw the current PG27UQ first hand. He clearly stated that FALD blooming was an issue even when moving the mouse. (In other words, the same issue the UP2718Q exhibits here in this video: 



)

Hopefully, the issue will be resolved if and when these monitors ship. However, assuming they have the same issue as the UP2718Q, these monitors are effectively dead on arrival. I cannot imagine any gamer tolerating this kind of issue.... let alone for $2000+.

My hope is that Nvidia has been on top of the issue... I cannot imagine that they would allow their "Gsync HDR" to be ruined with such an issue. That would be a major hit to their brand and the millions spent in R&D for this technology.

Also, a European site that has already listed the Acer X27 for sale says that the X27 is NOT VESA HDR 1000 compliant... only that it is "equivalent" to it..... It is HDR Ultra certified, however.


----------



## sblantipodi

Quote:


> Originally Posted by *Sancus*
> 
> I assume these issues are why the displays have been delayed so long. We've known for a long time that the FALD is only going to work in HDR mode, which is pretty disappointing I have to admit. If we're going to be paying $2K+ for these monitors they should perform better than other monitors in SDR mode and I don't see any reason the FALD can't always be enabled.
> 
> TFTCentral benchmarked the FALD latency on the UP2718Q and it was a whopping 600ms for black-to-white and 200ms to turn off. That's insanely bad. DisplayHDR 1000(which I assume the PG27UQ and PG35VQ MUST meet) requires 8 frames black to white response time from the backlight which is 67ms at 120hz and even less at 200hz. The UP2718Q is a horribly flawed, overpriced early adopter product and I don't think it's totally reasonable to say that the PG27UQ or 35VQ will necessarily have the same flaws.... if they do, Nvidia/AUO/Asus really screwed up, because otherwise why have they kept them in development for a full year past their original slated launch windows?
> 
> Needless to say if these monitors suck at release I will return mine or just wait for reviews, but jumping to conclusions based on a different, rushed product from last year and brief CES impression videos isn't the best idea imo.


Completely agree with you.
I doubt that Asus and Acer will have these problems.


----------



## CallsignVega

Sorry guys, there is no way around FALD blooming. The only reason they are even using it is to reach 1000 nits for HDR. I'd rather have crappy blacks from uniform edge lighting, with a solid break in images like moving mouse cursors, than have a big ghost following the cursor and around every HUD item.

The million dollar question: does FALD on the new G-Sync displays work in all modes or just HDR?


----------



## Sancus

Quote:


> Originally Posted by *CallsignVega*
> 
> Sorry guys there is no way around FALD blooming. The only reason they are even using it is to reach 1000 nit for HDR. I'd rather have crappy blacks from uniform edge lighting with a solid break in images like moving mouse cursors than have a big ghost following the cursor and around every HUD item.


Well, if we are specifically talking about the blacks being brighter in the one zone where you leave your pointer, yeah, there's no way around that. However, the UP2718Q has horrible trailing, where the "bright" zone sticks around for several hundred milliseconds, if not half a second. That's completely unacceptable for content in motion.

They could also significantly reduce the amount of bloom with better content-dependent backlight control. The mouse pointer doesn't need and shouldn't get the full 1000 nits of point brightness, for example. Nor do UI elements.


----------



## ArasakOl

Quote:


> Originally Posted by *Sancus*
> 
> Well if we are specifically talking about the blacks being brighter in the one zone when you leave your pointer there, yeah, there's no way around that. However, the UP2718Q has horrible trailing where the "bright" zone sticks behind for several hundred milliseconds if not half a second. That's completely unacceptable for content in motion.
> 
> They could also significantly reduce the amount bloom with better backlight control depending on the content, too. The mouse pointer doesn't need and shouldn't have full 1000 nits point brightness for example. Nor do UI elements.


Bro.... the mouse pointer on the UP2718Q is NOT getting lit up with 1000 nits.... not even close. What you are seeing is probably 250-300 nits of brightness on the pointer. Yes, that's all it takes for it to look that bad.

The "trailing" is latency. It's closely related in concept to ghosting, but it comes from the FALD instead of the panel itself. It DOES sound like the new Asus/Acer panels will have largely eliminated the latency issue, at least. However, the blooming will potentially be just as bad unless they have found some way to block the light from coming through the panel. (Who knows, maybe they have invented a way to do this on IPS panels....)


----------



## ArasakOl

Quote:


> Originally Posted by *CallsignVega*
> 
> Sorry guys there is no way around FALD blooming. The only reason they are even using it is to reach 1000 nit for HDR. I'd rather have crappy blacks from uniform edge lighting with a solid break in images like moving mouse cursors than have a big ghost following the cursor and around every HUD item.
> 
> The million dollar question: does FALD on the new G-Sync displays work in all modes or just HDR?


I've been following these panels for a while, and the general consensus has been that FALD will only work in HDR mode, but this has never been officially confirmed.

I also saw a small note at one point from Acer or Asus (can't remember which) that Microsoft is planning to release an HDR update to finally fix HDR in Windows possibly around the time the panels launch.

The other big question is whether or not these panels support basic HDR10 via HDMI. (VERY important for those of us who plan to use one with a PS4/Xbox/4K Blu-ray player....) If not, this will be another major deal-breaker, unacceptable for a $2,000+ 27" screen.


----------



## Sancus

Quote:


> Originally Posted by *ArasakOl*
> 
> Bro.... the mouse pointer on the UP2718Q is NOT getting lit up with 1000 nits.... not even close. What you are seeing is probably 250-300 nits of brightness on the pointer. Yes, that's all it takes for it to look that bad.


I'm not seeing anything, though. I'm just looking at videos and photos, where there is no real way to sync my monitor settings and room lighting with the photographer's exposure settings. People post plenty of videos of the "totally unacceptable backlight bleed on M270 panels," and most (though not all) examples of them are fine for normal use. If the "bloomed" black level on the PG27UQ is the same as the standard black level on my Acer XB270HU, for example, and the rest is fully black because the backlight is turned off, that is a totally acceptable amount of blooming to me, and imo to most consumers. If it's not acceptable to you, then basically no existing IPS panel will ever work. Probably the only hope is Panasonic's stacked-panel tech in their $17K broadcast monitor. (Edit: wrong product. I was referring to this, but the only product I can find using the tech is the Eizo CG3145, which doesn't seem buyable yet.)

I've noticed people are focusing on the PG27UQ in their complaints about bloom. It's possible that the much lower native black levels of the VA PG35VQ make the bloom harder to notice, and that despite its dark-trailing issues, the VA panel's HDR implementation is actually the way to go because it mitigates all these bloom issues.

People's expectations need to be set at a realistic level for the PG27UQ. Adding FALD does not make the panel magic. It's still an IPS panel with, at best, 1000:1 native contrast ratio and a pretty high black level. If that's not what you want, then you need to just wait for OLED or microLED monitors, or use an OLED TV as a monitor.


----------



## Glerox

I agree; in the end it will be the first 4K fast-refresh-rate monitor and worth a premium, whether the HDR is good or not.


----------



## Hunched

Quote:


> Originally Posted by *Glerox*
> 
> I agree, in the end it will be the first 4k fast refresh rate monitor and worth a premium, whatever HDR or not


A $3000 price tag, worth it for an LCD display? Hahaha...
Just because it has a higher refresh rate than the other 4K IPS displays? No.


----------



## Glerox

Yes, $3,000 would be crazy, but it will be more like $2,000 I think. Top-of-the-line ultrawide monitors launched at $1,500. Whether you prefer ultrawide or pure pixel density is a matter of taste.


----------



## Malinkadink

Quote:


> Originally Posted by *Hunched*
> 
> $3000 price tag worth it for a LCD display? Hahaha...
> Just because it has a higher refresh rate than the other 4K IPS displays, no.


For real: 1440p 144Hz displays can be had for as low as some 1080p 144Hz displays these days, so there's absolutely no reason for a high-refresh-rate 4K panel to cost that much. It really wasn't an enormous undertaking to achieve either, in terms of R&D. The biggest limitation was always the bandwidth required, but now with DP 1.4 plus DSC, or HDMI 2.1, there is oodles of bandwidth for higher resolutions at higher refresh rates.
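The bandwidth point is easy to put numbers on. A rough back-of-the-envelope (the ~10% blanking overhead is an approximation; the DP 1.4 payload figure is HBR3 x4 lanes after 8b/10b line coding):

```python
def raw_video_gbps(width, height, hz, bits_per_channel, blanking=1.1):
    """Approximate uncompressed RGB video bandwidth in Gbit/s
    (3 colour channels, crude blanking-overhead multiplier)."""
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

DP14_PAYLOAD_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b coding

need = raw_video_gbps(3840, 2160, 144, 10)
print(f"{need:.1f} Gbit/s needed vs {DP14_PAYLOAD_GBPS} available")
# 4K 144 Hz 10-bit overshoots DP 1.4, hence DSC or 4:2:2 chroma subsampling
```

This is why the first wave of these monitors drops to 8-bit or chroma-subsampled output at the full 144Hz.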


----------



## zeall0rd

To be honest, I think those monitors should be making an appearance very soon. In Germany, it's impossible to get your hands on a 4K 60Hz G-Sync IPS right now.


----------



## Clukos

Quote:


> Originally Posted by *CallsignVega*
> 
> Honestly, at _27 inches_ the slight _gaming_ visual fidelity increase over 1440p isn't worth the 127% increased GPU load of 4K. If it were 32-40 inches, it would be.


This. I've been using a 32" 4K VA monitor for a while, and even that is a bit small for 4K. 4K at 27" is just dumb, especially with how badly Windows deals with scaling; text looks especially bad even at 125%. Gaming at 4K is quite nice when it works, though; some games just _work_ at that res.


----------



## rvectors

Quote:


> Originally Posted by *Clukos*
> 
> This, I've been using a 32" 4k VA monitor for a while and even that is a bit small for 4K. 4K at 27" is just dumb, especially with how bad Windows is dealing with scaling, text looks especially bad even at 125%. Gaming at 4K is quite nice when it works though, some games just _work_ at that res.


^Sweeping statement alert.

My 5k @ 27inch, a beautiful viewing experience, says it's down to personal choice.


----------



## Clukos

Quote:


> Originally Posted by *rvectors*
> 
> it's down to personal choice.


Windows having terrible scaling isn't down to personal preference, though. Text looks terrible at anything above 125% scaling, and even that is barely tolerable. On Linux, or even better macOS, it's not as much of a problem, as both handle text and scaling much better.

150% scaling:


Native (100%):


----------



## Alex11223

Quote:


> Originally Posted by *Clukos*
> 
> Text looks terrible at anything above 125% scaling, and even that is barely tolerable


It depends on the app. CPU-Z is clearly an old app that doesn't handle scaling, so Windows just scales it as an image (it probably doesn't have any other choice; there are too many different technologies for building GUIs, and it's their job to handle such things, since Windows cannot do it for them).

See for example Windows UI vs CPU-Z


----------



## Scotty99

Yup, 100% app-dependent. Most text looks great, even better, at 125% scaling; what's funny is that hardware-monitoring programs are the ones I've noticed to have the worst-looking text (again, down to whoever codes these programs).

Also, I prefer higher-DPI displays as well; I could have gotten the 27" version of my monitor for only 40 dollars more, but I went with the 24" (1440p).


----------



## profundido

Quote:


> Originally Posted by *Clukos*
> 
> This, I've been using a 32" 4k VA monitor for a while and even that is a bit small for 4K. 4K at 27" is just dumb, especially with how bad Windows is dealing with scaling, text looks especially bad even at 125%. Gaming at 4K is quite nice when it works though, some games just _work_ at that res.


Your information on this is outdated, Clukos. About the native scaling part I agree, of course. 4K only gets back to 90-100 PPI on screens of at least 38"-40", so on a 27" screen that's not an option. And yes, the first editions of Windows 10 had really poor scaling support, but it has handled scaling increasingly well ever since the changes included in build 1703. I suggest you upgrade to the Fall Creators Update (1709) and have another look at current applications and the new options.

Since Windows 10 build 1703 you can edit the properties of any legacy application and choose between three different high-DPI scaling settings to handle DPI changes. Works quite well, tbh. Is it as simple, easy, and fully supported as macOS or Linux, where no intervention is required? No, it's still a work in progress, but it's no longer in a state where the problem simply cannot be fixed or remedied.

Other software vendors are catching up to the whole new high-DPI scene as well. As of client v4.10.1, Citrix now supports high-DPI settings, with a means to follow native scaling instead. I expect Microsoft will soon implement this feature in a future version of RDP as well.


----------



## ryan92084

Quote:


> Originally Posted by *Clukos*
> 
> Windows having terrible scaling isn't down to personal preference though. Text looks terrible at anything above 125% scaling, and even that is barely tolerable. If you are using Linux or even better, macOS, it's not that much of a problem as they both handle text and scaling much better.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 150% scaling:
> 
> 
> Native (100%):


200% scaling with different settings, to expand on profundido's point. Make sure to view the original, since OCN destroys images in the gallery.

Left to right: default and ignore system setting, ignore system enhanced (cuts off the bottom), ignore application. None of them look as bad as yours.


----------



## Neon Lights

Acer Predator X27 delayed (again), possibly ASUS PG27UQ too, according to this article: https://www.sweclockers.com/nyhet/2...4-hz-och-nvidia-g-sync-forsenas-till-sommaren


----------



## toncij

Neon Lights said:


> Acer Predator X27 delayed (again), possibly ASUS PG27UQ too, according to this article: https://www.sweclockers.com/nyhet/2...4-hz-och-nvidia-g-sync-forsenas-till-sommaren


Those will probably get delayed to the end of the year, or cancelled altogether. Let's hope the 200Hz ultrawides don't.


----------



## Glerox

I found a good deal on the AW3418DW, so I just bought it. I'm not even sure the 200Hz ultrawide will be more responsive, because it's VA...


----------



## toncij

Glerox said:


> I found a good deal on the aw3418dw, I just bought it. I'm not even sure the 200hz ultrawide will be more responsive because it's VA...


For responsiveness I have the PG279Q, or even the PG278Q, or that Eizo... but I really do sometimes miss the awesome contrast of those VA panels.


----------



## CallsignVega

Summer now, just crazy. 120 Hz OLED is going to be out before this crap!


----------



## keikei

CallsignVega said:


> Summer now, just crazy. 120 Hz OLED is going to be out before this crap!


I may have to suck it up and just wait. HDR IPS is _slightly_ compelling right now, but dat OLED/high Hz is _just around the corner_.


----------



## animeowns

Neon Lights said:


> Acer Predator X27 delayed (again), possibly ASUS PG27UQ too, according to this article: https://www.sweclockers.com/nyhet/2...4-hz-och-nvidia-g-sync-forsenas-till-sommaren


Noooooooooo, say it isn't so. If it's delayed again, I might just grab a 120Hz 3440x1440 ultrawide IPS, the Alienware. I am waiting on the PG27UQ or the Predator X27.


----------



## sblantipodi

CallsignVega said:


> Summer now, just crazy. 120 Hz OLED is going to be out before this crap!


I hope all the OLED fanboys will lower the price of real PC monitors.

OLED is simply not ready for PC usage; the panels have severe burn-in problems after a few months and discoloration after a few years.
Thanks, but no thanks.


----------



## Sancus

sblantipodi said:


> hope that all the OLED fanboys will lower the price of real pc monitors.
> 
> OLED is simply not ready for PC usage, they have severe burn in problems after few months and discoloration after few years.
> Thanks, but no thanks.


Even if this were true (it's not; avoiding burn-in just requires taking some reasonable care), I'd more than happily pay $2K for a 30" 4K 120Hz OLED monitor that had a three-year lifespan. And then I'd pay $2K again three years later for the upgraded model.

It's just too vastly superior to LCD in literally every way to even consider LCD if you have the option. The response-time difference alone is a deal-breaker for LCDs, given both are high refresh rate.


----------



## CallsignVega

sblantipodi said:


> hope that all the OLED fanboys will lower the price of real pc monitors.
> 
> OLED is simply not ready for PC usage, they have severe burn in problems after few months and discoloration after few years.
> Thanks, but no thanks.


lol OK. I'll take my OLED monitor; you can have 1990s LCD tech.


----------



## Martha Stewart

sblantipodi said:


> hope that all the OLED fanboys will lower the price of real pc monitors.
> 
> OLED is simply not ready for PC usage, they have severe burn in problems after few months and discoloration after few years.
> Thanks, but no thanks.



You are confusing plasmas with OLEDs.


----------



## ZealotKi11er

On PC it's simple: have different wallpapers with different colours rotating, auto-hide the taskbar, keep no icons, and you are set.


----------



## CallsignVega

Yup, almost everything that would benefit from OLED's amazing image quality is pretty dynamic: games and movies. If you want a spreadsheet open for 24 hours straight, there is no need for an OLED in the first place.


----------



## ZealotKi11er

CallsignVega said:


> Yup, almost everything that would benefit from OLED's amazing image quality will be pretty dynamic. Games/movies. If you want a spreadsheet open for 24H straight, there is no need for an OLED in the first place.


The only thing I wish for is better HDR support in Windows 10. I do not want to deal with manual adjustment in some games that support HDR.


----------



## steelbom

ZealotKi11er said:


> In PC its simple. Have different wallpapers with different colours rotating, auto-hide taskbar and no icons and you are set.


Basically what I do on my (LED) LG 34UC98. I should probably hide the taskbar too, but it bugs me how it's still slightly visible.


----------



## kot0005

Neon Lights said:


> Acer Predator X27 delayed (again), possibly ASUS PG27UQ too, according to this article: https://www.sweclockers.com/nyhet/2...4-hz-och-nvidia-g-sync-forsenas-till-sommaren


This website seems to be running a miner... beware.


----------



## frunction

kot0005 said:


> this website seems to be running a miner..beware.


I prefer mining over looking at targeted advertising.


----------



## treadstone

Now I'm confused.


----------



## kot0005

Well, RIP. That website might be right after all...


----------



## toncij

It'll get delayed again, no worries.


----------



## keikei

toncij said:


> It'll get delayed again, no worries.


I may just go ahead and get a 4K/HDR monitor, then. It seems like OLED/high-Hz isn't coming in the near future, unfortunately. The pricing for an LG brand isn't too bad, actually.


----------



## animeowns

kot0005 said:


> Well RIp, that website might be right afterall..


Such BS. They can't even stick to a Q1 release date; they want to wait until the new video cards release and launch the monitors alongside them. If I have to wait until May for 4K 144Hz goodness, they can at least let us pre-order it now.


----------



## toncij

If you think about it, if they're aiming at gamers they have no reason to rush: no single card can run 4K at 60 consistently, let alone 144. Check modern games like AC: Origins...


----------



## animeowns

toncij said:


> If you think about it, if they aim for gaming, they have no reason to rush - no single card can run 4K at 60 consistently, let alone 144. Check modern games like AC:Origins...


I know; that's why I'm waiting on the new cards to release. I want to be able to pre-order both at the same time. The Titan V has its fun points.


----------



## keikei

toncij said:


> If you think about it, if they aim for gaming, they have no reason to rush - no single card can run 4K at 60 consistently, let alone 144. Check modern games like AC:Origins...


There are plenty of games that can benefit, though. It doesn't have to reach 144; any decent bump over 60 should make for a much better gaming experience. The other hurdle is when devs cap the frames at 60fps, but there are workarounds for some of them. Either way, I'm all in for more frames above 60. The demand is clearly there, but the technical hurdles seem to be higher than expected this generation.


----------



## toncij

Yes. I'd love to use it on desktop too, but I need at least 5K...


----------



## Jbravo33

toncij said:


> If you think about it, if they aim for gaming, they have no reason to rush - no single card can run 4K at 60 consistently, let alone 144. Check modern games like AC:Origins...


The Titan V already runs ACO at 60+ FPS in 4K.


----------



## Sancus

toncij said:


> If you think about it, if they aim for gaming, they have no reason to rush - no single card can run 4K at 60 consistently, let alone 144. Check modern games like AC:Origins...


AC Origins and similar single player AAA titles from the last 1-2 years aren't really relevant to high framerate displays. Most competitive games easily run at high fps, 1080 TI can do Overwatch 4K at 140+ fps with full Ultra Quality, and needless to say CSGO and mobas probably at 200+ fps. With a 1080 TI and 5.2ghz 8700K, I am CPU bottlenecked in League, Overwatch, and Heroes of the Storm. Competitive games are the ones where it's important to get high framerates.

Everything else would gain some benefit from 60-120fps, which also includes most games from the past 5 years.


----------



## Profiled

lol, a BIG 65" 4K will be released sooner than this


----------



## kot0005

keikei said:


> I may just go ahead and get a 4K/HDR monitor then. Seems like OLED/high-Hz isn't coming anytime soon, unfortunately. The pricing for the LG brand isn't too bad, actually.


Yeah, I'm thinking about getting the ASUS ProArt PA32UC. HDR and FALD with Qdot.


----------



## saltedham

https://youtu.be/6b5AaRW5110?t=1529

Not sure how accurate Linus has been, but he mentions he's heard it will be out within a few months.


----------



## oc9212

If this is truly delayed to May, I may as well give up on getting this monitor. I have waited for over a YEAR now.

Should I just get the Acer Predator XB271HK 27" (the Asus PG27AQ is also good but more expensive) for a year, and then get whatever monitor is out there as a second monitor (to replace my first one, as I'd like to have a dual-monitor setup)?

I don't have the setup to run 4K/60+fps anyway (Asus 1080 Ti), so I may as well get the best IPS/4K/G-Sync monitor out there.

The current monitor I have is a SyncMaster 2494HS... it hurts my eyes (I use my PC 12+ hours a day, 7 days a week).

Your thoughts please.


----------



## oc9212

please delete.


----------



## ahmedmo1

What's the rush? I couldn't care less when this comes out. Once I see a 32"+ revision for under $1500 CAD, I'll be interested. Guessing that's ~2 yrs away. It'll likely take at least that long to get a card that can render 4K at high refresh rates.


----------



## Malinkadink

I've actually reconsidered these monitors altogether; the price just isn't going to be justifiable to me until a year or two from now, when more of these come out, especially more affordable FreeSync versions. The only thing I'm actually interested in is 4K while still having 144Hz. I recently purchased an OLED, and watching HDR content on it in a dark room at 100 OLED Light is uncomfortable when bright highlights are introduced. Does it look good? Yeah, but without sitting farther away and having some lights on in the room, it's just retina-burningly bright depending on the scene. Truthfully I'm not fully sold on HDR. I like the wide color gamut it offers, but I don't need my display to ever go above 100 nits in a dark room. Suffice to say I'm content with 8-bit IPS panels for a while longer with their 16-ish million colors. SDR can still look great, so I'll be getting an XB271HU this week to replace the S2417DG (I've gotten sick of the vertical gamma shift and the yellowing of whites at the sides when web browsing; the contrast ratio is also about 600:1, and being locked into one gamma option with the lack of OSD options is becoming a bit off-putting as well).


----------



## feznz

oc9212 said:


> If this is truly delayed to May, I may as well give up on getting this monitor. I have waited for over a YEAR now.
> 
> Should I just get the Acer Predator XB271HK 27" (the Asus PG27AQ is also good but more expensive) for a year, and then get whatever monitor is out there as a second monitor (to replace my first one, as I'd like to have a dual-monitor setup)?
> 
> I don't have the setup to run 4K/60+fps anyway (Asus 1080 Ti), so I may as well get the best IPS/4K/G-Sync monitor out there.
> 
> The current monitor I have is a SyncMaster 2494HS... it hurts my eyes (I use my PC 12+ hours a day, 7 days a week).
> 
> Your thoughts please.



You need to get out more 

But seriously, 27" is way too small. Maybe I'm spoiled by a 38" that for some reason seems small until LAN night, when I see it in perspective.


----------



## keikei

Malinkadink said:


> I've actually reconsidered these monitors altogether; the price just isn't going to be justifiable to me until a year or two from now, when more of these come out, especially more affordable FreeSync versions. The only thing I'm actually interested in is 4K while still having 144Hz. I recently purchased an OLED, and watching HDR content on it in a dark room at 100 OLED Light is uncomfortable when bright highlights are introduced. Does it look good? Yeah, but without sitting farther away and having some lights on in the room, it's just retina-burningly bright depending on the scene. Truthfully I'm not fully sold on HDR. I like the wide color gamut it offers, but I don't need my display to ever go above 100 nits in a dark room. Suffice to say I'm content with 8-bit IPS panels for a while longer with their 16-ish million colors. SDR can still look great, so I'll be getting an XB271HU this week to replace the S2417DG (I've gotten sick of the vertical gamma shift and the yellowing of whites at the sides when web browsing; the contrast ratio is also about 600:1, and being locked into one gamma option with the lack of OSD options is becoming a bit off-putting as well).


I'm actually ready to jump on the new LGs when they come out with their nano IPS monitors. It'll be an upgrade for me in terms of size, image quality, connectivity, and the movie viewing/gaming experience. Would I rather have OLED? Hell yeah, but I don't have a need for a 55-inch screen as a monitor.


----------



## animeowns

oc9212 said:


> If this is truly delayed to May, I may as well give up on getting this monitor. I have waited for over a YEAR now.
> 
> Should I just get the Acer Predator XB271HK 27" (the Asus PG27AQ is also good but more expensive) for a year, and then get whatever monitor is out there as a second monitor (to replace my first one, as I'd like to have a dual-monitor setup)?
> 
> I don't have the setup to run 4K/60+fps anyway (Asus 1080 Ti), so I may as well get the best IPS/4K/G-Sync monitor out there.
> 
> The current monitor I have is a SyncMaster 2494HS... it hurts my eyes (I use my PC 12+ hours a day, 7 days a week).
> 
> Your thoughts please.


Wow, 12 hours a day. You must use your PC in a working environment.


----------



## Malinkadink

keikei said:


> I'm actually ready to jump on the new LGs when they come out with their nano IPS monitors. It'll be an upgrade for me in terms of size, image quality, connectivity, and the movie viewing/gaming experience. Would I rather have OLED? Hell yeah, but I don't have a need for a 55-inch screen as a monitor.


That's kind of the issue for me with my OLED: I can't feasibly use it as a monitor, as it's too big, and my setup would need a complete overhaul if I planned to sit at a reasonable distance and still use m/kb at a desk. I rarely watch shows, but when I do it's fantastic. In the end, though, I prefer sitting at a desk using a monitor, and I can't do that with the OLED; not to mention 60Hz is an eyesore. Typically when I get a display it's with the intent of using it 24/7 as my main monitor, which makes me wish we had monitor-friendly-sized OLEDs with G-Sync and 144Hz.


----------



## CallsignVega

https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504

Q3 now which is code for 2019.


----------



## Jbravo33

CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.


Lol, so happy I bought the Dell Alienware AW34.


----------



## animeowns

Jbravo33 said:


> Lol so happy I bought dell Alienware aw34


I might be joining the ultrawide race soon if I see the Alienware priced at $999 again. This waiting is ridiculous. I like the idea of 200Hz 3440x1440, but we need that in IPS, not VA.


----------



## NewType88

CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.


Ah man, no! That's probably when the new Ti will launch, though.


----------



## keikei

Malinkadink said:


> That's kind of the issue for me with my OLED: I can't feasibly use it as a monitor, as it's too big, and my setup would need a complete overhaul if I planned to sit at a reasonable distance and still use m/kb at a desk. I rarely watch shows, but when I do it's fantastic. In the end, though, I prefer sitting at a desk using a monitor, and I can't do that with the OLED; not to mention 60Hz is an eyesore. Typically when I get a display it's with the intent of using it 24/7 as my main monitor, which makes me wish we had monitor-friendly-sized OLEDs with G-Sync and 144Hz.





CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.


Great... That ultrawide from LG is supposed to launch next month. One site has a pre-order price of $1500. I'm hoping that's a placeholder.


----------



## Sancus

The 38" LG Ultrawide seems to be identical to the existing one except maybe with some kind of(software) hdr support, same panel. Not very interesting.

At this rate, these things won't be out until LG does 40-49" panels in 2020 with HDMI 2.1 and 120hz and we'll all just stop buying high-end LCDs at that point.


----------



## MistaSparkul

Yep, given how long these 27-inch ones are taking to release, I bet we will see smaller 120Hz OLED TVs before we ever see a 32-inch version of these FALD displays. I had been holding out for a 32-inch version, but it looks like I can just skip it entirely and go straight for high-refresh OLED.


----------



## CallsignVega

Once 120 Hz 4K OLED in March 2019 hits there will be no reason for any of these delayed LCD's.


----------



## kot0005

CallsignVega said:


> Once 120 Hz 4K OLED in March 2019 hits there will be no reason for any of these delayed LCD's.


Where is this coming from!?!?

Wouldn't they also cost like $5k?


----------



## Malinkadink

kot0005 said:


> where is this coming from !?!?
> 
> wouldn't they also cost like $5k ?


2017 sets support 1080p at 120Hz; surely 2019 sets will support 4K at 120Hz, as they will come with HDMI 2.1, enabling them to do that.
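The bandwidth math backs this up. Here's a rough back-of-envelope sketch in Python (the link rates are approximate effective payload figures after line coding, and real timings add blanking overhead on top, so treat it as a sanity check rather than exact numbers):

```python
# Rough uncompressed video bandwidth check (approximate; ignores
# blanking/timing overhead, which adds roughly 10-20% in practice).

def required_gbps(width, height, hz, bits_per_channel, channels=3):
    """Raw pixel-data rate in Gbit/s, before blanking overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

# Approximate effective payload rates after line coding:
HDMI_2_0 = 14.4   # 18 Gbps raw, 8b/10b coding
HDMI_2_1 = 42.7   # 48 Gbps raw FRL, 16b/18b coding

uhd120_8bit  = required_gbps(3840, 2160, 120, 8)   # ~23.9 Gbps
uhd120_10bit = required_gbps(3840, 2160, 120, 10)  # ~29.9 Gbps

print(f"4K120 8-bit : {uhd120_8bit:.1f} Gbps")
print(f"4K120 10-bit: {uhd120_10bit:.1f} Gbps")
print("Fits HDMI 2.0?", uhd120_10bit <= HDMI_2_0)  # False
print("Fits HDMI 2.1?", uhd120_10bit <= HDMI_2_1)  # True
```

So 4K 120Hz, even at 10-bit for HDR, is well out of reach of HDMI 2.0 but comfortably inside HDMI 2.1's budget.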


----------



## feznz

Sancus said:


> The 38" LG Ultrawide seems to be identical to the existing one except maybe with some kind of(software) hdr support, same panel. Not very interesting.
> 
> At this rate, these things won't be out until LG does 40-49" panels in 2020 with HDMI 2.1 and 120hz and we'll all just stop buying high-end LCDs at that point.


Yeah, I got the LG 38UC99-W. There is a huge difference: the stand is totally different. Mine is white and silver; the new 38 is just silver. Seriously, same panel, so what could be so much better......

I would take an OLED @ 120Hz at 40"+. I just couldn't go back to smaller.


----------



## toncij

CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.


Q4 for 35" - that's like code for Summer 2019.  

These monitors seem to be rolling toward an eventual cancellation, and one more major failure for Asus (after that 32" ProArt that just disappeared).

::Son I am dissapoint meme::

Anyway, time to think about the LG nano 34" 5K (and some non-integer scaling) if it ever comes out, or a 38"... not sure...


----------



## oc9212

animeowns said:


> Wow, 12 hours a day. You must use your PC in a working environment.


I'm a writer, so that's 8-12 hours a day + gaming/browsing/anything else needed = ......there you go 

I bought the Acer monitor and I'm happy with it (just got it today, actually).
I bet these monitors won't be out until 2019.


----------



## sblantipodi

CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.


I hope they are working on the haloing problem; otherwise there is no reason to postpone it.


----------



## Korruptive

What a joke, I've been waiting to get a new monitor for over a year.

All I want is 4K, 100+Hz, 27-32", and OLED, but there aren't even any rumors of OLED appearing soon, so I'll have to go with LCD.


----------



## Malinkadink

Korruptive said:


> What a joke, I've been waiting to get a new monitor for over a year.
> 
> All I want is 4K, 100+Hz, 27-32", and OLED, but there aren't even any rumors of OLED appearing soon, so I'll have to go with LCD.


I went with two 27" 1440p 144hz IPS monitors, not interested in HDR, and 4k at 144hz would still be great, but im not going to pay over $1500+ for it when its still an LCD, if it were OLED that'd be a different story. I too really want a 4k 144hz OLED monitor in the 27-32 inch range. I have a 55 inch OLED but its not feasible to use as a monitor due to its size and its only 60hz so aside from casual gaming or movie watching its pretty useless for everything else.


----------



## Sancus

Korruptive said:


> What a joke, I've been waiting to get a new monitor for over a year.
> 
> All I want is 4K, 100+Hz, 27-32", and OLED, but there aren't even any rumors of OLED appearing soon, so I'll have to go with LCD.


Unfortunately no OLED makers seem interested in making panels that size, with the possible exception of JOLED. Will have to see if the Asus ProArt PQ22UC a) makes it to market and b) is considered a success by its manufacturer. Then we might have a chance at a larger size in the next couple of years.


----------



## Aussiejuggalo

So we won't be seeing these till Q3... utterly useless. Guess they'll be released when Nvidia gets around to putting out a new card.


----------



## mmms

What about the upcoming Mini LED gaming monitor?

https://news.cnyes.com/news/id/4057915


----------



## animeowns

keikei said:


> Great... That ultrawide from LG is supposed to launch next month. One site has a pre-order price of $1500. I'm hoping that's a placeholder.


It's not a placeholder; 5K costs a premium. But it's a good price.


----------



## animeowns

Jbravo33 said:


> Lol so happy I bought dell Alienware aw34


How is it in comparison to 4K? I'm coming from a PG279Q, thinking about either the Acer X34P or the Alienware AW34.


----------



## ryan92084

mmms said:


> What about upcoming Mini LED gaming monitor ?
> 
> https://news.cnyes.com/news/id/4057915


Did you mean to post the same question in two different threads that have nothing to do with your link? I thought my response just didn't post, but no, it was just because I replied elsewhere.

Regardless, it is still just an LCD panel with local dimming, but with a new misleading buzzword.


----------



## Baasha

CallsignVega said:


> https://twitter.com/TFTCentral/stat...io/iframe/twitter.min.html#968799839143317504
> 
> Q3 now which is code for 2019.



So pathetic. They announced it in Jan. 2017. LOL... might as well be vaporware.

Glad I still have my 1440p 144Hz Asus RoG Swift monitor as a backup. Though I find myself playing on the 4K OLED monitor a LOT more these days.


----------



## Aussiejuggalo

Baasha said:


> Though I find myself playing on the 4K OLED monitor a LOT more these days.


Love how everyone keeps going on about OLED; it has flaws like burn-in (which is a major one for PCs). The price will also be a massive problem. For these 4K IPS ones it's rumoured to be what, $2000 USD? OLED could easily be 2-4x that; just look at the Sony Pro lineup of OLED monitors, where the cheapest is $3695, and that's 1080p (granted, it's 10-bit).

OLED would be good if the price weren't extreme and it didn't have burn-in problems, but because of that it's pointless for the 99% of people who aren't loaded; IPS is still the way to go for now.

As for these IPS monitors, have they also been delayed to coincide with the launch of Nvidia's new cards?


----------



## Scotty99

OLED really is the future though; costs will come down eventually. What I don't get is why Samsung is so against OLED. I imagine profit margins for them would be lower, so let's push inferior tech, lol?


----------



## Sancus

Aussiejuggalo said:


> Love how everyone keeps going on about OLED, it has flaws like burn in (which is major for PC's). The price will also be a massive problem, for these 4K IPS ones it's rumoured to be what $2000 USD? OLED could easily be 2 - 4x that, just look at the Sony Pro line up of OLED monitors, the cheapest is $3695 and that's 1080p (granted it's 10-Bit).


Baasha has one of the very few UP3017Qs that were sold. I'm sure if it had burn-in he would've mentioned it by now. It's been almost a year, I believe.

With how stunted the monitor market continues to be, I'm starting to think I should have bought one myself.


----------



## Baasha

Aussiejuggalo said:


> Love how everyone keeps going on about OLED, it has flaws like burn in (which is major for PC's). The price will also be a massive problem, for these 4K IPS ones it's rumoured to be what $2000 USD? OLED could easily be 2 - 4x that, just look at the Sony Pro line up of OLED monitors, the cheapest is $3695 and that's 1080p (granted it's 10-Bit).
> 
> OLED would be good if the price wasn't extreme and it didn't have burn in problems but because of that it makes it pointless for 99% of people who aren't loaded, IPS is still the way to go for now.
> 
> As for these IPS monitors, have they also been delayed to coincide with the launch of Nvidia's new cards?





Sancus said:


> Baasha has one of the very few UP3017Qs that were sold. I'm sure if it had burn-in he would've mentioned it by now. It's been almost a year, I believe.
> 
> With how stunted the monitor market continues to be, I'm starting to think I should have bought one myself.


That's right - I have the UP3017Q and it is an absolute beauty - not a single blemish on it. Not sure why Dell stopped selling them so quickly. 32" OLED 4K 144Hz w/ G-Sync would be phenomenal. However, I'll happily get the 4K 144Hz IPS if/when it does come out.

In fact, I've delayed my X299 build to coincide with the release of that monitor. I'm quite happy with my other monitors for now.


----------



## Glerox

Baasha said:


> That's right - I have the UP3017Q and it is an absolute beauty - not a single blemish on it. Not sure why Dell stopped selling them so quickly. 32" OLED 4K 144Hz w/ G-Sync would be phenomenal. However, I'll happily get the 4K 144Hz IPS if/when it does come out.
> 
> In fact, I've delayed my X299 build to coincide with the release of that monitor. I'm quite happy with my other monitors for now.


ThirtyIR, what about your 8k monitor? Not gaming on it anymore? 
With a custom SLI profile you could run FF XV in glorious 8k on your 4 Titan Xps.

hahaha I love overkill


----------



## ryan92084

Baasha said:


> That's right - I have the UP3017Q and it is an absolute beauty - not a single blemish on it. *Not sure why Dell stopped selling them so quickly*. 32" OLED 4K 144Hz w/ G-Sync would be phenomenal. However, I'll happily get the 4K 144Hz IPS if/when it does come out.
> 
> In fact, I've delayed my X299 build to coincide with the release of that monitor. I'm quite happy with my other monitors for now.


Not necessarily the reason, but this is what Mark of Blur Busters has to say on the Dell:


> In the default (120Hz flicker at 60Hz, double-strobe) mode, the Dell U3017Q OLED computer monitor had a double-image motion artifact. You had to go down to single-strobe 60Hz (very flickery) to get only a single image.


http://www.avsforum.com/forum/40-ol...ogy-advancements-thread-484.html#post55547832


> Witness the Dell UP3017Q OLED computer monitor, the most beautiful 4K OLED my eyes has ever feasted upon. Stunning. If you disabled its "reduced flicker" mode, it went into a 60Hz CRT like large-black-duty cycle mode! It even had less motion blur than any 2017 or 2018 OLED HDTV, or any of the new OLED HDTVs that I saw at CES. The motion clarity was plasmalike!! It used really large BFI ratios!! There was one huge flaw, however. Imagine staring 2 feet away from a 30" CRT that is now your desktop monitor. *Eye searing 60 Hz flicker*. Dell discontinued the monitor.


http://www.avsforum.com/forum/40-ol...ogy-advancements-thread-485.html#post55594280


----------



## animeowns

Glerox said:


> ThirtyIR, what about your 8k monitor? Not gaming on it anymore?
> With a custom SLI profile you could run FF XV in glorious 8k on your 4 Titan Xps.
> 
> hahaha I love overkill


He got rid of the 8K monitor; it's too much of a headache with the current hardware out now. We need better cards in order to take advantage of the massive bandwidth 8K requires.


----------



## Baasha

Glerox said:


> ThirtyIR, what about your 8k monitor? Not gaming on it anymore?
> With a custom SLI profile you could run FF XV in glorious 8k on your 4 Titan Xps.
> 
> hahaha I love overkill


I sold the 8K monitor. It being an MST monitor, plus the fact that 4x Titan Xp barely allows 60fps gaming at 8K in most modern games, sealed the decision for me.


----------



## Glerox

Aight I understand! At least you had a glimpse of the future


----------



## CallsignVega

Aussiejuggalo said:


> Love how everyone keeps going on about OLED, it has flaws like burn in (which is major for PC's). The price will also be a massive problem, for these 4K IPS ones it's rumoured to be what $2000 USD? OLED could easily be 2 - 4x that, just look at the Sony Pro line up of OLED monitors, the cheapest is $3695 and that's 1080p (granted it's 10-Bit).
> 
> OLED would be good if the price wasn't extreme and it didn't have burn in problems but because of that it makes it pointless for 99% of people who aren't loaded, IPS is still the way to go for now.
> 
> As for these IPS monitors, have they also been delayed to coincide with the launch of Nvidia's new cards?


I've been gaming on OLED for years and have had exactly zero issues with burn in. Only affects people who are stupid enough to crank the brightness to max and watch a CNN logo all day. 

As for price, the new 2018 55" C8 OLED is launching at $2,500. A whopping $500 more than the launch price of this silly 27" LCD.


----------



## Scotty99

So this is for sure still 2 grand, lol?

I'm looking for something slightly different: I'd want a 24" 240Hz TN with local dimming at 1440p. Given TN's "bad name", we sadly will likely never see this though.


----------



## ryan92084

CallsignVega said:


> I've been gaming on OLED for years and have had exactly zero issues with burn in. Only affects people who are stupid enough to crank the brightness to max and watch a CNN logo all day.
> 
> As for price, the new 2018 55" C8 OLED is launching at $2,500. A whopping $500 more than the launch price of this silly 27" LCD.


Hah, too true. B8 might even launch cheaper than this monitor. Too bad no VRR or 120hz input (just native apps) until maybe 2019.


----------



## Malinkadink

Scotty99 said:


> So this is for sure still 2 grand lol?
> 
> Im looking for something slightly different, id want a 24" 240hz TN with local dimming at 1440p. Given TN's "bad name" we sadly will likely never see this tho


They'll do a 240Hz 1440p 24" TN, but it won't have local dimming. If it had local dimming it'd be IPS, and I'm not sure they could push IPS pixels fast enough for proper 240Hz, so 240Hz is pretty much exclusive to TN until we get OLED monitors, if we even get those, because it's looking more like we'll just keep riding this LCD train until micro LED.


----------



## boredgunner

Malinkadink said:


> They'll do a 240hz 1440p 24" TN but it wont have local dimming. If they had local dimming it'd be IPS, and im not sure they could push IPS pixels fast enough to have proper 240hz so 240hz is pretty much exclusive to TN until we get OLED monitors, if we even get those, because it's looking more like we'll just keep riding this LCD train until micro LED.


Once we get sub 50" 4k 120 Hz OLED TVs I'm jumping off that train.


----------



## Scotty99

Malinkadink said:


> They'll do a 240hz 1440p 24" TN but it wont have local dimming. If they had local dimming it'd be IPS, and im not sure they could push IPS pixels fast enough to have proper 240hz so 240hz is pretty much exclusive to TN until we get OLED monitors, if we even get those, because it's looking more like we'll just keep riding this LCD train until micro LED.


Why wouldn't they do local dimming? Nothing inherent about TN would prohibit that, and its poor native contrast (just like IPS) would make it a great candidate for the tech.


----------



## boredgunner

Scotty99 said:


> Why wouldnt they do local dimming? Nothing inherent about TN that would prohibit that, and its poor native contrast (just like ips) would be a great candidate for the tech.


I assume they'd never do it because those buying TN generally don't care about image quality, and those who do care about image quality and want FALD do not want TN. I doubt this mindset will ever disappear from the industry and it's understandable. FALD ain't gonna make TN look good (nor IPS it seems).


----------



## Scotty99

Well, I see it differently: TN is a superior gaming technology; it just has worse viewing angles than IPS (who cares?). I think a properly local-dimmed 240Hz 1440p TN is the best gaming monitor you can make, barring OLED. I'm not needing 4K and in fact would rather avoid it until GPUs get much faster.

I think the major problem is again TN being a "bad word". If you can put functioning local dimming on a 240Hz TN, what negatives does this panel have? Color gamut is not limited by panel technology (although many think it is).


----------



## boredgunner

Scotty99 said:


> Well i see it different, TN is a superior gaming technology it just has worse viewing angles than IPS (who cares?). I think a properly local dimmed 240hz 1440p tn is the best gaming monitor you can make, barring oled. Im not needing 4k and in fact would rather avoid it until GPU's get much faster.


I know what it's like to want something that will never be made. It sucks. We will all have to compromise until high refresh rate OLED or microLED.


----------



## Scotty99

Ya, you are likely right, but it's frustrating to say the least, as I feel TN is simply a better technology for use as a "gaming" monitor. It all trickles down from the TV makers, I suppose, and TN there is obviously not acceptable due to viewing angles. I do think, though, that if Asus or Acer or Dell came out with something like this, it could change the perception of TN panels, at least among enthusiasts.


----------



## ZealotKi11er

Scotty99 said:


> Well i see it different, TN is a superior gaming technology it just has worse viewing angles than IPS (who cares?). I think a properly local dimmed 240hz 1440p tn is the best gaming monitor you can make, barring oled. Im not needing 4k and in fact would rather avoid it until GPU's get much faster.
> 
> I think the major problem is again TN being a "bad word". If you can put a functioning local dimming on a 240hz tn what negatives does this panel have? Color gamut is not limited by panel technology (although many think it is).


TN just does not have the colors, and even with local dimming you need at least a VA panel. Also, nobody is going to pay that kind of money for a TN screen, no matter how good it is. You really have to try OLED and see real local dimming with 8 million zones.


----------



## Scotty99

Bro, I know how OLED looks, and I know what properly done FALD looks like, lol. You need to understand something: Vizio makes a 55" TV that is just as good as VA in their P55:
https://www.vizio.com/p55c1.html

THAT is an IPS TV. There is nothing stopping someone from making a beautiful 24" 1440p 240Hz locally dimmed TN panel; it has the same terrible native contrast as IPS does. I don't know how the "TN has bad colors" thing got around, or how people seem to think panel type has ANYTHING to do with colors, but it does not; you can make a TN panel high quality enough to have the same color gamut as an IPS TV.


----------



## Sancus

Scotty99 said:


> Bro i know how OLED looks and i know what a properly done FALD looks like lol. You need to understand something, vizio makes a 55" TV that is just as good as VA in their P55:
> https://www.vizio.com/p55c1.html
> 
> THAT is an IPS tv, there is nothing stopping someone from making a beautiful 24" 1440p 240hz locally dimmed TN panel


There are lots of problems with FALD on PC. So far no one has produced a FALD with good latency, which is a major issue for video games where your viewport can change very quickly. Nor have they produced a FALD with small enough zones that haloing isn't a major problem. Even the PG27UQ has visible haloing based on reports of people who have seen it at shows.

This idea that FALD is the solution is proving not to be true -- if it was easy to make a FALD LCD to the standards required for PC, then we wouldn't be looking at 2+ year delays on all the gaming FALD monitors.


----------



## Scotty99

I'm the first one who told people that, LOL. Seriously, browse the start of this thread; I TOLD people to watch out for the FALD, it's incredibly hard to do right. Look at where we are now: pushed back 3-4 times, and my guess is because of FALD. That is not to say it can't be done right; obviously it can, when you see the implementations on Vizio and Sony TVs. They just don't have the resources the Sonys and Vizios of the world do.


----------



## Sancus

I don't understand your point. You keep pointing at TVs, but they all have 1-2 orders of magnitude too few zones to be useful for their screen size. The Vizios don't look good at all.

E: Just so we're clear on what I mean by "don't look good at all": the Vizio P-series, like literally every FALD TV with fewer than thousands or tens of thousands of zones, has extremely evident haloing even on extremely large, slow, smoothly moving objects, as you can see in this video from the rtings review.

I'm not aware of anyone who tests FALD latency on TVs, but based on that, it's no better than the 600ms of the Dell UP2718Q. People just don't notice poor FALD latency as much with video content.
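To put "zones as big as a phone screen" in numbers, here's a rough sketch. The 16x8 = 128-zone grid is a made-up but representative layout for a FALD TV of this class, not the actual zone map of any specific model:

```python
import math

def zone_size_inches(diagonal_in, cols, rows, aspect=(16, 9)):
    """Approximate width/height of one backlight zone on a 16:9 panel."""
    aw, ah = aspect
    diag_units = math.hypot(aw, ah)          # diagonal in aspect units
    width = diagonal_in * aw / diag_units    # panel width in inches
    height = diagonal_in * ah / diag_units   # panel height in inches
    return width / cols, height / rows

# Hypothetical 55" TV with a 16x8 = 128-zone backlight:
zw, zh = zone_size_inches(55, 16, 8)
print(f"each zone ~ {zw:.1f}\" x {zh:.1f}\"")  # roughly 3" x 3.4"

# Versus a 27" 4K panel with per-pixel dimming (OLED): 3840x2160 "zones".
pw, ph = zone_size_inches(27, 3840, 2160)
print(f"per-pixel   ~ {pw:.4f}\" x {ph:.4f}\"")
```

Each zone on such a TV covers hundreds of thousands of pixels, which is why a mouse cursor on a dark background drags a visible halo around with it.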


----------



## Scotty99

Sancus said:


> I don't understand your point. You keep pointing at TVs, but they all have 1-2 orders of magnitude too few zones to be useful for their screen size. The Vizios don't look good at all.


Are you goofy in the head, lol? The Vizio P-series was ranked the best TV of 2016, and the Sony Z9D won the honor in 2017 for its FALD implementation. Talking about LED TVs, of course.

Of course I am pointing at TVs; what other reference points do we have? I don't disagree with the notion that more zones = better, but saying the Vizio P-series "doesn't look good at all" is one of the craziest things I've ever read on a forum.


----------



## CallsignVega

31" 4K OLED panel announced for 2019. Pair that with DP 1.4 and no reason it couldn't be 120 Hz. 

https://www.oled-info.com/johua-printing-developed-ink-jet-printed-31-4k-oled-panel

And a sweet VR panel:

https://www.oled-info.com/google-developed-1443-ppi-43-120hz-vr-amoled-display


The future is looking very bright for OLED. (pun intended).
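For what it's worth, the DP 1.4 link budget roughly works out, at least at 8-bit. A quick sanity check (nominal HBR3 payload after 8b/10b coding, blanking overhead ignored):

```python
# Quick DisplayPort 1.4 (HBR3) budget check for 4K @ 120 Hz.
# HBR3 raw: 4 lanes x 8.1 Gbps = 32.4 Gbps; 8b/10b coding leaves ~25.92 Gbps.
DP14_PAYLOAD = 32.4 * 8 / 10  # ~25.92 Gbps usable

def stream_gbps(w, h, hz, bpc, subsampling="4:4:4"):
    """Approximate pixel-data rate in Gbit/s (blanking overhead ignored).
    4:2:2 chroma subsampling averages 2 samples/pixel instead of 3."""
    samples_per_pixel = {"4:4:4": 3, "4:2:2": 2}[subsampling]
    return w * h * hz * bpc * samples_per_pixel / 1e9

print(stream_gbps(3840, 2160, 120, 8))           # ~23.9 -> fits
print(stream_gbps(3840, 2160, 120, 10))          # ~29.9 -> does not fit
print(stream_gbps(3840, 2160, 120, 10, "4:2:2")) # ~19.9 -> fits
```

So 8-bit 4:4:4 at 120Hz just squeezes in, while 10-bit HDR would need chroma subsampling or DSC; as I understand it, that's also why the 144Hz G-Sync HDR panels are reported to drop to 4:2:2 at their top refresh rates.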


----------



## MistaSparkul

CallsignVega said:


> 31" 4K OLED panel announced for 2019. Pair that with DP 1.4 and no reason it couldn't be 120 Hz.
> 
> https://www.oled-info.com/johua-printing-developed-ink-jet-printed-31-4k-oled-panel
> 
> And a sweet VR panel:
> 
> https://www.oled-info.com/google-developed-1443-ppi-43-120hz-vr-amoled-display
> 
> 
> The future is looking very bright for OLED. (pun intended).


But actual monitors using this panel won't be out until at least 2020.


----------



## ttnuagmada

Combining IPS and FALD is going to have mixed results, especially when the screen is going to be a couple of feet from your face. There is a reason that Sony/Samsung/Vizio use VA almost exclusively, especially on their high-end sets. Even with FALD, the panel needs a decent native contrast, otherwise blooming will be glaringly obvious.
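A rough way to see why native contrast still matters under FALD. This sketch assumes the whole zone's backlight is driven up to serve its brightest pixel (a simplification of real dimming algorithms), so the darkest black inside that zone is limited by the panel's native contrast:

```python
# Sketch: black level inside a backlight zone that's lit to show a highlight.

def halo_black_nits(highlight_nits, native_contrast):
    """Darkest black achievable in the same zone as a bright highlight."""
    return highlight_nits / native_contrast

# A 1000-nit HDR highlight in the zone:
print(halo_black_nits(1000, 1000))   # IPS-class (~1000:1): 1.0-nit halo floor
print(halo_black_nits(1000, 5000))   # VA-class (~5000:1): 0.2-nit halo floor
```

Under this assumption, a VA panel's blooming floor is several times dimmer than an IPS panel's for the same highlight, which matches the point about TV makers preferring VA for high-end FALD sets.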


----------



## ttnuagmada

Sancus said:


> I don't understand your point. You keep pointing at TVs, but they all have 1-2 orders of magnitude too few zones to be useful for their screen size. The Vizios don't look good at all.
> 
> E: Just so we're clear on what I mean by "don't look good at all": the Vizio P-series, like literally every FALD TV with fewer than thousands or tens of thousands of zones, has extremely evident haloing even on extremely large, slow, smoothly moving objects, as you can see in the video from the rtings review.
> 
> I'm not aware of anyone who tests FALD latency on TVs, but based on that, it's no better than the 600ms of the Dell UP2718Q. People just don't notice poor FALD latency as much with video content.


Actually, if you read the review you stole that video from, it pretty much contradicts everything you just said about the FALD quality of the P series.


----------



## Sancus

ttnuagmada said:


> Actually, if you read the review you stole that video from, they'll pretty much contradict everything you just said about the FALD quality of the P series.


No, it doesn't. It says it "does a good job of limiting blooming" -- sure, for a TV with zones as big as an entire phone screen. Rtings does not even test FALD latency at all. It's not nearly as obvious with video content as it is when you are moving your mouse pointer around a dark screen and having a 500ms delayed bloom following it around. TV FALDs are nowhere close to the standard necessary for a PC monitor, and nobody reviews them as if they were. They compare them to other LCD TVs alone, because if you compare them to any other technology they look objectively poor. 

But hey, if people really want FALDs that are about as good as the TV ones, there are two FALD monitors out right now that you can purchase that are just as good if not better than any LCD TV available. TV FALDs aren't any better than anything monitor manufacturers can come up with, FALDs are just not very good across the board.


----------



## Scotty99

Sancus said:


> No, it doesn't. It says it "does a good job of limiting blooming" -- sure, for a TV with zones as big as an entire phone screen. Rtings does not even test FALD latency at all. It's not nearly as obvious with video content as it is when you are moving your mouse pointer around a dark screen and having a 500ms delayed bloom following it around. TV FALDs are nowhere close to the standard necessary for a PC monitor, and nobody reviews them as if they were. They compare them to other LCD TVs alone, because if you compare them to any other technology they look objectively poor.
> 
> But hey, if people really want FALDs that are about as good as the TV ones, there are two FALD monitors out right now that you can purchase that are just as good if not better than any LCD TV available. TV FALDs aren't any better than anything monitor manufacturers can come up with, FALDs are just not very good across the board.


Many people use the Vizio P as a PC monitor, and never have I seen someone complain about the latency of the FALD. It is a 120Hz native panel, btw.

You have never used a FALD TV, that much is clear, and you probably only started researching the subject when you saw these FALD monitors announced. You are correct that more zones = better, but to get to a point even close to OLED you are going to surpass the cost of OLED in the process. I have a cheap Vizio in the E series; even that model does an excellent job with a meager 12 zones. Of course there is blooming, but what FALD does besides helping out with contrast is screen uniformity: on a fully dark screen my $500 TV has almost perfect uniformity, something that cannot be said for edge-lit Samsung panels costing 3-4x the amount.

The FALD you envision with thousands of zones is never going to happen; like I said, you are well above OLED costs by then.


----------



## ttnuagmada

Sancus said:


> No, it doesn't. It says it "does a good job of limiting blooming" -- sure, for a TV with zones as big as an entire phone screen. Rtings does not even test FALD latency at all. It's not nearly as obvious with video content as it is when you are moving your mouse pointer around a dark screen and having a 500ms delayed bloom following it around. TV FALDs are nowhere close to the standard necessary for a PC monitor, and nobody reviews them as if they were. They compare them to other LCD TVs alone, because if you compare them to any other technology they look objectively poor.


Lol. First of all, guy, a 500ms FALD lag would be beyond obvious. Just a simple scene change would make it glaringly obvious, as you would get super-noticeable brightness pops going from a dark scene to a light scene. On top of that, rtings literally tests their televisions as PC monitors.

Have you ever been to AVSForum? I suggest you pick a thread on a popular TV model. You'll quickly see how every single TV that comes out gets scrutinized by owners in ways you would never even have thought to scrutinize something. If the Vizio P (or any FALD set) had a backlight that was 500ms behind the actual picture, someone would have noticed it almost immediately. Someone would have noticed if it was even 50ms behind. Seriously, just pick any thread on any model of a popular television.

Oddly enough, I own a Panasonic plasma that had very quick and subtle brightness pops when it shipped. It was noticed by someone almost immediately, and Panasonic replaced people's A-boards to fix it. A 500ms lag between the backlight and the panel itself would be spotted on day 1 by someone.





> FALDs are just not very good across the board.


Let's say that FALD lag is an actual thing that, for some reason, no owner has ever noticed, but that was suddenly discovered by a random guy on OCN who has never even seen a FALD in person. They would simply sync up the backlight with the picture via a firmware update. I mean, do you think they're just slapping FALD backlighting in these TVs and hoping it syncs up with the picture? Seriously?


----------



## frunction

https://www.hardwareluxx.de/index.p...-sync-wohl-erst-im-3-oder-4-quartal-2018.html

update:

At least from ASUS we have now received a response. According to them, the ROG PG27UQ with UHD resolution, 144 Hz, and G-Sync HDR will finally be available in May. There is still no date for the ROG PG35UQ, however, so the second half of 2018 continues to be the estimate.


----------



## toncij

frunction said:


> https://www.hardwareluxx.de/index.p...-sync-wohl-erst-im-3-oder-4-quartal-2018.html
> 
> update:
> 
> At least from ASUS we have now received a response. According to him, the ROG PG27UQ with UHD resolution, 144 Hz and G-Sync HDR will finally be available in May. However, there is still no date for the ROG PG35UQ and so the second half of 2018 will continue to be mentioned.


One more piece of unverifiable info...

Anyway, the PA32UC is out and its FALD seems to be a miss.


----------



## CallsignVega

No one runs any FALD TV's with a PC with the FALD enabled. The input lag would be atrocious. Almost all game modes/PC modes on TV's turn their FALD off.


----------



## Scotty99

CallsignVega said:


> No one runs any FALD TV's with a PC with the FALD enabled. The input lag would be atrocious. Almost all game modes/PC modes on TV's turn their FALD off.


I'm beginning to think literally no one on this forum has ever used a FALD TV; they just spout off things they think are correct from a Google search on FALD, lol.

Everyone leaves FALD on when using one as a PC monitor (you would have to be crazy to turn it off; that's the main selling point of these TVs). What you lose is HDR support (because that port bypasses the chip), not FALD, on the 120Hz inputs.

From
https://www.neogaf.com/threads/game...ut-lag-worsens-picture-quality.1324844/page-3

> Nope. Vizio P55 run game mode properly calibrated and local dimming works perfect. Use same setting as all my gear runs to HDMI 5 on my set thru my AVR.


----------



## animeowns

frunction said:


> https://www.hardwareluxx.de/index.p...-sync-wohl-erst-im-3-oder-4-quartal-2018.html
> 
> update:
> 
> At least from ASUS we have now received a response. According to him, the ROG PG27UQ with UHD resolution, 144 Hz and G-Sync HDR will finally be available in May. However, there is still no date for the ROG PG35UQ and so the second half of 2018 will continue to be mentioned.


Thank you. Now that I have a set date, I'll just wait until May and pre-order the PG27UQ as soon as I can.


----------



## keikei

^Oh man, that is some good news. I may hold out for a larger display though as I'm constantly finding myself scrounging for desktop space when browsing.


----------



## Malinkadink

keikei said:


> ^Oh man, that is some good news. I may hold out for a larger display though as I'm constantly finding myself scrounging for desktop space when browsing.


Are you on 1080p? If you're on 1440p, nothing will really change with the 4K 27", as you'll need to use 150% scaling (which makes it behave like a 1440p 27") for the display to even be usable on the desktop; otherwise everything will be far too small.
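The scaling math behind that claim, as a quick sketch (helper names are made up; PPI from the pixel diagonal, logical resolution = native resolution divided by the scale factor):

```python
# Sketch: pixel density and effective desktop space after DPI scaling.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def effective_resolution(width_px, height_px, scale):
    """Logical desktop resolution after DPI scaling (UI sizing equivalent)."""
    return round(width_px / scale), round(height_px / scale)

print(round(ppi(3840, 2160, 27)))            # ~163 PPI, vs ~109 for 1440p 27"
print(effective_resolution(3840, 2160, 1.5)) # (2560, 1440): same layout as 1440p
```

So at 150% scaling the UI occupies the same layout as a 1440p 27" screen, just rendered sharper.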


----------



## CallsignVega

Scotty99 said:


> Im beginning to think literally no one on this forum has ever used a FALD tv, they just spout off things they think is correct from a google search on fald lol.
> 
> Everyone uses fald on as a PC monitor (you would have to be crazy to turn it off, thats the main selling point of these TV's), what you lose is HDR support (because that port bypasses the chip) not fald on the 120hz inputs.
> 
> From
> https://www.neogaf.com/threads/game...ut-lag-worsens-picture-quality.1324844/page-3
> Nope. Vizio P55 run game mode properly calibrated and local dimming works perfect. Use same setting as all my gear runs to HDMI 5 on my set thru my AVR.


You are in over your head. Pretty hilarious for someone to be quoting some random people talking on a fairly obscure forum from 2016 like they've found the encyclopedia britannica. 

https://www.rtings.com/tv/reviews/vizio/p-series-xled-2017

The only times the P55 is considered to have low input lag are 1080p at 60/120 Hz and 4K at 60 Hz with reduced chroma. Both are terrible modes for PC gaming and in no way compare to a PC monitor. All other modes have horrid 50+ ms input lag.

That's not even getting into the fact that having only 128 FALD zones is really bad in the first place. There are no 4K FALD TVs that run with FALD active at 4K/60 Hz 4:4:4 chroma and/or HDR with acceptable (sub-30ms) input lag.


----------



## Scotty99

CallsignVega said:


> You are in over your head. Pretty hilarious for someone to be quoting some random people talking on a fairly obscure forum from 2016 like they've found the encyclopedia britannica.
> 
> https://www.rtings.com/tv/reviews/vizio/p-series-xled-2017
> 
> The only times the P55 is considered to have low input lag are 1080p at 60/120 Hz and 4K at 60 Hz with reduced chroma. Both are terrible modes for PC gaming and in no way compare to a PC monitor. All other modes have horrid 50+ ms input lag.
> 
> That's not even getting into the fact that having only 128 FALD zones is really bad in the first place. There are no 4K FALD TVs that run with FALD active at 4K/60 Hz 4:4:4 chroma and/or HDR with acceptable (sub-30ms) input lag.


What, lol? Your claim was that FALD does not work in game mode; clearly that is not the case. None of you people in this thread know how FALD works on TVs; you only started researching FALD when you heard some monitors were coming out with it. If anyone is "in over their head," it's the people spouting nonsense about FALD without owning or researching TVs that use the tech.

1. It's not easy to do; only a couple of manufacturers have even shipped an acceptable version of this. A lot of it comes down to the coding of the FALD, as software updates can and have fixed the FALD implementations on those TVs.
2. The people with champagne wishes and beer budgets are never getting their 15,000-zone FALD monitors; it's far too expensive.


----------



## CallsignVega

Game mode is worthless for PC gaming monitor use if it completely gimps the display. Which it does on those Walmart Vizio TV's.

It's quite hilarious that you think you are the only one that "understands" something as simple as FALD. FALD is and has always been garbage. Most people haven't used FALD TV's for computer use because OLED does a far superior job, while keeping input lag in check.


----------



## Scotty99

CallsignVega said:


> Game mode is worthless for PC gaming monitor use if it completely gimps the display. Which it does on those Walmart Vizio TV's.
> 
> It's quite hilarious that you think you are the only one that "understands" something as simple as FALD. FALD is and has always been garbage. Most people haven't used FALD TV's for computer use because OLED does a far superior job, while keeping input lag in check.



You need to use game mode on TVs if you want to get down to reasonable latency numbers, and contrary to your belief, FALD is not disabled in game mode, lol.

I am the only one who understands FALD, apparently; everyone else in this thread has made ridiculous assertions about the tech. OLED is obviously superior, but that isn't what this discussion is about. It's about what FALD can and can't do, and it is likely the reason these monitors are being pushed back: doing it right takes smart people not only designing the backlight system, but even brighter people writing the algorithms to control it.


----------



## CallsignVega

It is entirely relevant. No one who knows what they are doing is going to buy a 4K TV and be forced to run 1080p or reduced chroma exclusively just to get tolerable input lag via game mode. Who gives a crap whether a lowly 128-zone FALD is working or not if your picture quality is garbage.

It may not be all, but on the vast majority of FALD TVs, local dimming is turned off to get "game mode" acceptable levels of input lag, since game mode bypasses the image processor. I've not only owned a FALD TV (JS9500), but a FALD monitor as well (Dell). FALD on TVs has only started to become fast enough in game mode on a few sets after they've been made completely irrelevant by OLED TVs (unless price is a huge factor): [OLED] TVs which can keep input lag low while displaying an incredibly better picture with full chroma.

These monitors are taking so long to come out because they realize FALD sucks. The recently released and reviewed 32" ASUS FALD, a 2018 display that is likely the panel in the G-Sync display coming later this year, has FALD that sucks.


----------



## Scotty99

There are only two decent FALD TVs on the market, and the JS9500 is not one of them (Sammy has never made a good FALD TV). Those are the Vizio P series and the Sony Z9D. Both of these sets are able to keep FALD active while running in game mode. I think you are confusing the 120Hz 1080p input with game mode; those ports do indeed bypass the SoC, but the FALD must be controlled by a different (proprietary) chip. If this weren't the case, no one would be using these TVs hooked to an HTPC/Steam box; they would look horrid without an active FALD during gaming.

FALD does not suck. It's a major boon to LED TVs; it's just hard to do right and can be very expensive, to the point where they have to limit the number of zones or risk encroaching on OLED pricing.


----------



## CallsignVega

You have to use HDMI port 5 on the Vizio P series to get the acceptable game-mode input lag. That port cannot do HDR and cannot do 4:4:4 chroma; you have to gimp your image on this TV to use it properly on a computer. That's how this whole topic started: can you use a FALD TV as a computer monitor with low input lag? Since 4:2:0 chroma for computer monitor use looks worse than Rosie O'Donnell, I think most would disqualify it from meeting that criterion.

This is my testing with 384 zone FALD:

[photos not preserved]
It was absolutely atrocious. Even the mighty Z9D with over 600 zones (which is actually just as expensive as OLED) had significant blooming/haloing when I viewed it multiple times at Magnolia.

I think time will tell whether these G-Sync FALDs are a disaster. I do know that starting off with 27" 4K was a bad decision.


----------



## Scotty99

What does that have to do with anything, lol? This topic is about FALD implementations, not Vizio TVs; it went that direction because the Vizio P series is one of the only TVs on the market that has an acceptable FALD.

You guys really are wanting too much from FALD from what I see. It's going to have imperfections; it's not OLED. But it still brings awesome benefits to LED panels that make them overall better products.


----------



## ryan92084

One of the big problems I see with newer FALD (I own an old Toshiba Regza) is that it really doesn't play nice with HDR on monitors. This was especially evident on the Dell, where you couldn't separate the two in the settings and it locked out the brightness controls. FALD only works well when operating at a reasonable brightness, imo.
http://www.overclock.net/forum/26550480-post1199.html and http://www.overclock.net/forum/26550979-post1220.html Gross.


----------



## friendlys

CallsignVega said:


> You have to use HDMI port 5 on the Vizio P series to get the acceptable game mode input lag. This port cannot do HDR, cannot do 4:4:4 chroma. You have to gimp your image on this TV to use it properly on a computer. That's how the whole topic started, can you use a FALD TV as a computer monitor with low input lag. Since 4:2:0 chroma as computer monitor use looks worse than Rosie O'donnell, I think most would disqualify it from meeting that criteria.
> 
> This is my testing with 384 zone FALD:
> 
> [photos not preserved]
> 
> It was absolutely atrocious. Even the mighty Z9D with over 600 zones (which is actually just as expensive as OLED) had significant blooming/halo'ing when I viewed it multiple times at Magnolia.
> 
> I think time will tell if these G-Sync FALD's are a disaster. I do know that starting off with 27" 4K was a bad decision.


To each his own. Obviously, based on the emblem next to your name, you're a big advocate of OLED. I went with the P65 in 2016 instead of a 4K OLED because at the time OLED didn't offer 1080p 120fps, and it had ABL, screen burn-in, and that extra white pixel which makes the display WRGB, not RGB. BTW, I can play 4:4:4 1080p 120fps HDR PC games on inputs 1 to 4 with little to no lag; that input lag number was never tested by rtings. Remember, this was purchased in 2016. In addition, the Vizio P series has an option called Clear Action, which is basically strobing; it is 100% undetectable @ 120Hz and results in zero blurring. The way I look at it, there is no perfect PC gaming TV yet, but it may come in 2019 with the implementation of HDMI 2.1.


----------



## ToTheSun!

friendlys said:


> to each his own. Obviously based on your emblem next to your name your a big advocate of oled. I went with the P65 in 2016 instead of a 4k oled because at the time there wasn't 1080p 120 fps available, ABL, Screen burn in, and that extra white pixel which makes your display wrgb not rgb. BTW, I can play 444 1080p 120fps hdr pc games on input 1 to 4 with little to no lag. The input lag number was never tested by rtings. Remember, this was purchased in 2016. In addition, Vizio P series has an option called clear action which is basically strobing and is 100% undetectable @ 120hz and results in zero blurring. The way I look at it, there is no perfect pc gaming tv yet but it may come in 2019 with the implementation of hdmi 2.1


I agree with you that there is no "perfect" TV for PC usage on the market (the one coming closest, in my opinion, not being relevant to my post), but to insinuate that LG's WRGB structure in their OLED TVs is a point against them is disingenuous. It is not even remotely similar to the WRGB structure in their IPS TVs, and it doesn't have the latter's inherent flaws.


----------



## friendlys

CallsignVega said:


> No one runs any FALD TV's with a PC with the FALD enabled. The input lag would be atrocious. Almost all game modes/PC modes on TV's turn their FALD off.


False. On the Vizio P series, input lag increases by less than 1 ms with local dimming enabled, according to the former CTO of Vizio, who was a member of AVSForum.


----------



## boredgunner

CallsignVega said:


> You have to use HDMI port 5 on the Vizio P series to get the acceptable game mode input lag. This port cannot do HDR, cannot do 4:4:4 chroma. You have to gimp your image on this TV to use it properly on a computer. That's how the whole topic started, can you use a FALD TV as a computer monitor with low input lag. Since 4:2:0 chroma as computer monitor use looks worse than Rosie O'donnell, I think most would disqualify it from meeting that criteria.
> 
> This is my testing with 384 zone FALD:
> 
> [photos not preserved]
> 
> It was absolutely atrocious. Even the mighty Z9D with over 600 zones (which is actually just as expensive as OLED) had significant blooming/halo'ing when I viewed it multiple times at Magnolia.
> 
> I think time will tell if these G-Sync FALD's are a disaster. I do know that starting off with 27" 4K was a bad decision.


Thanks for the photo demonstration, as usual. Yeah, FALD LCD is useless; the number of dimming zones needed to make it good is just unfeasible. Also, props for testing a good and not widely enough known game in that photo.


----------



## MistaSparkul

boredgunner said:


> Thanks for the photo demonstration as usual. Yeah, FALD LCD is useless. The amount of dimming zones needed to make it good is just unfeasible. Also props for testing a good and not widely known enough game in that photo.


Shadow Warrior 2 is also one of the few PC games with HDR10 support. The other games I can think of are Mass Effect: Andromeda and Destiny 2, both of which are crappy games.


----------



## levifig

I don’t even want 144Hz, but get me 4K at 120Hz on a panel 36’’ or bigger and I’ll be all over it. Bonus points for HDR (but not really that meaningful).

I’m rocking an LG 43UD79-B right now. Love the size for productivity, but I wouldn’t mind it being a tiny bit smaller (38-40in), and would looove for it to be 120Hz! 

FWIW, bigger panels = cheaper. These “tiny” 27’’ 4k panels are always gonna be more expensive because their manufacturing costs are significantly higher (more “dots per inch”).

What I really want is a panel like LG’s new 34WK95U-W, but maybe 38in (like the 38UC99-W) and **with 120Hz**!


----------



## keikei

MistaSparkul said:


> Shadow Warrior 2 is also one of the few PC games with HDR10 support. The other games I can think of are Mass Effect:Andromeda and Destiny 2, both of which are crappy games.


It's a slow acceptance process for sure with HDR, but it's getting more prevalent. That said, I did see a patch for Injustice 2 regarding support a few months back. FFXV as well. I'm sure there are more, but those are the ones I'm interested in or have atm.


----------



## ToTheSun!

levifig said:


> I don’t even want 144Hz, but get me 4K at 120Hz on a panel 36’’ or bigger and I’ll be all over it. Bonus points for HDR (but not really that meaningful).
> 
> I’m rocking an LG 43UD79-B right now. Love the size for productivity, but I wouldn’t mind it being a tiny bit smaller (38-40in), and would looove for it to be 120Hz!
> 
> FWIW, bigger panels = cheaper. These “tiny” 27’’ 4k panels are always gonna be more expensive because their manufacturing costs are significantly higher (more “dots per inch”).


That's not always the case. There are $300 24'' 4K monitors, and even cheap $300-400 27" 4K monitors. It has very little to do with DPI and more to do with defect tolerance, supply vs demand, and RoI.

Your 43" monitor is cheap for its size because it likely houses a panel that was already in production for TVs or other uses, and it doesn't even have a flicker-free backlight.


----------



## sblantipodi

So if FALD is bad, the next 4K monitors will be bad. What's the best 4K gaming monitor for 2018, then?


----------



## MistaSparkul

sblantipodi said:


> So if FALD is bad next 4K monitors will be bad, so what's the best 4K gaming monitor for 2018?


Absolutely nothing. The only 4K gaming monitors you can buy now are the same crappy 60Hz models that have been around for years already, like the Acer XB321HK. Looks like we'll be waiting until 2019 or later for a decent 4K 144Hz monitor.


----------



## dansi

ToTheSun! said:


> That's not always the case. There are $300 24'' 4K monitors, and even cheap $300-400 27" 4K monitors. It has very little to do with DPI, and more to do with defect tolerance, offer vs demand, and RoI.
> 
> Your 43" monitor is cheap for its size because it likely houses a panel that was already in production for TV's or other ends, and it doesn't even have a flicker-free backlight.


Yes, I steer clear of LG LCDs: 6-bit+FRC PenTile RGBW panels, IPS glow, 700-800:1 native contrast. Terrible for PC use.


----------



## animeowns

sblantipodi said:


> So if FALD is bad next 4K monitors will be bad, so what's the best 4K gaming monitor for 2018?


Most likely the 65-inch 120Hz G-Sync 4K one.


----------



## levifig

ToTheSun! said:


> That's not always the case. There are $300 24'' 4K monitors, and even cheap $300-400 27" 4K monitors. It has very little to do with DPI, and more to do with defect tolerance, offer vs demand, and RoI.
> 
> Your 43" monitor is cheap for its size because it likely houses a panel that was already in production for TV's or other ends, and it doesn't even have a flicker-free backlight.


Yes and no. Yes, LG is probably trying to make use of 43’’ panels they no longer use in quantity for TVs, but this is a REALLY nice IPS panel, which is not that common in TVs anyway. As for the flicker-free backlight, it does have it: a firmware update increased the PWM frequency from 110Hz to 480Hz, so there’s no flicker whatsoever. This specific panel is a bit odd in terms of market positioning: it’s not “cool” and “fancy,” so they charged less…


----------



## ttnuagmada

Just to put things to rest, Mark Henninger of AVSForum measured sub-20ms input lag on the new FALD Samsung Q9. The FALD can't even be disabled on this set.

http://www.avsforum.com/first-impressions-2018-samsung-q9f-65-qled-4k-hdr-tv/


----------



## ToTheSun!

ttnuagmada said:


> Just to put things to rest, Mark Henninger of AVSforum measured a sub 20ms input lag on the new FALD Samsung Q9. The FALD can't even be disabled on this set.
> 
> http://www.avsforum.com/first-impressions-2018-samsung-q9f-65-qled-4k-hdr-tv/


The first impressions, especially contrast and estimated dimming zone count, are very good!


----------



## ttnuagmada

ToTheSun! said:


> The first impressions, especially contrast and estimated dimming zone count, are very good!


There are a few members who already have sets; people are saying the motion is still not as good as Sony's. Another guy said the set is no better than his KS9800.

It would be nice to know what the static panel contrast is, but there's no way to measure that if the FALD can't be turned off. I bet there's some way to do it in the service menu. D-Nice has one too; it will be interesting to see his thoughts on it.


----------



## CallsignVega

ttnuagmada said:


> Just to put things to rest, Mark Henninger of AVSforum measured a sub 20ms input lag on the new FALD Samsung Q9. The FALD can't even be disabled on this set.
> 
> http://www.avsforum.com/first-impressions-2018-samsung-q9f-65-qled-4k-hdr-tv/


Not as many zones as the Z9D and smallest set is 65". No reason to ever get this over a 2018 OLED.


----------



## Scotty99

CallsignVega said:


> Not as many zones as the Z9D and smallest set is 65". No reason to ever get this over a 2018 OLED.


Really depends on the person. Colors on Samsung sets are actually so much better than anyone else's on the market that I would not fault someone for picking one over an OLED (not myself, but I get it).


----------



## animeowns

CallsignVega said:


> Not as many zones as the Z9D and smallest set is 65". No reason to ever get this over a 2018 OLED.


What's the point in upgrading from a 2017 OLED to a 2018 OLED? Will the 2018 OLEDs have HDMI 2.1? I know the 2019 models will.


----------



## CallsignVega

Not huge changes, but some testing is coming in from AVSForum. The 2018s are slightly brighter and have larger pixels and logo dimming, both very helpful in reducing burn-in propensity. Also a better processor for watching movies. Input lag remains the same as the 2017s, at an acceptable 21ms.

Buy a 2018, sell it in 2019, and buy a 2019. Problem solved.


----------



## Glerox

https://www.anandtech.com/show/12561/nvidia-expects-4k-144-hz-gsync-hdr-displays-april

"NVIDIA Expects 4K 144 Hz G-Sync HDR Displays to Launch in April. Technically, it's still Q1 because NVIDIA’s Q1 FY2019 (2019 because NV's fiscal year is one year ahead) ends on April 29, 2018..."

Wanna bet it's never gonna release in April?


----------



## Malinkadink

Glerox said:


> https://www.anandtech.com/show/12561/nvidia-expects-4k-144-hz-gsync-hdr-displays-april
> 
> "NVIDIA Expects 4K 144 Hz G-Sync HDR Displays to Launch in April. Technically, it's still Q1 because NVIDIA’s Q1 FY2019 (2019 because NV's fiscal year is one year ahead) ends on April 29, 2018..."
> 
> Wanna bet it's never gonna release in April?



Wanna bet it's gonna release in April and have useless local dimming, and in effect useless HDR, given the low contrast ratio from it being an IPS?

I really would rather just have a 4K 144Hz monitor without LD and HDR if it meant it cost less than $1k.


----------



## ttnuagmada

Scotty99 said:


> Really depends on the person, colors on samsung are actually so much better than anyone else on the market i would not fault someone for picking it over an oled (not myself, but i get it).


That's a very big overstatement. rtings measured the Q9 as having 2% better gamut coverage than the E7, which is not even a visually noticeable difference. Their color volume claims from last year are bunk too, because the set would need to hit its minimum black level and max white output simultaneously in the same zone to have a real 100% color volume. Their "100% color volume" claim is literally no better than dynamic contrast numbers.


----------



## Scotty99

ttnuagmada said:


> Thats a very big overstatement. rtings measured the Q9 as having a 2% better gamut coverage than the E7. That is not even a visually noticeable difference. Their color volume claims from last year are bunk too, because the set would need to be able to do its minimum black level and max white output simultaneously in the same zone to have a real 100% color volume. Their "100% color volume" claim is literally no better than dynamic contrast numbers.


It doesn't sound like a big difference on paper, but that is honestly why Samsung sets sell so well: the vibrancy they put out. It really is a noticeable difference when you see the sets side by side; their quantum dot tech combined with a better screen coating just gives a more lively picture that most people deem as "better" for normal viewing activities (cable TV looks a lot better on Samsung, for example).

I'd never pick an LED TV over an OLED, but there is a market for bright, vibrant displays, something OLED currently can't do.

Side note: isn't the goal of QLED to be self-emitting like OLED? I remember first reading about it a few years ago, but the tech still relies on a backlight.


----------



## ttnuagmada

I don't mean to argue, but I think you're confusing picture elements. The SUHD/QLED sets can get really bright; that might be what you're talking about, I guess? Yes, they get brighter than OLED, but OLED sets put out double the nits of any TV made prior to 2013, and that kind of brightness is overkill for anything other than HDR highlights. The QLEDs look extra impressive on a showroom floor due to their ability to get searing bright, combined with the showroom lighting hiding the inferior contrast. Believe me when I say you aren't going to want to watch most types of content with the brightness turned up like that. You will get eyestrain, and the poor blacks will be readily apparent. In a dim or dark room, the OLED is going to look more vibrant, because contrast is what gives a picture that "pop".

It's important to know what QLED is; at this stage, it's absolutely nothing other than a film that is used to turn a blue LED into a white light. This creates a purer white than typical whitening filters do. However, the only aspect of the picture this improves at all is the maximum color gamut/volume. This isn't even something you benefit from unless watching a UHD movie, and the improvements to gamut over current OLED models are too small to even be discernible by the human eye.

I personally own a KS8000, which is the model before they randomly decided to start calling them "QLED". It has a very slightly lower gamut/volume coverage than the QLED models, but is otherwise as good or better than the Q7 according to most reviews. The 2016 panels have a higher static contrast and get searing bright like the QLEDs. It uses quantum dots in the same manner as the QLEDs. The KS8000's picture is great for an LCD, but in a dark room, it still can't touch my Panasonic ST60 plasma for non-HDR content. There is nothing more "vibrant" about the colors compared to the plasma (possibly the opposite, actually) when both are calibrated (I have a colorimeter/spectro/Calman). The KS8000's native color gamut reaches a higher saturation than the ST60's, but that's useless for anything but HDR content, and otherwise just oversaturates the colors, making them inaccurate.

Bottom line: don't compare TVs on a showroom floor. Your eyes will play tricks on you.



> that most people deem as "better" for normal viewing activities (cable TV looks a lot better on Samsung, for example).


Who is "most people"? Again, not to be rude, but you're saying these things, and there is not really anything out there that backs any of that up. You read like a collection of pro-Samsung sinpets that you read somewhere. Rtings for example, rates the C7 as the best TV for actual cable/television watching. 



> I'd never pick an LED TV over an OLED, but there is a market for bright, vibrant displays, something OLED currently can't do.


They simply get super bright, and that's only relative to the OLED. Relative to any TV made prior to 2013, or hell, even most TVs in general, the OLED gets as bright or brighter. The OLED gets bright enough that all of the pros at the 2017 VE shootout still gave it the best grade for HDR, even over the Q9 and Z9D. Like I said earlier, and this is from experience as I own a 2016 Samsung: you are absolutely not ever going to turn it up as bright as it will go to watch normal content in normal viewing conditions. It will literally make your eyes hurt. That crazy brightness is only there for occasional highlights in HDR content.



> Side note: isn't the goal of QLED to be self-emitting like OLED? I remember first reading about it a few years ago, but the tech still relies on a backlight.


Eventually. Right now the implementation in the QLED sets is absolutely no different than in any quantum dot display made in the last few years. It's just red/green QDs lit by a blue LED to create a white backlight. They've just improved it incrementally, causing gamut coverage to go up a couple of percent every year. The current QLED naming is just marketing. The 2017 models were really not any better than the 2016 model Samsungs (the KS9800 was better than any QLED set from 2017).

The next step is to put the QDs in the color filter, which should allow for increased brightness (they won't need the filter with the QD film, so more light will get through) and increased viewing angle. Whenever they finally do this (maybe 2019) will be the first time it would be worthy of the QLED moniker, IMO.


----------



## animeowns

Glerox said:


> https://www.anandtech.com/show/12561/nvidia-expects-4k-144-hz-gsync-hdr-displays-april
> 
> "NVIDIA Expects 4K 144 Hz G-Sync HDR Displays to Launch in April. Technically, it's still Q1 because NVIDIA’s Q1 FY2019 (2019 because NV's fiscal year is one year ahead) ends on April 29, 2018..."
> 
> Wanna bet it's never gonna release in April?


Hell yeah, ready to get down with a pre-order. I hope an April release date holds some validity, Nvidia.


----------



## Scotty99

I've owned a KS8000, a Vizio P55, and a B6 from 2016. There is simply something about Samsung displays that has that pop no other screens do; it's a combination of nits, quantum dots, and the screen coatings they use.

Really don't feel the need to bullet-point everything you mentioned, to avoid cluttering the thread up; it simply boils down to having owned these screens and using them on the daily. Rtings is a nice starting point, but it's not going to explain the differences to you before buying. For example, they rate "best TV for cable" on how well it upscales crappy 1080i broadcasts to 4K; while that's surely part of it, that's not everything.


----------



## toncij

I had to double-check in what thread I am...
Anyway, I'll preorder 3 of these if they actually release. But I seriously doubt it...


----------



## hrockh

I hope they stick with the April release..


----------



## ttnuagmada

Scotty99 said:


> I've owned a KS8000, a Vizio P55, and a B6 from 2016. There is simply something about Samsung displays that has that pop no other screens do; it's a combination of nits, quantum dots, and the screen coatings they use.
> 
> Really don't feel the need to bullet-point everything you mentioned, to avoid cluttering the thread up; it simply boils down to having owned these screens and using them on the daily. Rtings is a nice starting point, but it's not going to explain the differences to you before buying. For example, they rate "best TV for cable" on how well it upscales crappy 1080i broadcasts to 4K; while that's surely part of it, that's not everything.


This is clearly an issue of not having the settings dialed in. There is a reason that literally no one shares your opinion, especially with the edge-lit models. Most don't even consider the Samsung sets to be as good as Sony's. Quantum dots don't have anything to do with anything you think you're perceiving. QDs are just the particular method Samsung uses to reach a high color gamut, which you aren't even benefiting from in most content. It is not the only way to do it, either. You are being bamboozled by marketing.

Also, the P55 is IPS. Even FALD can't save an IPS screen from dumpster contrast.

If you think your KS8000 looks better than the B6, then you did something wrong. The KS8000 doesn't even look better than my mid-tier plasma from 5 years ago.


----------



## Scotty99

Ya, you read wrong there, bud, lol. OLED is a preference of mine because I watch movies in a dimly lit room; I am simply pointing out the fact that Samsung TVs have a vibrance that OLED cannot match today. It's something you have to experience for yourself.

You sound like a dude who sits online reading reviews of this and that and never actually buys these products; get a Samsung QLED in your home next to an OLED and you will understand what I'm talking about. OLED has limitations beyond brightness. I don't know if it's the screen coating they use or what, but it comes off far less vibrant than Samsung pulls off.

Side note: the P55 is the best TV in Vizio's lineup as an all-arounder. What you give up in contrast is more than made up for with incredible viewing angles, and I think it looks better than the 65" in regards to colors (I've owned both).


Edit:
BTW, the Samsung KS8000 from 2016 is the 7th-highest-rated TV Rtings has ever reviewed, tying the Sony X930E. It's basically considered one of Samsung's best TVs ever made, and is superior to all of the 2017 lineup. Here is a video comparison of a 2016 OLED vs the KS8000 to try and give you an idea of what I mean by vibrance:





Quote from the guy who made the video: (basically shares my thoughts on the sets)
"I'm finding the ks8000 better for daytime watching, sports, pc usage, more accurate colors and better reproduction of bright scenes. The OLED is better for very dark scenes and night time movies. Scenes where there are dark areas and bright areas on the same image is a coin flip. I tend to reach for the Samsung remote more often than the LG"

It really comes down to each individual person and their habits as to which TV set they will like best; that is all I'm trying to get across.

But anyways, that will be my last post about TVs in here; just wanted to make a point about FALD and it went off the rails.


----------



## ttnuagmada

Scotty99 said:


> Ya, you read wrong there, bud, lol. OLED is a preference of mine because I watch movies in a dimly lit room; I am simply pointing out the fact that Samsung TVs have a vibrance that OLED cannot match today. It's something you have to experience for yourself.



You are literally talking about brightness. Your video does nothing but show what the KS looks like next to an OLED when the backlight is turned up high enough to make it brighter than the OLED. Placing the OLED next to the QLED with the QLED's backlight turned up is exactly why you don't even know what you're looking at, and your comparison is meaningless.

I also own a KS8000. When both are calibrated for the same light output, there is nothing more "vibrant" about it compared to my ST60, much less an OLED. You are misusing terms and making poor comparisons. Placing a brighter TV next to a dim one is not how you compare displays, especially in a dark room; your eyes (or camera) will adjust to the brighter display.

The KS8000 is a great TV, but it does not have some sort of magic intangible PQ quality. That is literally your eyes playing tricks on you.



> You sound like a dude who sits online reading reviews of this and that and never actually buys these products


I own the television you think is some sort of magical, vibrant witchcraft. You sound like a dude who probably has the color space set to Native while he watches SDR content, and thinks the inaccurate, over-saturated colors mean that the TV is more "vibrant".


----------



## keikei

Glerox said:


> https://www.anandtech.com/show/12561/nvidia-expects-4k-144-hz-gsync-hdr-displays-april
> 
> "NVIDIA Expects 4K 144 Hz G-Sync HDR Displays to Launch in April. Technically, it's still Q1 because NVIDIA’s Q1 FY2019 (2019 because NV's fiscal year is one year ahead) ends on April 29, 2018..."
> 
> Wanna bet it's never gonna release in April?


I'm glad to see no late-2018 delay. It seems Asus is the only brand in town to have these babies out this year? I'm very surprised Dell hasn't announced anything 4K/high-Hz. They have an 8K monitor available, for god's sake.


----------



## Glerox

keikei said:


> I'm glad to see no late-2018 delay. It seems Asus is the only brand in town to have these babies out this year? I'm very surprised Dell hasn't announced anything 4K/high-Hz. They have an 8K monitor available, for god's sake.


Maybe they don't have the exclusivity of Nvidia's infamous partner program.


----------



## Malinkadink

ttnuagmada said:


> You are literally talking about brightness. Your video does nothing but show what the KS looks like next to an OLED when the backlight is turned up high enough to make it brighter than the OLED. Placing the OLED next to the QLED with the backlight on the QLED turned up, is exactly why you don't even know what you're looking at, and your comparison is meaningless.
> 
> I also own a KS8000. When both are calibrated for the same light output, there is nothing more "vibrant" about it compared to my ST60, much less an OLED. You are misusing terms and making poor comparisons. Placing a brighter TV next to a dim one is not how you compare displays, especially in a dark room. Your eyes (or camera) will adjust to the brighter display.
> 
> The KS8000 is a great TV, but it does not have some sort of magic intangible PQ quality. That is literally your eyes playing tricks on you.
> 
> 
> 
> I own the television you think is some sort of magical, vibrant witchcraft. You sound like a dude who probably has the color space turned on Native while he watches SDR content, and thinks the inaccurate over-saturated colors means that the TV is more "vibrant"


I have a C7 in my room, calibrated, and an ES8000 from 2012 in the living room that my parents use. The day they got that TV, they thought it looked great in the vivid, overly saturated, maximum-brightness mode. I can see how, to untrained eyes, that image can look attractive; it does pop and get your attention. I've since calibrated the ES8000 as well, and it looks really good now, or well, as good as a 2012 1080p set can anyway.

Comparing displays out of the box is definitely a poor way to do it, especially when they're not even the same technologies. Calibrate them both to the same target and the OLED will absolutely poo all over the LCD until you throw up a 5% grey slide.


----------



## NewType88

I'm pretty new to PCs and have never bought a high-end monitor. Do new monitors like this, in this price range, sell out quickly at launch? I figure it's not a mainstream display because of the price, but the fact that they would produce them at lower volumes and have fewer units available initially at launch might make them more scarce.

Let's assume all the features are working as desired: would I be waiting that much longer to buy one, or should I pre-order it? I'd rather wait to see what the reviews say. I like the look of the Acer better, but I'm also impatient ;p


----------



## ttnuagmada

Malinkadink said:


> Calibrate them both to the same target and the OLED will absolutely poo all over the LCD until you throw up a 5% grey slide


Yeah, the perfect uniformity is one thing that will be hard to let go of when the time comes to upgrade the ST60. I'm hoping they'll have that under control by then.


----------



## toncij

Can someone confirm the alleged price of $2999? ://


----------



## CallsignVega

ttnuagmada said:


> You are literally talking about brightness. Your video does nothing but show what the KS looks like next to an OLED when the backlight is turned up high enough to make it brighter than the OLED. Placing the OLED next to the QLED with the backlight on the QLED turned up, is exactly why you don't even know what you're looking at, and your comparison is meaningless.
> 
> I also own a KS8000. When both are calibrated for the same light output, there is nothing more "vibrant" about it compared to my ST60, much less an OLED. You are misusing terms and making poor comparisons. Placing a brighter TV next to a dim one is not how you compare displays, especially in a dark room. Your eyes (or camera) will adjust to the brighter display.
> 
> The KS8000 is a great TV, but it does not have some sort of magic intangible PQ quality. That is literally your eyes playing tricks on you.


Very true. The layman goes into Best Buy and their jaw drops at whichever TV is the brightest. Some of us know better. A TV only needs to get quite bright for small HDR highlights; my C7 would make me squint in some HDR scenes, to the point that I had to turn it down. If you are watching a TV with bright sunlight coming through a wall of glass behind you, you are doing it wrong.

The only reason Samsung goes on and on about their LCD TV brightness is because they need that brightness to hide all the flaws. Just ask any pro; they'd take 800 nits with infinite contrast ratio over 1400 nits with LCD contrast ratio and haloing any day. 

Reminds me of people that compare speakers and it's always the loudest one that they think is best.
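The trade-off the pros are making can be put in numbers; a toy Python sketch, where the black levels are illustrative assumptions rather than measurements of any particular set:

```python
# Toy comparison behind "800 nits with infinite contrast beats 1400 nits with
# LCD contrast": static contrast is peak luminance over black luminance.
# The black levels below are illustrative assumptions, not measured values.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio; OLED-style true black gives an infinite ratio."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

lcd = contrast_ratio(1400, 0.05)   # bright FALD LCD with a ~0.05 nit black floor
oled = contrast_ratio(800, 0.0)    # OLED: pixels switch fully off
print(f"LCD ≈ {lcd:,.0f}:1, OLED = {oled}:1")  # LCD ≈ 28,000:1, OLED = inf:1
```

However bright the LCD gets, its contrast ratio stays finite because its black floor never reaches zero.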


----------



## brab

Glerox said:


> Wanna bet it's never gonna release in April?


You sir are correct!


----------



## Malinkadink

brab said:


> You sir are correct!


I don't like how that post says monitors don't wait for graphics cards, because they sure do when G-Sync is involved; that's kind of the whole point: GPU + monitor in sync to prevent screen tearing.


As for the rumored price of $3000 now? All I have is a big LOL for that.


----------



## MistaSparkul

Oh geez, $3k now. Can't we just get a "normal" 4K 144Hz monitor without all this FALD nonsense for a third of the price?


----------



## CallsignVega

And not at a tiny 27"...


----------



## dansi

CallsignVega said:


> Very true. The layman goes into best buy and their jaw drops at whichever TV is the brightest. Some of us know better. A TV only needs to get quite bright for small HDR highlights. My C7 would make me squint in some HDR scenes that I had to turn it down. If you are watching a TV with bright sunlight coming through a wall of glass behind you, you are doing it wrong.
> 
> The only reason Samsung goes on and on about their LCD TV brightness is because they need that brightness to hide all the flaws. Just ask any pro; they'd take 800 nits with infinite contrast ratio over 1400 nits with LCD contrast ratio and haloing any day.
> 
> Reminds me of people that compare speakers and it's always the loudest one that they think is best.


I have last year's B7 OLED; before I bought it, I was concerned about brightness.
Now I've been using it for a year, and loving it. Watching daytime TV, HDR games, and 4K Netflix.
The color stability is amazing; you cannot get that from LED backlighting. I connect my PC, and Windows 10 is like a 2D ink painting, so rich and stable.
No flowery light bleed into my colors! Brightness in the daytime is no issue! Plenty of richness. With SDR programming, it looks better than my old Trinitron tube CRT's colors!
I probably would have needed to buy the expensive Sony FALD ZD9 to come close.

Yes, I wish it could do 1000-nit HDR in games, but most games are still figuring out how to do HDR; mainly I get the bright-sun effect thing.

40" OLEDs need to come down to PC monitor space one day...


----------



## kot0005

I think these monitors are either false advertising for HDR/FALD and quantum dot, or just cheap versions of the real stuff.

Like, even with QD, they can only do 100% sRGB; Samsung's QD is 125%.

Does FALD have haloing issues because of high-latency hardware? Software?

And for HDR, I don't know how they hit 1000 nits, because the new HDR ProArt is rated at 400 nits but can magically hit 1000 nits in HDR...


----------



## ttnuagmada

kot0005 said:


> I think these monitors are either false advertising for HDR/FALD and quantum dot, or just cheap versions of the real stuff.
> 
> Like, even with QD, they can only do 100% sRGB; Samsung's QD is 125%.


Most manufacturers will only list up to 100%. I.e., it may do way more than that, but they're simply stating that it can do "full" sRGB.

Check out dells HDR monitor for instance:

http://www.dell.com/en-us/shop/dell...18q/apd/210-amvp/monitors-monitor-accessories

It does 100% Adobe RGB and 97.7% DCI-P3, so it obviously does well north of 100% sRGB, but saying so isn't really of any use to anyone if Adobe/DCI-P3 numbers are there as well. They have it listed as 100% sRGB.
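For a rough sense of why high DCI-P3 coverage implies well over 100% sRGB, you can compare gamut triangle areas in CIE 1931 xy space. This is a simplified sketch: published coverage figures intersect a panel's measured gamut with the reference gamut rather than just comparing areas.

```python
# Compare color gamut sizes as triangle areas in CIE 1931 xy chromaticity space.
# Simplified sketch: real "coverage" figures intersect a panel's measured gamut
# with the reference gamut instead of just comparing triangle areas.

def triangle_area(primaries):
    """Shoelace formula for the area of a gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]    # R, G, B primaries
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

ratio = triangle_area(DCI_P3) / triangle_area(SRGB)
print(f"DCI-P3 area ≈ {ratio:.0%} of sRGB")  # ~136%, hence "north of 100% sRGB"
```

So a panel that nearly fills DCI-P3 necessarily reaches saturations far outside the sRGB triangle.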





> FALD has haloing issues because of high latency hardware ? software ?
> 
> and for HDR idk how they hit 1000nits because the new HDR pro art rated at 400nits but can magically hit 1000nits in HDR..


Haloing is just the consequence of having a backlight that isn't nearly as granular as the number of pixels the display has. I.e., if a display had 500 zones, that's only 1 zone per ~16,600 pixels on a 4K display. IPS screens make it more noticeable due to the poor native contrast of IPS panels.
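The zone arithmetic above works out like this (quick sketch; the 500-zone count is the hypothetical figure from the post, not a spec for this monitor):

```python
# Back-of-envelope: how many pixels each FALD backlight zone has to cover.
# The 500-zone figure is a hypothetical, not any particular monitor's spec.

def pixels_per_zone(width: int, height: int, zones: int) -> float:
    """Pixels covered by each local-dimming backlight zone."""
    return (width * height) / zones

ratio = pixels_per_zone(3840, 2160, 500)  # 4K UHD panel
print(f"{ratio:,.0f} pixels per zone")  # 16,589 pixels per zone
```

Any bright object smaller than a zone forces the whole zone up, which is exactly the halo you see around subtitles and cursors.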


----------



## animeowns

Malinkadink said:


> I don't like how that post says monitors don't wait for graphics cards, because they sure do when G-sync is involved, that's kind of the whole point, GPU + Monitor in sync to prevent screen tearing.
> 
> 
> As for the rumored price of $3000 now? All I have is a big LOL for that.


Nothing official yet; maybe we will still get it in April. Keeping the hope alive, ready to pre-order early April with a late-April launch.


----------



## brab

Malinkadink said:


> As for the rumored price of $3000 now? All I have is a big LOL for that.


Where did you find that?
From what I saw before Acer was supposed to be in that range and Asus in the $2k range.



animeowns said:


> Nothing official yet; maybe we will still get it in April. Keeping the hope alive, ready to pre-order early April with a late-April launch.


I am more inclined to trust what an Asus administrator says than what Nvidia wishes. 
We are only talking about a month difference anyway.


----------



## animeowns

*4K 144Hz monitor updates: B&H Photo*

4K 144Hz updates; B&H chat log attached.


----------



## rvectors

I believe the latest rumour is 4K for $4k, a steal; you even get a halo effect thrown in for free. ...and with the DGX-2, we now have a GPU to run it at 144Hz.


----------



## sblantipodi

CallsignVega said:


> And not at a tiny 27"...


27" is a desktop monitor; if you need a bigger one, buy a TV and a sofa.


----------



## JackCY

CallsignVega said:


> And not at a tiny 27"...


27"... well depends where you sit, 70-80cm away it's fine and things don't tend to look gigantic as they do on 31.5": movies, UI elements, etc.
4k 30" would be better. If they price it anywhere near OLED TVs there is little point in getting these for the high price.

Local dimming is neat and useful the more zones the better, per subpixel at best.


----------



## MistaSparkul

32 inch is also a desktop monitor size.


----------



## bee144

*****. “It’s coming out in Q1 2018... oh wait, we can’t make that. April 2018 then. Oh wait, we can’t make that either. May 2018. Crap, now it will be LATE June 2018.” What a mess.


----------



## Profiled

sblantipodi said:


> 27" is a desktop monitor; if you need a bigger one, buy a TV and a sofa.


Can't wait!


----------



## toncij

I'd also wish for a 32" 4K one rather than 27"...


----------



## keikei

toncij said:


> I'd also wish for a 32" one 4K rather than 27"...


Is there a 32 inch version? That would be perfect for me. I'm also tempted looking at this LG.


----------



## DefenderCast

Feels like vaporware to me.


----------



## toncij

keikei said:


> Is there a 32 inch version? That would be perfect for me. I'm also tempted looking at this LG.


No, only 27".
That LG will probably never come out...


----------



## Sancus

It would be the perfect final nail in the coffin for this display to come out after 1.5 years of delays with a price point of $3000. And I bet they'll make it impossible to use the FALD with SDR content just to add insult to injury.


----------



## ryan92084

keikei said:


> Is there a 32 inch version? That would be perfect for me. I'm also tempted looking at this LG.





toncij said:


> No, only 27".
> That LG will probably never come out...


$1500 and June. B&H Photo is generally pretty good about release dates: https://www.bhphotovideo.com/c/product/1382968-REG/lg_34wk95u_34_nano_ips.html


----------



## animeowns

*PG27uq out for pre order on europe retailers*

Pre-orders are up for the PG27UQ on European sites, with an expected in-stock date of April 23rd. I hope this is also true for the USA stock date.

https://translate.google.com/transl...-art-90lm03a0-b01370-num-7368041/&prev=search


----------



## Glerox

2245 euros lol!


----------



## animeowns

Glerox said:


> 2245 euros lol!


How much is that in USD? I expected it would be $1500-2500; ready to pay.


----------



## Malinkadink

animeowns said:


> How much is that in USD? I expected it would be $1500-2500; ready to pay.


$2,765.53 in USD; do people even bother to Google anything anymore?

Anyways, good luck with paying $2500 for this thing. Absolute rip-off; can't wait to hear the horror stories and overall complaints that will surely follow these monitors, same as the 1440p 144Hz monitors before them.


----------



## 12345us3r

No matter how good the new 4K 144Hz monitors are gonna be, they will be overpriced.


----------



## CallsignVega

Glerox said:


> 2245 euros lol!


Did you not notice the free shipping?

Haha, that is what I paid for my new 2018 55" 4K OLED. Crazy price for a 27" 4K IPS monitor!


----------



## bee144

CallsignVega said:


> Did you not notice the free shipping.
> 
> Haha that is what I paid for my new 2018 55" 4K OLED. Crazy price for a 27" 4K IPS monitor!


Don't forget that it's an 8-bit panel as well!

If it is indeed $2,500, it's a rip-off. Wouldn't pay more than $2k. Gotta be $1.9k or less for what it's offering.


----------



## Fanu

It's a first-gen product; of course it's going to be **** expensive.

It will probably be a failure with a bunch of customer returns - just like most of these "cutting edge" gaming monitors that suck.

I'm buying a new gaming PC this summer and I am honestly at a loss over which gaming monitor to buy - FreeSync monitors mostly suck (+ there are no high-end AMD GPUs available, at least not at a reasonable price), and G-Sync ones have been way overpriced ever since appearing on the market.


----------



## animeowns

Malinkadink said:


> 2,765.53 in USD, do people even bother to google anything anymore?
> 
> Anyways good luck with paying $2500 for this thing, absolute rip off, can't wait to hear the horror stories and just overall the complaints that will surely follow these monitors same as the 1440p 144hz monitors before them.


Well, for my needs it won't be a rip-off. I remember buying the Dell 8K at $5000; now that was a rip-off, but it was all in the name of gaming, and now I know what kind of GPUs we need just to push 8K. 32GB of VRAM on the GV100 looks nice; hope we get that on the gaming cards in July. You can now purchase the Dell 8K for about $2840 at Walmart.


----------



## animeowns

CallsignVega said:


> Did you not notice the free shipping.
> 
> Haha that is what I paid for my new 2018 55" 4K OLED. Crazy price for a 27" 4K IPS monitor!


How are the 2018 OLED models compared to the 2017?


----------



## CallsignVega

Just a quick summary for the 2018 sets:
1) Better motion processing
2) Better upscaling
3) Manual calibration is about the same but now there is a new Autocal feature
4) So far, users are reporting much better panel uniformity and little or no visible banding
5) Brighter panel, about 100-150 nits brighter; HDR noticeably brighter, with improved HDR tone mapping and DV performance (brighter picture)
6) Static logo dimming to try to prevent image retention.


----------



## Malinkadink

CallsignVega said:


> Just a quick summary for the 2018 sets:
> 1) Better motion processing
> 2) Better upscaling
> 3) Manual calibration is about the same but now there is a new Autocal feature
> 4) So far, users are reporting much better panel uniformity and little or no visible banding
> 5) Brighter panel, about 100-150 nits brighter, HDR noticably brighter with improved HDR tone mapping and DV performance (brighter picture)
> 6) Static logo dimming to try to prevent image retention.


All pretty positive changes then; should make 2019 with 4K 120Hz a great buy. I'm perfectly content right now, however, with my 2017 C7; for $1200 new it was an easy decision. Paying double for those 6 things you listed just didn't feel worth it whatsoever. Still need LG to come through with smaller, monitor-friendly sizes tho.


----------



## Kommando Kodiak

nevermind formatting


----------



## animeowns

PG27UQ pre-order page; next delivery date is April 24th.

http://shop.scheuss-partner.ch/products/3174740


----------



## animeowns

Malinkadink said:


> All pretty positive changes then, should make 2019 with 4k 120hz a great buy. I'm perfectly content right now however with my 2017 C7 for $1200 new it was an easy decision. Paying double for those 6 things you listed there just didn't feel worth it whatsoever. Still need LG to come through with smaller monitor friendly sizes tho


Ya, I will be waiting for the 2019 models as well.


----------



## Kommando Kodiak

http://www.game-debate.com/news/248...hdr-monitors-from-acer-and-asus-start-at-3000 - news orgs finally picking up on it.


----------



## animeowns

Kommando Kodiak said:


> http://www.game-debate.com/news/248...hdr-monitors-from-acer-and-asus-start-at-3000 news orgs picking up on it finally


Oh, if that's the case and it will be a few more days of waiting stateside, I can do that. I don't mind paying the $2000. I just know, after spending $5000 on the Dell 8K last year, I look forward to gaming at 4K 144Hz, since it's actually something within reach atm, unlike 8K unless you buy the Quadro 32GB-VRAM video cards.

In any case, it should be in both markets soon, as the panel being used by Asus and Acer is now in production for 4K 144Hz:

http://www.panelook.com/M270QAN02.0_AUO_27.0_LCM_overview_28341.html


----------



## Glerox

I've written to some European stores about the PG27UQ, and no one is shipping to America.

It drives me crazy that you can't pre-order it in North America... I've been waiting over a year.


----------



## Kommando Kodiak

It won't be much longer now, Glerox; I know your frustration.


----------



## sblantipodi

Prices are completely mad.


----------



## animeowns

sblantipodi said:


> prices are completely mad


What are the US market prices?


----------



## boredgunner

^ Looks like we don't know yet. I'm not expecting $3,000.


----------



## Caleer

Kommando Kodiak said:


> it wont be much longer now glerox, I know your frustration


I too am checking daily in hopes of something; been waiting since CES 2017 to upgrade my old 1080p monitors that are in need of retiring.


----------



## animeowns

Caleer said:


> I too am checking daily in hopes of something, been waiting since the CES of 2017 to upgrade my old 1080 monitors that are in need of retiring.


I just sold off my 240Hz 1080p monitor in preparation for this 4K 144Hz panel; it should be a nice experience. If it will cost $2500, I am ready and willing to pay that price, and will get an extended warranty on it, as I will be gaming on it for 3+ years for sure, maybe even longer. I won't be going above 4K until we get 5K and 8K panels that can go higher than a 60Hz refresh rate, and GPUs to push them.


----------



## Jbravo33

I'm ready! Volta is ready!


----------



## animeowns

Jbravo33 said:


> im ready! volta is ready!


Asus updated the PG27UQ page on the main website, so I have a feeling we will be seeing this display very soon.


----------



## Jbravo33

animeowns said:


> asus updated the page on the main website so I have a feeling we will be seeing this display very soon PG27UQ


sweet! i hope they both drop cuz i really like the acer version better but at this point i'll take whatever.


----------



## bee144

animeowns said:


> Jbravo33 said:
> 
> 
> 
> im ready! volta is ready!
> 
> 
> 
> > Asus updated the PG27UQ page on the main website, so I have a feeling we will be seeing this display very soon.
Click to expand...

That’s been updated for at least two months now. ASUS rep on their forum said firmware is still being developed and documentation is being written.


----------



## l88bastar

Getting my body ready!


----------



## Leopardi

2599€ in Finland.

2599€ for IPS glowing piece of garbage without even an A-TW polarizer. Probably terrible haloing effects in FALD mode as well. Wish they just made an OLED at this price.


----------



## ToTheSun!

Leopardi said:


> 2599€ in Finland.
> 
> 2599€ for IPS glowing piece of garbage without even an A-TW polarizer. Probably terrible haloing effects in FALD mode as well. Wish they just made an OLED at this price.


Even a 120 Hz version of that 22" OLED panel would be better.


----------



## animeowns

bee144 said:


> That’s been updated for at least two months now. ASUS rep on their forum said firmware is still being developed and documentation is being written.


Well, the panels in Europe have a ship-out date listed in 3 weeks, so we are looking at an early-to-mid May release date, I think.


----------



## animeowns

Leopardi said:


> 2599€ in Finland.
> 
> 2599€ for IPS glowing piece of garbage without even an A-TW polarizer. Probably terrible haloing effects in FALD mode as well. Wish they just made an OLED at this price.


Is it in stock, ready to ship there, or is that a pre-order?


----------



## Leopardi

animeowns said:


> Is it in stock, ready to ship there, or is that a pre-order?


One shop says 19 units are coming May 7th.


----------



## animeowns

Leopardi said:


> One shop says 19 units are coming May 7th.


What website says that? I am currently trying to get a European site to ship a model internationally to the US.


----------



## Leopardi

animeowns said:


> What website says that? I am currently trying to get a European site to ship a model internationally to the US.


komplett


----------



## Profiled

Not worth buying from the EU. The dollar-to-EUR rate is 1.2.


----------



## Sancus

Lol, are people really going to import this thing from the EU and pay hundreds of dollars in shipping, just to be the first to discover that the haloing reported by several YouTubers and display reviewers at CES is definitely still there, and that their panel has terrible IPS glow because AUO did no QC as usual? Good luck with returns.


----------



## toncij

Might be a problem, yes. And with that price you're very close to Dell's 8K...


----------



## ToTheSun!

toncij said:


> Might be a problem, yes. And with that price you're very close to Dell's 8K...


I mean, they're not comparable monitors, but, if you're going that route, one can even go as far as saying quasi-glossy 4K OLED is CHEAPER.


----------



## animeowns

toncij said:


> Might be a problem, yes. And with that price you're very close to Dell's 8K...


But you will be able to game on this 4K 144Hz panel, whereas with that Dell 8K you are very unlikely to run games at decent settings with the current hardware out now. I plan on keeping the 4K 144Hz panel for at least 3 or more years, until we have 5K panels at 100Hz.


----------



## NewType88

Hey, if you've got the means, then by all means! Sounds like a good reason for ASUS USA not to honor a warranty on a euro model, though.
I expect to see an in-depth review with this sort of commitment! ;p


----------



## LunaTiC123

Can't wait to see the posts about these monitors with all the issues, people thinking it's gonna be better this time... LOL


----------



## Sancus

I feel pretty strongly that anyone pre-ordering this thing for >$3000 before there is an announced US price and before there is a review from prad.de, tftcentral, or similarly critical monitor reviewer is crazy. And I am absolutely a person who has spent >$3000 on displays in the past, and will again. When you are spending that much you want to be sure you are getting what you expect.

Of course, it's your money.


----------



## Malinkadink

I'd be more interested in a 24-inch 1440p 240Hz monitor for <$1,000. Oh, and I'd want it to be at least IPS, but I don't think anything other than TN can handle 240Hz properly besides OLED, in which case: a 1440p 24-inch 240Hz OLED. Make it happen!


----------



## Kommando Kodiak

how far away are we from 4k 144hz oled gsync monitors?


----------



## Scotty99

I honestly couldn't give a flying you know what about 4K lol; OLED 24" 165+ Hz at 1440p is what needs to happen.


----------



## l88bastar

Scotty99 said:


> I honestly couldn't give a flying you know what about 4K lol; OLED 24" 165+ Hz at 1440p is what needs to happen.


A 32" 4k 120hz OLED is my Holy Grail display....

However, I would be all over a 1440p 165hz OLED if they were available!


----------



## Aristotelian

I'll reserve my "is it worth it?" judgment until I see a detailed review. Honestly now, some people have been saying for over a year that "this monitor isn't worth it unless it's USD 999 because it's [too small] / [not OLED] / [bad tech] / [necessarily subject to QC issues]", but from a competitive position in the market that sort of pricing desire is just delusional wishful thinking.

I was expecting it to debut at around USD 1999 / EUR 1999 (not fair from an exchange-rate perspective, but it never is), and these 'pre-order' links in the EU that people are finding don't appear very legitimate to me. I have tried to access 4 of them and am told that the links no longer work; others have it over EUR 3000 now, which makes those pre-order links look like gambles - you'll end up buying from a smaller retailer that bought it on release day for much less and re-sold it at a significant markup. 

For people in the USA - if you think you're in a bad situation, imagine you imported a Razer Blade in 2014 for over USD 2400, and it was plagued with problems from the get-go. One of the power bricks on mine died and it cost me over USD 250 to replace (importing one from the USA), and by the time Razer opened a store in Europe, my laptop was out of warranty. Turns out I was right about the 'plagued with problems' (which the EU re-seller on Amazon.co.uk stalled about), and Razer wanted about EUR 1300 from me to fully fix (panel, motherboard, battery) the issues my laptop had from day 1. I didn't opt for that, so I essentially bought a paperweight. The tl;dr is that if you're into high-end tech, it can be a terrible experience on either side of the pond.

I won't be buying this monitor if there are significant QC issues - I'm not playing the panel lottery any more. And my new PC build has been waiting years for a high-refresh-rate, HDR, decently-high-resolution solution, at 27". Here's hoping.


----------



## Scotty99

I just don't get how they think they can charge so much for these; I would rather buy a huge desk and put a 55" OLED on it for less money lol.


----------



## toncij

Yes, it's completely unrelated. Anyway, it's a bit on the expensive side for only 4K; at 32" with integer scaling, that's 1920x1080 of usable real estate, which is garbage unless it's only for gaming. I was planning to get 3 of these, but the price might come in as a bit of a bummer. 
144Hz, but I might rather buy 3x 22" OLEDs... 60Hz, but OLED.


----------



## Sancus

Keep in mind that the PQ22UC (the 22" OLED display) is an Asus product as well, using the first commercial panel from an untested manufacturer; in all likelihood it's vaporware, or won't be seen until 2019 or later. I'd love to see it this year, but this industry's track record gets worse every year.


----------



## animeowns

Scotty99 said:


> I just don't get how they think they can charge so much for these; I would rather buy a huge desk and put a 55" OLED on it for less money lol.


That's actually what I'm doing right now: using an OLED as my main display. The ROG Swift 4K 144Hz will be my backup monitor; I have enough room to use both side by side.


----------



## toncij

Since I don't only game, I value clarity too, so I'm using 2x 5K Dells and will move to 8Ks in the future. But 144Hz 4K is a nice combo; three of those would make up for not having the height in pixels when scaled. Still, the price... 
I will wait for a review by TFT Central and see. If it's really the best thing ever, I might jump on it... blind? No.


----------



## Leopardi

Scotty99 said:


> I honestly couldn't give a flying you know what about 4K lol; OLED 24" 165+ Hz at 1440p is what needs to happen.


They can easily bump it up to even 480Hz if they just have the will to make a gaming monitor. The response times allow it, at only 0.01ms; no need to even tinker with overdrive, as it's so blazing fast.


----------



## toncij

Leopardi said:


> They can easily bump it up to even 480Hz if they just have the will to make a gaming monitor. The response times allow it, at only 0.01ms; no need to even tinker with overdrive, as it's so blazing fast.


The problem is pricing and eating into much cheaper-to-produce (profitable) LCDs. OLED will take decades to come close at this pace.


----------



## animeowns

toncij said:


> Since I don't only game, I value clarity too so using 2x 5K Dells and will move to 8Ks in the future, but 144Hz 4K was a nice combo; 3 of those would add up for not having the height in pixels when scaled; still, the price...
> Will wait for a review by tftcentral and see. If it's really the best thing ever, might jump on it... blind? no.


If you want to move to the 8Ks now, they are cheaper than the $5,000 per unit I had to pay last year; I have seen it as low as $2,900 new.


----------



## Leopardi

toncij said:


> The problem is pricing and eating into much cheaper-to-produce (profitable) LCDs. OLED will take decades to come close at this pace.


One of the arguments for OLED is that it'll be much cheaper to produce than LCD though. We'll see how the new massive LG factory opening up this summer will further drop OLED prices.


----------



## toncij

animeowns said:


> If you want to move to the 8Ks now, they are cheaper than the $5,000 per unit I had to pay last year; I have seen it as low as $2,900 new.


Here in Europe it's about €3,500, which is closer to $4,400. And since I need Windows too, I can't move yet; it's not as good as expected (scaling-wise) and not enough of a difference to move up from 5K.



Leopardi said:


> One of the arguments for OLED is that it'll be much cheaper to produce than LCD though. We'll see how the new massive LG factory opening up this summer will further drop OLED prices.


Let's hope so.

Meanwhile, AUO might surprise us with a fantastic display!


----------



## ryan92084

Kommando Kodiak said:


> how far away are we from 4k 144hz oled gsync monitors?


Current-year TV models already do 4K 120Hz, but only via in-TV apps, since HDMI 2.0 can't handle it. Maybe next year we'll get 2.1 for 120Hz inputs and VRR. As for monitors, who knows; LG has mentioned 40-inch panels several times but never moved on it.


----------



## animeowns

If I were you, I would ditch that Dell 5K and get the 5K ultrawide by LG, the Nano IPS 5120x2160 34-inch: https://www.bhphotovideo.com/c/product/1382968-REG/lg_34wk95u_34_nano_ips.html. If it had a low response time and G-Sync I would pick it up in a heartbeat, but I'm more focused on these 4K 144Hz G-Sync displays at the moment. The Dell 5K and 8K are both very slow in mouse input response for gaming, but if you are playing single-player games it won't matter.


----------



## Sancus

PG27UQ FALD does not operate in SDR mode, so it only does something in the ~20-30 HDR games that exist. In the rest, it will have the same contrast as any random IPS monitor.


----------



## Scotty99

Sancus said:


> PG27UQ FALD does not operate in SDR mode, so it only does something in the ~20-30 HDR games that exist. In the rest, it will have the same contrast as any random IPS monitor.


There is zero chance that isn't a bug; FALD does not require HDR metadata.


----------



## Sancus

Scotty99 said:


> There is zero chance that isn't a bug; FALD does not require HDR metadata.


Uh if it was a bug an Asus rep wouldn't be explicitly saying that's how it functions. The Dell UP2718Q doesn't operate its FALD in SDR mode either. The only FALD HDR monitor that even has an option for this so far is the ProArt PA32UC.


----------



## Scotty99

That's ludicrous if true; FALD is one of the main selling points of these things lol. 

I just don't understand, I guess; TVs with FALD don't require HDR for FALD to work.


----------



## MCridercho

The guy from the Asus forum also said that there is a global backlight dimming option available in the OSD. So it does work, or no...? First he says it doesn't work, then he says it kinda works? "Global backlight dimming option" sounds like a FALD on/off option, like what the ProArt PA32UC has.


----------



## ToTheSun!

MCridercho said:


> The guy from the Asus forum also said that there is a global backlight dimming option available in the OSD. So it does work, or no...? First he says it doesn't work, then he says it kinda works? "Global backlight dimming option" sounds like a FALD on/off option, like what the ProArt PA32UC has.


"Global" might imply it's not granular.


----------



## boredgunner

Scotty99 said:


> That's ludicrous if true; FALD is one of the main selling points of these things lol.
> 
> I just don't understand, I guess; TVs with FALD don't require HDR for FALD to work.


Monitors so often have senseless design choices, to a much greater extent than TVs. I wish I knew why.


----------



## Glerox

The monitor was presented in Beijing today:
http://www.pcpop.com/article/4525092.shtml

20,000 Chinese yuan ≈ 3,200 USD.

Unfortunately, the monitor is ugly as hell and designed for teenagers, but no teenager can afford it lol.

The Acer has a cleaner look, but it seems it will release after the Asus.


----------



## CallsignVega

So their slide in that article lists the PA32UC at 21,999 yuan, which is like $3,500 after currency conversion, but it sells in the US for $2,000. 

I think this monitor will launch at $1,899 USD.

I'm surprised they are advertising the PA32UC in China; they pulled it from the US market due to a defect.


----------



## animeowns

Glerox said:


> The monitor was presented in Beijing today:
> http://www.pcpop.com/article/4525092.shtml
> 
> 20,000 Chinese yuan ≈ 3,200 USD.
> 
> Unfortunately, the monitor is ugly as hell and designed for teenagers, but no teenager can afford it lol.
> 
> The Acer has a cleaner look, but it seems it will release after the Asus.


I'm buying the PG27UQ display as soon as it hits the US market, even if it's $3,000.


----------



## Scotty99

I was just looking at 4K benchmarks, and you are going to need SLI Titan Vs if you want to max settings in newer titles; even then there's no guarantee you're hitting 144 fps lol.

This monitor is releasing about two years earlier than it should be.

Edit: it's even worse than that; Titan Vs apparently do not support SLI.


----------



## animeowns

Scotty99 said:


> I was just looking at 4K benchmarks, and you are going to need SLI Titan Vs if you want to max settings in newer titles; even then there's no guarantee you're hitting 144 fps lol.
> 
> This monitor is releasing about two years earlier than it should be.
> 
> Edit: it's even worse than that; Titan Vs apparently do not support SLI.


New cards are coming this year; we will have at least one more Titan before the year ends: Titan T for Turing, which will be the 16GB GDDR6 card. The 1180 coming out will only be 8GB.


----------



## Scotty99

That is true, I guess, but I don't think these new cards from Nvidia are going to be the big jump that Pascal was over Maxwell.

Like, do people really think the 1170 is going to match a 1080 Ti? I can't see that happening again, not for a while.


----------



## kot0005

Glerox said:


> The monitor was presented in Beijing today:
> http://www.pcpop.com/article/4525092.shtml
> 
> 20,000 Chinese yuan ≈ 3,200 USD.
> 
> Unfortunately, the monitor is ugly as hell and designed for teenagers, but no teenager can afford it lol.
> 
> The Acer has a cleaner look, but it seems it will release after the Asus.


I think I will be buying the Philips Momentum 4K HDR1000. It's only $975. I will just use my PG279Q for G-Sync. Hopefully Samsung or LG release better G-Sync HDR monitors under $2K.

I bet they will release revisions of this monitor in Q4 2019 with thinner bezels, better FALD, more than that crap 90% DCI-P3, and 32-inch models at a cheaper price point.


----------



## toncij

CallsignVega said:


> So their slide in that article lists the PA32UC at 21,999 yuan, which is like $3,500 after currency conversion, but it sells in the US for $2,000.
> 
> I think this monitor will launch at $1,899 USD.
> 
> I'm surprised they are advertising the PA32UC in China; they pulled it from the US market due to a defect.


What defect? :O


----------



## kot0005

toncij said:


> What defect? :O


Read the reviews on their site.


----------



## rvectors

toncij said:


> What defect? :O



Here's just the first user review i found


- The 384-zone FALD does not work. Dynamic dimming has almost a full second of lag before a bright spot dissipates. If you move your mouse around on a dark background, the monitor displays a medium-bright comet of lit zone cells behind the movement.

- Bright objects have bleeding auras around them.

- Brightness - this has to be the lowest-brightness monitor I've seen at 3840x2160.
It's dim, with contrast corrected and brightness set at 100.

- Text - is a mess due to some strange contrast ratios that can't be resolved unless contrast is set to zero and brightness at max; then the monitor is too dark to use, but the contrast and fonts made more sense.

- Color accuracy - easy, there isn't any. Worst I've seen, not counting some tube monitor from the mid '90s.

- Coil whine from the built-in power supply - aplenty. A high-pitched whistle.

- Dim corners.
The monitor looks like a vignette setting in a video game.

- Strobing from the LED - the setting gives you choices of 24-900+, but it only does 120, and the vibration is quite noticeable.

- HDR - HDR1 did seem to make some elements eye-wateringly bright while obscuring any and all surrounding detail. HDR2 was overbright, with even more blackout of surrounding detail. The monitor un-sold any curiosity about HDR.



====================================

I don't have a lot of faith that the 144Hz model will do any better.



I haven't gone through every page, so I'm sure it's been discussed, but including all the above FALD-related issues (brightness, halo, etc.), the following seems likely:


- FALD is disabled for SDR content
- Full chroma is only supported at 98Hz or thereabouts, otherwise half
- It's not a true 10-bit panel but 8-bit + FRC


Is that what other people are expecting, apart from the usual junk QA?


----------



## toncij

That definitely looks like either the same problem waiting for the PG27UQ, or a thing they've been fixing for a full year of delay now.


----------



## CallsignVega

Ya, and I'm pretty sure the 32" FALD panel in the PA32UC is the one they are going to use later this year for the 32" G-Sync version. Basically, these FALD monitors are a hot mess so far.


----------



## Sinddk

Can anyone explain to a moron like me how they cannot seem to make FALD work on monitors, when it has worked great on TVs for what seems like quite a long time?


----------



## Scotty99

Sinddk said:


> Can anyone explain to a moron like me how they cannot seem to make FALD work on monitors, when it has worked great on TVs for what seems like quite a long time?


I actually have no idea, and I've been researching FALD since 2016, when the Vizio P-series brought FALD to the mainstream. There are no technical limitations that I know of as to why these monitors cannot have active FALD in SDR mode; the only thing I can think of is that their FALD implementation is so bad that the only way they can get a good picture out of it is with an HDR source.


----------



## Sancus

Sinddk said:


> Can anyone explain to a moron like me how they cannot seem to make FALD work on monitors, when it has worked great on TVs for what seems like quite a long time?


The misconception is that FALD is a solved problem on TVs. It's not, and even the best implementations are incredibly poor compared to emissive displays. The best, highest-density production FALD to date, the Sony Z9D, had ~650 zones on a 65" screen. That gives you a zone size of around 2.77 square inches. Also, I've never seen a TV FALD with any better latency than the Asus ProArt PA32UC or the Dell UP2718Q, so we are talking 500+ milliseconds for the FALD to change brightness.

AUO/Nvidia set out to make a display that reaches much, much higher standards than any TV. At 384 zones on 27", the zone size is about 0.81 square inches; that is roughly 3.5x denser than the densest TV made to date. Not to mention, they targeted 1000 nits to produce true HDR. Only the best FALD TVs meet or exceed 1000 nits; the cheap Vizio P-series struggles to reach 500 nits, much dimmer than even 2016 OLEDs, and not acceptable for good-quality HDR. They also wanted to do it with much lower latency: IIRC they were targeting <50ms, but in any case, much lower than the 500ms-1000ms typical of FALDs.

Getting your FALD to respond quickly, over a wide range of brightness, while avoiding leakage between zones, at an unprecedented density for a local-dimming panel, turned out to be both challenging and very expensive. And we still don't know if they achieved it. If they did, even at ~$3,000, it's still a pretty big step forward for LCDs.

Basically, it's one thing to make a FALD that looks decent enough with TV shows and movies, but if you want one that will actually follow an FPS gamer's viewport and mouse movements on PC, and still meet the highest HDR standards, it's quite a difficult thing to build. TVs do not even come close, and Nvidia is perfectly aware of that; that's why they set their specs to meet what PC gamers would demand. Unfortunately, that turned out to be harder to achieve than they expected.

All these factors are why I personally don't believe backlit displays will ever be competitive with emissive displays for applications that demand fast response times.
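For anyone who wants to sanity-check those zone numbers, here's a quick back-of-the-envelope script (a sketch only: it assumes ideal 16:9 panels, ignores bezels, and uses the marketed diagonal sizes):

```python
def zone_area_sq_in(diag_in, zones, aspect=(16, 9)):
    """Average backlight-zone footprint for a FALD panel.

    For a 16:9 screen with diagonal d, the screen area is
    d^2 * (16*9) / (16^2 + 9^2).
    """
    w, h = aspect
    screen_area = diag_in ** 2 * (w * h) / (w ** 2 + h ** 2)
    return screen_area / zones

# Sony Z9D: ~650 zones on a 65" screen
print(round(zone_area_sq_in(65, 650), 2))   # ~2.78 sq in per zone

# PG27UQ: 384 zones on a 27" screen
print(round(zone_area_sq_in(27, 384), 2))   # ~0.81 sq in per zone

# Density ratio between the two
print(round(zone_area_sq_in(65, 650) / zone_area_sq_in(27, 384), 1))  # ~3.4x
```

So the "roughly 3.5x denser" claim checks out to within rounding.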


----------



## Scotty99

I'd never heard of FALD latency until I started posting in this thread; he is only guessing that that's the reason FALD is hard to do on PC monitors. Not only have I never heard of it, I've never heard anyone who uses a FALD TV as a gaming display complain that the FALD can't keep up. Most people in this thread incorrectly compare FALD to OLED; FALD isn't meant to compete with OLED, and the prices should already clue you into this ($1,200 for a 65" Vizio P-series; LG's 65" OLED is $2,600).

Side note: the 2018 Vizio P-series hits over 1,000 nits (a good ~200 nits above what the LG B7 can currently muster):
https://finance.yahoo.com/news/vizios-2018-p-series-4k-120500331.html


----------



## Sancus

Scotty99 said:


> I'd never heard of FALD latency until I started posting in this thread; he is only guessing that that's the reason FALD is hard to do on PC


Yes, we know you're an LCD apologist. I am deeply sorry I took you off ignore, back you go. You may not think latency matters and it may not be noticeable to you, fortunately Nvidia knows better than you.


----------



## Scotty99

I don't even game on my TV; I am merely making the point that I've never heard of FALD latency, nor ever heard people who do game on their TVs complain about it. I actually think you are making terms up at this point. Everyone, do a Google search for "FALD latency": the only things you will find are articles that mention a FALD set's response time in passing.

OLED is clearly the superior tech; it's not my fault you don't understand FALD's place in the market or why it exists.


----------



## rvectors

Well, this is AUO we're talking about. If they weren't following today's mass-production values (profit over quality, with a disregard for consumers), they would find a way to implement these technologies well. There's no use debating or detailing the reasons, as most of us here know what has been happening to the monitor market, and it's already been commented on many times.


----------



## MCridercho

Samsung Q9FN has the best implementation of FALD on a TV so far. Maybe Samsung should jump on the wagon and show Asus & Acer how it's done.


----------



## Sancus

The Asus rep recanted about the FALD being disabled in SDR mode, at least; apparently he got that wrong. So at least your $3,000 monitor WILL use its most important feature most of the time... we still need a proper publication to review this thing and see if they did manage to resolve all the issues.


----------



## CallsignVega

It will be interesting to see how the reduced chroma at the higher refresh rates affects image quality.


----------



## Scotty99

Like I said, I knew it couldn't be true.


----------



## ToTheSun!

CallsignVega said:


> It will be interesting to see how the reduced chroma at the higher refresh rates affects image quality.


Considering that most people are perfectly fine with 1440p, some loss of pixel differentiation at 4K will probably go unnoticed to most.


----------



## bee144

ToTheSun! said:


> CallsignVega said:
> 
> 
> 
> It will be interesting to see how the reduced chroma at the higher refresh rates affects image quality.
> 
> 
> 
> Considering that most people are perfectly fine with 1440p, some loss of pixel differentiation at 4K will probably go unnoticed to most.
Click to expand...

I’d rather play at 98 Hz 4:4:4 than 144 Hz, because my GPUs will probably just barely be able to hit 100 Hz. Plus, the difference between 60->100 Hz is more noticeable than 100->144 Hz.
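For context, that 98 Hz cutoff is just DisplayPort bandwidth arithmetic. A rough sketch (it ignores blanking overhead, so the real cutoff lands right around these numbers rather than exactly on them):

```python
# DisplayPort 1.4 (HBR3): 32.4 Gbit/s raw; 8b/10b encoding leaves 25.92 Gbit/s payload
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10

def video_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Uncompressed video bandwidth in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_channel * channels / 1e9

# 4K, 10-bit RGB (full 4:4:4 chroma)
print(round(video_gbps(3840, 2160, 144), 1))  # ~35.8 Gbit/s -> over the link limit
print(round(video_gbps(3840, 2160, 98), 1))   # ~24.4 Gbit/s -> fits in 25.92
```

Above roughly 98 Hz, the signal has to drop to 4:2:2 chroma subsampling (20 bits per pixel instead of 30) to stay under the link limit.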


----------



## Swaggerfeld

bee144 said:


> I’d rather play at 98 Hz 4:4:4 than 144 Hz, because my GPUs will probably just barely be able to hit 100 Hz. Plus, the difference between 60->100 Hz is more noticeable than 100->144 Hz.


For me, it's about strobing and motion-blur reduction... 98 Hz is technically below the threshold to support strobing (though I believe I've read previously that 100 Hz would support strobing?), albeit very close. Here's hoping this can run 4:4:4 @ 98 Hz with motion-blur reduction.


----------



## rvectors

If you read that rog.asus link, apparently the criticism of the PA32UC is unjustified. He was responding to a link to my post here, and probably generally to this thread, but did he actually read the posts? 

The PA32UC is meant to be a professional display. I would imagine COLOUR ACCURACY, PANEL UNIFORMITY, CLARITY, BRIGHTNESS, etc., i.e. the items highlighted, would be very important... but apparently we, the fickle, demanding consumers, are all wrong on that count.


GIGO -> garbage in, garbage out


----------



## Sancus

rvectors said:


> If you read that rog.asus link, apparently the criticism for the PA32UC, is unjustified. He was responding to a link of my post and also generally but did he actually read the posts?
> 
> The PA32UC is meant to be a professional display, I would imagine COLOUR ACCURACY, PANEL UNIFORMITY, CLARITY, BRIGHTNESS etc, i.e those items highlighted, would be very important... but apparently we the fickle demanding consumer, are all wrong on that count.


I mean I'm not defending the PA32UC, it definitely has issues. I don't think FALD lag is really one of them. On a professional monitor, that's not very important. And it's obviously something that's very difficult to eliminate, given it hasn't been eliminated by any TV or monitor to date that we know of. The color accuracy and other issues are definitely relevant.

But you[the OP of your linked post sorry, I don't mean you specifically] are talking to an Asus rep. Of course if you link or mention poor reviews with a bunch of negative bullet points he's going to say it's unjustified and defend his company's product. It's really best not to get into some kind of quality argument, just ask direct questions about 1 or 2 specific issues or features and hope you get a response, then move on.

A real critical analysis of the display requires a proper, third party review anyway, you probably aren't gonna have any luck trying to interrogate an Asus rep on potential quality issues for a product that is barely released in limited regions. Any negative reviews he confirms could potentially get him in trouble.


----------



## kot0005

I don't get how people say "FALD is not competing against OLED." What else is FALD for? Just to give better HDR? What else currently gives you better HDR than OLED? So FALD is indeed competing indirectly, by offering a cheaper alternative that mimics OLED contrast levels with FALD on. No?

FALD is also one of the key selling points of the new ProArt, so it doesn't matter that it's a professional monitor. I wouldn't buy it if it isn't going to work right.


----------



## NewType88

Man, it seems like from all the skepticism these are going to be a bust. Maybe they will get a nice price cut quickly thereafter....silver lining perhaps?


----------



## rvectors

Sancus said:


> I mean I'm not defending the PA32UC, it definitely has issues. I don't think FALD lag is really one of them. On a professional monitor, that's not very important. And it's obviously something that's very difficult to eliminate, given it hasn't been eliminated by any TV or monitor to date that we know of. The color accuracy and other issues are definitely relevant.
> 
> But you[the OP of your linked post sorry, I don't mean you specifically] are talking to an Asus rep. Of course if you link or mention poor reviews with a bunch of negative bullet points he's going to say it's unjustified and defend his company's product. It's really best not to get into some kind of quality argument, just ask direct questions about 1 or 2 specific issues or features and hope you get a response, then move on.
> 
> A real critical analysis of the display requires a proper, third party review anyway, you probably aren't gonna have any luck trying to interrogate an Asus rep on potential quality issues for a product that is barely released in limited regions. Any negative reviews he confirms could potentially get him in trouble.




I don't disagree, but I meant the post on here that was linked into the discussion on the other thread. I wasn't actually engaging with the rep on the other site, just pointing out the numbskullery of an ASUS rep dismissing, as unjustified, faults in features specifically sought after in a Pro monitor.


----------



## Sancus

Ya, sorry, there was some confusion on my part. But I also agree that Asus' communication about issues has been poor. AFAIK we still have no idea why they seem to have pulled the ProArt from the US market, whether they plan to fix any issues, or what is going on. They are still showing it off at shows, along with the PQ22UC. Honestly, I was surprised to find someone from Asus answering ANY questions about the PG27UQ directly on their forums...


----------



## animeowns

The Asus PG27UQ pre-order is up on a French website with a ship date of May 7th, 2018:

https://www.materiel.net/ecran-lcd/asus-rog-swift-pg27uq-152656.html


----------



## sblantipodi

animeowns said:


> The Asus PG27UQ pre-order is up on a French website with a ship date of May 7th, 2018:
> 
> https://www.materiel.net/ecran-lcd/asus-rog-swift-pg27uq-152656.html


2500€? I'm not an idiot.


----------



## animeowns

sblantipodi said:


> 2500€? I'm not an idiot.


Ya, I posted that for people in the European region; I am waiting for the US market release.


----------



## sblantipodi

animeowns said:


> Ya I posted that for people in the European region I am waiting for the US market release


I'm from Italy and it's the same. I think that 2500€ or 2500USD is an idiot price.


----------



## animeowns

sblantipodi said:


> I'm from Italy and it's the same. I think that 2500€ or 2500USD is an idiot price.


Oh, so it will be below $3000 for the US market? I'm ready to pay.


----------



## sblantipodi

animeowns said:


> oh so it will be below $3000 for US market I'm ready to pay


people like you helped Apple to be what it is right now


----------



## animeowns

sblantipodi said:


> people like you helped Apple to be what it is right now


If I bought an iPhone X at that price point of $999 I must be high lol. If they sold it at $300 I'd buy one.

Hey, I bought the Dell 8K monitor last year. That was a mistake at $5,000, but it was all in the name of science; someone has to push this tech forward, and it's us early adopters. It's worth about $3,500 right now, but I was able to sell it for $5k after I found out how hard it was to run 8K. I did have some games running at over 60 fps in 8K, but to achieve that you would need three 1080 Tis or Titan Xps, so hopefully with something like 16GB GDDR6 cards this year we should be able to push 4K 144Hz without much trouble. The way I look at it, stay on 4K 144Hz until we get 5K 100Hz or 8K 100Hz, and that will be a while, so I won't be upgrading anytime soon from this display.


----------



## CallsignVega

You sold a used Dell 8K for $5K? Man where are those type of buyers when I'm selling stuff lol.

I never messed around with the 8K, as I knew 275 ppi was totally ridiculous for a gaming monitor. Heck, even the 163 ppi of this Asus is a bit much for gaming. The sweet spot is still 32" 4K.


----------



## l88bastar

animeowns said:


> I bought the dell 8k monitor last year but I was able to sale it for $5k


----------



## NewType88

sblantipodi said:


> I'm from Italy and it's the same. I think that 2500€ or 2500USD is an idiot price.


Do all euro prices online include VAT?


----------



## animeowns

CallsignVega said:


> You sold a used Dell 8K for $5K? Man where are those type of buyers when I'm selling stuff lol.
> 
> I never messed around with the 8K, as I knew 275 ppi was totally ridiculous for a gaming monitor. Heck, even the 163 ppi of this Asus is a bit much for gaming. The sweet spot is still 32" 4K.


Yeah, I sold it last year to a buyer in Sweden. He actually still has it, but recently asked me if I wanted it back, lol, at a lower price. He's using an ultrawide Alienware now.


----------



## kot0005

Well, I am convinced that Nvidia and Asus/Acer/AOC are pulling a Titan with these displays.

Just wait for 2019 and get OLEDs, or two more years for real HDR 144Hz monitors. By that time there will be plenty of games/GPUs.


----------



## sblantipodi

kot0005 said:


> well i am convinmced that Nvidia and asus/acer/aoc are pulling a Titan with these displays.
> 
> Just wait for 2019 and get OLED's or 2 more years for real HDR 144hz monitors. By that time there will be plenty of games/GPU's.


No sane user wants OLED for PC use due to burn-in problems.
A 60Hz 4K HDR G-SYNC monitor without FALD, but with good brightness, at nearly 1000 USD could be enough for now.


----------



## ryan92084

sblantipodi said:


> no sane user wants OLED for PC use due to burn in problems.
> A 60Hz, 4K, HDR, GSYNC without FALD and a good brightness at nearly 1000 USD could be enough for now.


At least now I know I'm insane and no longer have to question it.


----------



## sblantipodi

ryan92084 said:


> At least now I know I'm insane and no longer have to question it.


Probably you are just ignorant and don't know what burn-in is.


----------



## ryan92084

sblantipodi said:


> probably you are just ignorant and don't know what burn in is


Nope, has to be the insanity.


----------



## sblantipodi

ryan92084 said:


> Nope, has to be the insanity.


OK, I won't argue if you're sure about it.


----------



## toncij

I was fiddling with the idea of buying those 8Ks now, but since I need 2 monitors at minimum, that's $7k, plus 2 high-end cards you can't SLI (SLI can't work with monitors on both cards). I simply love my 5Ks (can't find anywhere to buy a 3rd UP2715K). Still, [email protected] could be usable on 32"... on 27" it's not dense enough nor large enough... it's some kind of a cursed middle ground.


----------



## Sancus

sblantipodi said:


> probably you are just ignorant and don't know what burn in is


I think people post this just to be contrary. We don't really need another argument about this and yet more posts from Vega and others who use OLED daily for PC use without any problems.


----------



## kot0005

IT'S NOT 4K HDR 144Hz!!! It's 4K HDR 98Hz or 95Hz, because you don't get a 98Hz option in the NV control panel.

Also, can't Microsoft just implement a software fix in Windows for OLEDs, like Samsung does? Move the whole screen by a pixel left/right every few seconds.


----------



## Exilon

I'm reading the TFTCentral UP2718Q review and saw that its 384-zone FALD array had a rise time of 600ms and a fall time of 200ms. 

Nice... 

I'll go warm up the popcorn for this monitor. When is it coming out?


----------



## ToTheSun!

kot0005 said:


> ITS NOT 4K HDR 144hz!!! ITS 4k HDR 98Hz or 95Hz because you dont get a 98Hz option in NV control panel.


You can create a custom resolution at 98 Hz in the NVCP.


----------



## ryan92084

kot0005 said:


> ITS NOT 4K HDR 144hz!!! ITS 4k HDR 98Hz or 95Hz because you dont get a 98Hz option in NV control panel.
> 
> Also cant Microsoft just implement a software fix using windows for OLED's ? just like Samsung does. Move the whole screen by a pixel to left/right every few seconds.


I think you mean LG, which uses pixel shift, a refresher program when off, aging compensation (not the right name), the new logo dimming, etc. I don't know whether those would have to be OS-level or panel-level implementations to work, but until we actually see OLED monitors generally available I doubt MS would bother.
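As a toy illustration of the pixel-shift/orbit idea (the offsets, loop, and timing here are made up for illustration; real TVs do this in firmware, not in the OS), the whole frame is periodically drawn at a slightly different offset so static UI elements don't age the same subpixels:

```python
import itertools

# Toy sketch of the "pixel orbit" burn-in mitigation idea: every few
# seconds the frame is rendered at the next small (dx, dy) offset,
# cycling through a short loop so no subpixel sits under the same
# static content forever. Offsets are illustrative, not LG's actual values.
ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1)]  # small clockwise loop

def orbit_offsets(n):
    """Return the first n (dx, dy) shifts, cycling through the loop."""
    return list(itertools.islice(itertools.cycle(ORBIT), n))

print(orbit_offsets(6))  # [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0), (1, 0)]
```

In practice the shift is small enough (a pixel or two) that it's invisible at normal viewing distances.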


----------



## kot0005

People with Two Titan Xp's and Titan V's defending the pricing of this monitor on the rog forums.


----------



## bee144

kot0005 said:


> People with Two Titan Xp's and Titan V's defending the pricing of this monitor on the rog forums.


I wasn’t defending the price of the monitor. Not sure why you feel the need to blow everything out of proportion.

If 3k offends you, then why continue visiting these threads?


----------



## Sancus

I have to lol @ anyone who is using a Titan V for gaming, it has stutter problems in many games because the drivers aren't tuned for gaming. Worse card than a 1080 TI for practical use. People with more money than brains.


----------



## kot0005

bee144 said:


> I wasn’t defending the price of the monitor. Not sure why you feel the need to blow everything out of proportion.
> 
> If 3k offends you, then why continue visiting these threads?


Because when I bought a PG279Q I was disgusted by how poor the quality was, and I have been waiting over 2 years for this monitor only to see its specs downgraded, multiple issues, and absurd pricing. You have the money, so you are in a different world and everything will sound out of proportion to you. You should just ignore me; this doesn't even concern you. Just move along. This thread isn't restricted to people who can afford to pay 3k.



Sancus said:


> I have to lol @ anyone who is using a Titan V for gaming, it has stutter problems in many games because the drivers aren't tuned for gaming. Worse card than a 1080 TI for practical use. People with more money than brains.


The dude didn't even understand that he can't do FreeSync on this monitor using his Xbox One X.

No offense to anyone, but most rich people don't have time to research; they are always working or busy making money. They don't care about researching stuff because they can just buy anything. I bet the next version will be OLED with 200Hz G-SYNC etc. and will easily cost 10k at the rate these prices are increasing.


----------



## toncij

kot0005 said:


> ITS NOT 4K HDR 144hz!!! ITS 4k HDR 98Hz or 95Hz


This is some new info? What about actual release date?


----------



## ThrashZone

Hi,
144Hz isn't listed as a native resolution option but is listed under PC.
So are 98 and 95Hz not listed under PC either? That would be weird :/


----------



## kot0005

toncij said:


> This is some new info? What about actual release date?





ThrashZone said:


> Hi,
> 144Hz isn't listed as a native resolution option but is under PC
> So is 98 or 95Hz not listed under PC either that would be weird if not :/


98Hz for HDR; I was wrong about 95Hz. Apparently you can select 98Hz in the NV control panel, according to a ROG rep.

It's not new info; read the previous comments.


----------



## ThrashZone

Hi,
I did a little, but I was just trying to narrow down the stinky part of the clash.
It can't be both:
it will either show a higher Hz or not.

If it does show a listing higher than 60/59Hz, I really don't see the argument.


----------



## Glerox

According to an Asus rep:

It will be limited to 98Hz if you want 4K HDR 10-bit 4:4:4 chroma.

144Hz will be 4K 8-bit SDR 4:4:4 chroma, or
144Hz 4K 10-bit HDR 4:2:2 chroma, which is fine for games.

You won't use HDR on the desktop anyway.
UHD Blu-ray movies in HDR10 are already 10-bit 4:2:0 chroma.
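For context, here is my own back-of-the-envelope bandwidth check (not from the rep), ignoring blanking intervals, so real limits are slightly tighter. DisplayPort 1.4 carries roughly 25.92 Gbit/s of payload (4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead), which is why those are the cutoffs:

```python
# Rough DisplayPort 1.4 payload bandwidth: 4 lanes x 8.1 Gbit/s HBR3,
# with 8b/10b encoding leaving 80% for data. Blanking is ignored here,
# so these numbers are approximate.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # = 25.92

def required_gbps(width, height, refresh_hz, bits_per_channel, chroma="444"):
    """Approximate uncompressed video bandwidth in Gbit/s."""
    # 4:4:4 carries 3 full-resolution channels; 4:2:2 halves the two
    # chroma channels horizontally, i.e. 2 channels' worth per pixel.
    channels = 3.0 if chroma == "444" else 2.0
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

# 4K 10-bit 4:4:4 fits at 98 Hz but not at 144 Hz:
print(required_gbps(3840, 2160, 98, 10))          # ~24.4, under 25.92
print(required_gbps(3840, 2160, 144, 10))         # ~35.8, over 25.92
# Dropping to 4:2:2 brings 144 Hz back under the cap:
print(required_gbps(3840, 2160, 144, 10, "422"))  # ~23.9, under 25.92
```

With blanking included the real 10-bit 4:4:4 ceiling lands a little below 98Hz worth of raw pixels, which is presumably why 98Hz is the advertised figure rather than a round 100.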


----------



## animeowns

If you've been following the Asus PG27UQ thread on the ROG website, Asus will have an announcement on Tuesday, May 8th. Hopefully it's the launch of the PG27UQ.


----------



## Malinkadink

Glerox said:


> According to a Asus rep,
> 
> It will be limited to 98Hz if you want 4K HDR 10bits 4:4:4 chroma.
> 
> 144hz will be 4k 8bit SDR 4:4:4 chroma or
> 144hz 4K 10bits HDR 4:2:2 chroma which is perfect for games.
> 
> You won't use HDR on desktop aneways.
> UHD blu-ray movies in HDR10 are already 10bits 4:2:2 chroma


I really don't even care about HDR. I have an OLED C7, I've seen HDR and DV on it, and it doesn't blow me away compared to SDR content; everything looks great on OLEDs. That said, I'd rather have one of these 4K 144Hz monitors without HDR if it meant it cost $1k instead of $3k. Give me the typical 350-nit, 1000:1-contrast IPS at 4K 144Hz and I'll pay $1k for it. I sure as hell won't be spending $3k for this just because of HDR and G-Sync. And if the HDR doesn't really impress me as it is, on an OLED no less, I can't imagine how I'd feel about it on a crummy LCD lol.


----------



## MistaSparkul

Malinkadink said:


> I really don't even care about HDR, i have an OLED C7 and i've seen HDR and DV on it and it doesn't blow me away compared to SDR content, everything looks great on the OLEDs. That said I'd rather have one of these 4k 144hz monitors without HDR if it meant it cost $1k instead of $3k. Give me the typical 350 nits 1000:1 contrast ratio IPS 4k 144hz and i'll pay $1k for it. Sure as hell wont be spending $3k for this just because of HDR and Gsync, also if the HDR doesn't really impress me as it is on an OLED no less, then i can't imagine how i'd feel about it on a crummy LCD lol.


It could be down to how the content is mastered, or maybe the player you were using; I'm not sure. That said, I do agree with you, as no HDR content I've seen so far, ranging from Amazon Prime Video to PS4 Pro games like God of War and Horizon Zero Dawn, has blown me away at all. In fact I'd say it's more the opposite: all those "bright highlight details" feel too bright, even on an OLED, which supposedly has a brightness problem with HDR content. I found myself squinting at my screen whenever a bright highlight showed up. And yes, I've tried the 'Dark Room' HDR mode.


----------



## Malinkadink

MistaSparkul said:


> It could be down to how the content is mastered or maybe the player you were using, I'm not sure. However I do agree with you though as every HDR content I've seen so far ranging from Amazon Prime Video to PS4 Pro games like God Of War and Horizon Zero Dawn has not blown me away at all. In fact I'd say it's more of the opposite as I just felt like all those "bright highlight details" are too bright even on an OLED which supposely has a brightness problem for HDR content. I found myself squinting at my screen whenever a bright highlight showed up. And yes I've tried the 'Dark Room' HDR mode.


Same experience for me in regards to brightness. OLEDs get plenty bright for HDR highlights, and your eyes perceive them as even brighter due to OLED's infinite contrast. There is absolutely no reason I need to see highlights at 500+ nits; I'm absolutely fine with a constant 100 nits for all my content in a dark room. Couple that with how HDR implementation does indeed vary from title to title, with some getting it absolutely horribly wrong and others doing it right, and I may as well not even bother and just stick to the SDR version where possible, since then I can avoid blinding lights or poorly mastered content.

I recently started Monster Hunter World on the PS4 and initially tried it out in HDR; I knew going in that it had a poorly implemented HDR mode, and it was really obvious. I took a photo on my phone standing still in one area with HDR on and then off, and with it off the colors were better and it didn't look washed out anymore. Horizon Zero Dawn's HDR, on the other hand, is pretty good, but I reckon I'd be fine playing the game in SDR.


----------



## CallsignVega

You guys are crazy. HDR is to picture quality as VRR is to fast paced/demanding gaming. Both truly pushed their respective areas forward tremendously.


----------



## boredgunner

CallsignVega said:


> You guys are crazy. HDR is to picture quality as VRR is to fast paced/demanding gaming. Both truly pushed their respective areas forward tremendously.


HDR depends on the implementation within the content though. I do lack experience with HDR gaming, but a friend of mine has an LG C6 OLED and has tried most HDR PC games and says enabling HDR in-game makes most of them look worse, with the exception being Resident Evil 7 (though he hasn't tried Shadow Warrior 2 nor Obduction).


----------



## l88bastar

boredgunner said:


> HDR depends on the implementation within the content though. I do lack experience with HDR gaming, but a friend of mine has an LG C6 OLED and has tried most HDR PC games and says enabling HDR in-game makes most of them look worse, with the exception being Resident Evil 7 (though he hasn't tried Shadow Warrior 2 nor Obduction).


I have a C6 and C7, the C6 sucks with HDR.

C7 Much MOAR BETTAH


----------



## MistaSparkul

I definitely appreciate the wider color gamut of HDR but because I game in a dark room and normally use all my displays at 100 nits, the added brightness from HDR isn't as easy on my eyes.


----------



## boredgunner

l88bastar said:


> I have a C6 and C7, the C6 sucks with HDR.
> 
> C7 Much MOAR BETTAH


Surprising but definitely good to know. Why do you think the C6 sucks with HDR?


----------



## l88bastar

The C6s came out before HDR started to become all the rage and were not really designed for it.

The C6s are much dimmer than the C7s too. Brightness is a big part of HDR's effectiveness, and the C6s are lacking there.

With that said, the C6 is still a great TV, just not as great as the C7.

The C8s are even better, but I will not upgrade until we get 4k120 OLED which should be the C9s....hopefully


----------



## Jbravo33

Here they come! My V is lonely. May get a 2nd for dual Xp rig. 

https://overclock3d.net/news/gpu_di...on_as_two_weeks_from_now_from_asus_and_acer/1


----------



## Sancus

l88bastar said:


> The C6s came out before HDR started to become all the rage and were not really designed for it.
> 
> The C6's are much dimmer than the C7s too. Brightness is a big part of HDRs effectiveness and the C6s are lacking there.


Agreed.. 10% peak window on the C6 is 650 nits, 733 nits on the C7, and a whopping 907 nits on the C8(ref). I personally can't wait for the C9s either. If they have true 1000+ nit highlight brightness and 4k 120hz they will be absolutely insane displays.


----------



## Glerox

l88bastar said:


> I have a C6 and C7, the C6 sucks with HDR.
> 
> C7 Much MOAR BETTAH


I would not trade my E6 for anything. HDR is really good in a dark room. But I agree it's not really bright in daylight.
However, this is the last model able to do 3D and the passive 3D on it is insane.
A lot of movies are still releasing in 3D so being able to do 4K HDR and 3D is really a plus for me.

Anyway, that's my 2 cents, and it's off topic.

Also waiting for 4k HDR monitors!


----------



## Scotty99

Pricing is still up in the air lol?


----------



## ZealotKi11er

Sancus said:


> Agreed.. 10% peak window on the C6 is 650 nits, 733 nits on the C7, and a whopping 907 nits on the C8(ref). I personally can't wait for the C9s either. If they have true 1000+ nit highlight brightness and 4k 120hz they will be absolutely insane displays.


Nice improvements with the C8. I do find my C7 to be pretty good with HDR at night; it gets to the point where it's actually too bright. I understand the importance of 1000+ nits, but that seems like something you would want for daytime viewing. Another thing I have noticed is that as the screen gets brighter (more light sources), you kind of lose the infinite contrast of the dark spots, simply because of how your eyes respond to light. It's only in darker scenes without a lot of light that you can really see the effect of OLED.


----------



## l88bastar

Scotty99 said:


> Pricing is still up in the air lol?





(image)


----------



## boredgunner

^ lol


----------



## MistaSparkul

They are STILL tight lipped about the price? Just wow lol.


----------



## bee144

The fact that they're starting to do media previews without mentioning price tells me we might come close to the 3K price point. They don't want to spoil the media attention with an awful price. It's probably a great product, but all the reviews will focus on the price when it leaks.

Be worried!


----------



## Sancus

P.S. delayed again until at least June. (And also a funny aside that there's no way the BFGD is coming out anytime soon, which we already knew).


----------



## Vipu

Delayed again?!?!?


----------



## kot0005

Yes, apparently they have been working on tweaking firmware with NVIDIA's help. Probably having issues with FALD.


----------



## Barefooter

Sancus said:


> P.S. *delayed again until at least June*. (And also a funny aside that there's no way the BFGD is coming out anytime soon, which we already knew).


Yeah... this is code for December


----------



## NewType88

I'd rather they just say next summer; then we'd be pleasantly surprised if it came out earlier. Is there seriously no manufacturer out there that can make a vanilla 4K 144Hz G-Sync monitor for 1K? It would be incredible if someone did that out of the blue when these release... sorry, daydreaming.

Does anyone actually prefer the Asus's looks compared to the Acer's? I think the Acer one looks sharper, personally.


----------



## Korruptive

All price estimates from retailers taking pre-orders are around £2500-£3000.
And it's been delayed AGAIN; they're saying June (2019? lol) now.

I don't even want it anymore: only 98Hz with HDR, beyond which color bandwidth goes down due to current DP/HDMI spec limitations.
I'll just wait for reasonably priced mLED monitors with HDMI 2.1.

Bit the bullet and bought an AOC AGON AG271QG.


----------



## pez

Netflix exclusive content has really good HDR implementation. The issue with some HDR content is that much like a bad music producer, production can be bad and they just blow the highlights out of proportion. So you're left with stuff that's just too bright or too distracting. Proper HDR should make colors 'pop'--it becomes especially apparent in darker environments. Daredevil, Dark, and Stranger Things are a few standouts that do HDR to the point it's very enjoyable.


----------



## Sancus

pez said:


> Netflix exclusive content has really good HDR implementation... Daredevil, Dark, and Stranger Things are a few standouts that do HDR to the point it's very enjoyable.


I found the HDR in Altered Carbon to be especially impressive too.


----------



## Clukos

By the time this comes out we'll have 240hz 8K OLED... They announced this waaaay too early, what a mess. At this point I wouldn't be surprised if they just straight up cancel the product.


----------



## animeowns

True 4K 120Hz HDR, 5ms, 43-inch monitor for sale on eBay: https://www.ebay.com/itm/253614049402

https://overclock3d.net/news/gpu_di..._manufacturers_-_wasabi_mango_uhd430_real4k/1


----------



## Jbravo33

animeowns said:


> 4k 120hz true 4k 120hz hdr 5ms monitor 43 inch for sale on ebay https://www.ebay.com/itm/253614049402
> 
> https://overclock3d.net/news/gpu_di..._manufacturers_-_wasabi_mango_uhd430_real4k/1


Saw this on oc3d couple days ago. What are your thoughts?


----------



## Glerox

43 inches at 4K is 102 PPI, even worse than 27 inches at 1440p.

I like higher PPI for monitors.
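For anyone who wants to check other sizes, PPI is just the pixel diagonal divided by the physical diagonal. A quick sketch (the 163 figure matches the 27" 4K panel this thread is about):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 43)))  # 102  (4K at 43")
print(round(ppi(2560, 1440, 27)))  # 109  (1440p at 27")
print(round(ppi(3840, 2160, 27)))  # 163  (4K at 27", the PG27UQ)
```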


----------



## animeowns

*Acer Predator X27 price listed Canada*

Acer Predator X27 4K 144Hz G-Sync monitor listed on a Canadian website for $2799: http://www.canadacomputers.com/produ...item_id=122062


----------



## animeowns

MistaSparkul said:


> They are STILL tight lipped about the price? Just wow lol.


If Acer reveals pricing on May 23rd during their livestream https://www.facebook.com/events/370526756797130/, Asus won't be far behind, and it should be priced similarly.


----------



## Glerox

animeowns said:


> Acer Predator X27 4k 144hz gsync monitor listed on Canadian website for $2799 http://www.canadacomputers.com/produ...item_id=122062


I made a reservation for it in case it comes out before the Asus. It looks better too, but I don't care about the eye tracker.


----------



## animeowns

Glerox said:


> I made a reservation on it in case it comes out before the ASUS. it looks better too. But I don't care about eye tracker.


According to the Asus forums, the PG27UQ is set to release in June, but this can change at any time.


----------



## Glerox

At Canada Computers they didn't have a date for the Acer, but said not before two weeks.


----------



## Sancus

Asus is usually ~20% more expensive than Acer so I wouldn't be too surprised if Acer ends up ~$2000 USD and Asus ends up ~$2500 USD. This would mirror the pricing differential on the 1440p 165hz models. Canada Computers is usually more expensive than the strict exchange rate too.


----------



## Exilon

It's Acer's company (AUO) making the panels anyway. Too bad their frames are so gaudy.


----------



## Scotty99

Someone needs to make a 1440p TN with local dimming 240hz at 24", that is my only upgrade path from this dell i currently own.


----------



## MistaSparkul

Scotty99 said:


> Someone needs to make a 1440p TN with local dimming 240hz at 24", that is my only upgrade path from this dell i currently own.


Then you're probably never going to upgrade. I doubt we will ever see a TN with local dimming because everyone always associates TN with poor image quality so the monitor makers will feel like it is pointless to put local dimming, which is meant to improve picture quality, onto a TN panel.


----------



## Scotty99

MistaSparkul said:


> Then you're probably never going to upgrade. I doubt we will ever see a TN with local dimming because everyone always associates TN with poor image quality so the monitor makers will feel like it is pointless to put local dimming, which is meant to improve picture quality, onto a TN panel.


Nah, not the monitor makers: consumers. There is nothing inherent about TN that prohibits image quality; you can make a TN panel look just as good as IPS, the only "downside" being viewing angles. Viewing angles are of course incredibly important for TVs, but for monitors not so much. TN's response times put it ahead of IPS as a gaming panel; it's really surprising that no one has made a variation of TN and called it something else so it sells.


----------



## kot0005

My PG279Q stats using VESA HDR https://cdn.discordapp.com/attachments/433917900735250452/444804728929386508/unknown.png


----------



## ToTheSun!

Scotty99 said:


> Nah not the monitor makers, consumers. There is nothing inherent about TN that prohibits image quality, you can make a TN panel look just as good as IPS the only "downside" is viewing angles. Viewing angles are of course incredibly important in TV's, monitors not so much. TN response times put it ahead of IPS as a gaming monitor, its really surprising that someone hasnt made a variation of TN and called it something else so it sells.


It's actually the other way around: viewing angles are more important for monitors. You never sit close enough to a TV for them to matter, whereas when you're using a monitor it's almost inescapable that gamma shift will occur visibly. The "only downside", as you put it, is really its biggest drawback.


----------



## Glerox

So on June 7, 2016, Asus announced at Computex that it was working on a 4K monitor with a 144Hz refresh rate.
We're almost 2 years later lol...

At least I can confirm the Acer's X27 price.
It's on Amazon.ca at $2,569 CAD, which works out to $1,999 USD.

https://www.amazon.ca/gp/product/B07CWDBL39/ref=od_aui_detailpages00?ie=UTF8&psc=1

No shipping date


----------



## Kommando Kodiak

2K is an instant buy from me


----------



## Scotty99

ToTheSun! said:


> It's actually the other way around. Viewing angles are more important for monitors. You never sit close enough to a TV for them to matter. When you're using a monitor, it's almost inescapable that gamma shift will occur visibly. The "only "downside"", as you put it, is, really, its heaviest crux.


Oookkk then lol.


----------



## Scotty99

Kommando Kodiak said:


> 2K is an instant buy from me


Without reading a review or anything eh? Just buy. For two grand you could buy a 55" OLED tv and a 55" TCL HDR/local dimming tv for the bedroom lol.


----------



## ToTheSun!

Scotty99 said:


> Without reading a review or anything eh? Just buy. For two grand you could buy a 55" OLED tv and a 55" TCL HDR/local dimming tv for the bedroom lol.


None of which will have 144 Hz or VRR.


----------



## Scotty99

ToTheSun! said:


> None of which will have 144 Hz or VRR.


I dunno about other people, but I'll take 1080p 120Hz OLED over 4K 144Hz LCD any day of the week, and the LG OLEDs can accept a 120Hz input from a PC. I was more just trying to show how ridiculously overpriced these things are; it's laughable.


----------



## ToTheSun!

Scotty99 said:


> I dunno about other people but ill take 1080p 120hz OLED over 4k 144hz LCD any day of the week, and the LG oleds can accept a 120hz input from PC. I was more just trying to show how ridiculously overpriced these things are, its laughable.


It is pretty pricey, but people buying it have the disposable income to do so and/or already have a nice TV in the living room.

I would never buy this monitor for this amount of money, but I'm no one to even suggest what other people should do with their money.


----------



## Scotty99

So everyone in the market for a 144Hz 4K gaming monitor has an OLED in their living room? Ya, dunno about that, or your opinion that viewing angles matter more on a monitor than on a TV (still not sure if that was a troll or serious lol).

This monitor shouldn't be releasing until 2020 anyway; we have nowhere near the GPU power to run it properly.


----------



## l88bastar

Scotty99 said:


> So everyone in the market for a 144hz 4k gaming monitor has a OLED in their living room? Ya dunno about that, or your opinion that viewing angles matter more on a monitor than a TV (still not sure if that was a troll or serious lol).
> 
> This monitor shouldnt be releasing til 2020 anyways, we have no where near the GPU power to run it properly.


I have an OLED in my living room and bedroom, and I am in the market for a 144Hz 4K gaming monitor.

Oh, I also have a Zisworks X39 4K120.

Also, my old Titan X Pascal has plenty of GPU power to push 4K120 in even the most modern games. I would rather dial the eye candy down to medium/high in competitive FPS to play at 4K120, and in single-player games I turn the eye candy up and go 60Hz.


----------



## Scotty99

That's cool that you're willing to sacrifice image quality for FPS; I usually would go the same route. But that does not mean an old Pascal Titan (or a Titan V, or 1080 Tis) is enough for modern games.

Even in a potato game like Overwatch, a 1080 Ti can't manage 100 fps at 4K ultra:


----------



## Vipu

Scotty99 said:


> or your opinion that viewing angles matter more on a monitor than a TV (still not sure if that was a troll or serious lol).


But that is true: you sit close to a monitor, so your angle is different across the screen depending on where you look.
A TV you usually watch from a longer distance, so the angle to different spots on the TV is not as big.

Check your screen on this site: http://www.lagom.nl/lcd-test/viewing_angle.php

Does the picture on your screen look like the left side or the right side of this picture:
https://i.imgur.com/Pq3Xnbx.jpg


----------



## Scotty99

Another one lol?

I think you guys are really missing the gist of the conversation here. A typical viewing situation for a TV has multiple seating positions, and the loss of color and contrast at an angle on VA TVs is really, really bad; nothing even CLOSE to that happens on a TN monitor in a normal viewing position (straight-on viewing).

I actually can't believe I have to explain this; internet forums are really frustrating at times.


----------



## ToTheSun!

Scotty99 said:


> So everyone in the market for a 144hz 4k gaming monitor has a OLED in their living room? Ya dunno about that


That's why I wrote "and/or", which precludes the necessity of the following clause being factual.

I actually can't believe i have to explain this, internet forums are really frustrating at times.


----------



## Scotty99

I just don't have time for TN panel haters. Besides OLED, TN is where the industry should have gone instead of IPS, but because no TV could ever be TN (that is the reason I made the viewing-angle comparison), us gamers are stuck with inferior IPS displays and only a few good TN monitors to choose from.

I think the problem I'm running into on this forum is that no one actually plays games; they build multi-thousand-dollar PCs to browse Facebook with lol.


----------



## animeowns

Kommando Kodiak said:


> 2K is an instant buy from me


So we are waiting for the US page to go live. Ya, I'm buying one, and I'm thinking about grabbing a Titan V while I'm at it for my 7980XE X299 build.


----------



## ocyt

I wonder what the response times and input lag are for this thing.

What's the point of high refresh rates if it has a lot of lag?


----------



## ToTheSun!

Scotty99 said:


> I just don't have time for TN panel haters, besides OLED that is where the industry should have went instead of IPS but because no TV's could ever possibly be TN (that is the reason i made the viewing angle comparison) us gamers are stuck with inferior IPS displays and only a few good TN monitors to choose from.
> 
> I think the problem i am running into on this forum is no one actually plays games, they build multi thousand dollar PC's to browse facebook with lol.


People are allowed to hate a technology if they don't like it. I tried a TN monitor before settling on my current one. And I'll tell you, this "IPS" monitor has more defects than the TN Dell I tried; the thing, though, is that I disliked the TN's shortcomings more than I disliked this one's flaws. It's all very subjective, sometimes.

I gave it a fair shot, but it didn't work out. I can also tell you that many users here, who own OLED TV's and are thinking of buying this Asus, have owned (or still do) high refresh rate TN monitors.


----------



## Kommando Kodiak

animeowns said:


> so we are waiting for the US page to go live ya buying one and I'm thinking about grabbing a titan v while Im at it for my 7980xe X299 build


Sssssssh, you'll trigger Scotty.


----------



## Vipu

Scotty99 said:


> Another one lol?
> 
> I think you guys are really missing the gist of the conversation here. A typical viewing situation for a TV is going to have multiple seating positions, the loss of color and contrast at an angle for VA tv's is really really bad, nothing even CLOSE to that happens on a TN monitor in a normal viewing position (straight on viewing).
> 
> I actually cant believe i have to explain this, internet forums are really frustrating at times.


You don't really get it, do you.
Sure, when you watch a TV from an angle, the whole picture is somewhat wrong, but at least it's pretty consistent.
On a PC you look at it from all kinds of angles because you are so close; you can't watch it "straight on" like you think.

I had to check the best TN viewing-angle tests again to make sure I wasn't wrong, and yeah, even the best TNs still have bad viewing angles.
Guess I have to use Paint to explain it, and keep the picture for explaining it to other people later.

So here is the picture: the green line shows where you look, the red line is the screen, and you can see the VIEWING ANGLE you're looking at the screen at. So yeah, you can't be looking at the screen "straight from the front" unless you sit far enough away that the angle stays pretty much the same.
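Vipu's Paint diagram boils down to simple trigonometry; here is a quick sketch of the calculation (the panel width and viewing distances below are illustrative assumptions, not figures from the thread):

```python
import math

def edge_viewing_angle(screen_width_cm: float, distance_cm: float) -> float:
    """Off-axis angle (in degrees) at which a centered viewer sees the
    screen's left or right edge while looking straight at the middle."""
    return math.degrees(math.atan((screen_width_cm / 2) / distance_cm))

# A 27" 16:9 panel is roughly 59.8 cm wide.
print(f"{edge_viewing_angle(59.8, 60):.1f}")   # at ~60 cm (desk distance)
print(f"{edge_viewing_angle(59.8, 300):.1f}")  # at ~3 m (living-room distance)
```

Sitting closer makes the edge angle much larger even when you are "looking straight on", which is exactly the point the diagram makes.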


----------



## Glerox

It's really time these monitors were in our hands so we'd have something to talk about, lol, instead of repeating the same debates about panel technology.


----------



## Profiled

IPS has better colors than TN!


----------



## CallsignVega

My thought process is that since all LCD is trash, why not get the fastest trash? I use my OLED if I want picture quality. 

If I had my choice for a 144+ Hz gaming LCD for first person shooters, it would be TN due to its speed.

But I don't have a 144+ Hz gaming monitor at the moment seeing as they INSIST on putting that ridiculous crap matte AR film on them. Right now I'm playing PUBG on my C8.


----------



## boredgunner

CallsignVega said:


> My thought process is that since all LCD is trash, why not get the fastest trash? I use my OLED if I want picture quality.
> 
> If I had my choice for a 144+ Hz gaming LCD for first person shooters, it would be TN due to its speed.
> 
> But I don't have a 144+ Hz gaming monitor at the moment seeing as they INSIST on putting that ridiculous crap matte AR film on them. Right now I'm playing PUBG on my C8.


One might prefer the best looking trash instead, like Samsung "QLED". Though once the picture starts moving, it won't be the best looking trash.


----------



## bee144

ocyt said:


> i wonder what the response and input times are for this thing.
> 
> what's the point of high refresh rates if it has a high amount of lag?


Response time is 4ms, which is the best you’ll get for an IPS monitor.


----------



## bee144

Vipu said:


> Scotty99 said:
> 
> 
> 
> Another one lol?
> 
> I think you guys are really missing the gist of the conversation here. A typical viewing situation for a TV is going to have multiple seating positions, the loss of color and contrast at an angle for VA tv's is really really bad, nothing even CLOSE to that happens on a TN monitor in a normal viewing position (straight on viewing).
> 
> I actually cant believe i have to explain this, internet forums are really frustrating at times.
> 
> 
> 
> You dont really get it do you.
> Sure when you watch TV from angle then you watch it from angle so the whole picture is some kind of wrong but at least its pretty consistent.
> On pc you look it from all kind of angles because you are so close, you cant watch it "from straight" like you think.
> 
> I had to make sure and check again the best TN viewing angle tests to make sure I wasnt wrong and yeah even best TNs still have bad viewing angles.
> Guess I have to use paint to explain it and have it for later use for other people to explain it.
> 
> So here is the picture, green line showing where you look at and red line being the screen and u see the VIEWING ANGLE ur looking the screen at, so yeah you cant be looking at screen "straight front" unless you go far away enough so the angle stays pretty much same.

That is a poor picture. Most usability guides recommend having the top of the monitor at eye level. That way you don't have to move your head; you just move your eyes downward.


----------



## Vipu

bee144 said:


> That is a poor picture. Most usability guides require you to have the top of the monitor at eye level. That way you don’t have to move you’re head, you just move your eyes downward.


True, but this wasn't a guide for ergonomics; the viewing angles still apply. The lower angle would be even bigger if the monitor were at the right height.


----------



## Sancus

CallsignVega said:


> My thought process is that since all LCD is trash, why not get the fastest trash? I use my OLED if I want picture quality.


And I take the opposite view, which is that when the difference in average response time between IPS and TN is 5ms vs 3ms, I'd rather avoid the annoying color shift, especially with larger panel sizes like 27" or 32".

It comes down to preference really. Even TN response times are pathetic compared to OLED. In fact I wouldn't be surprised if the motion resolution of the C8 @ 1080p 120hz is substantially better than any LCD currently made outside of strobing modes.


----------



## Scotty99

Response time does not tell the whole story. A 1ms TN vs a 4ms IPS is not explained away by 3ms; they feel completely different. Gotta remember they're different technologies: a turbo 2.0L 4-cylinder is going to feel a lot different than a 5.0 V8 even though they produce similar horsepower numbers.

If you aren't a gamer first and foremost, of course you are going to be fine with IPS or VA, but people who play fast-paced shooters simply cannot make that sacrifice. Not one single Overwatch pro owns an IPS or VA monitor; it's TN across the board.


----------



## Leopardi

Scotty99 said:


> Response time does not tell the whole story. A 1ms tn vs a 4ms ips is not explained away by 3ms, they feel completely different. Gotta remember its different technologies, a turbo 2.0l 4 cylinder is going to feel a lot different than a 5.0 v8 even tho they produce similar horsepower numbers.
> 
> If you aren't a gamer first and foremost of course you are going to be fine with IPS or VA, but people who play fast paced shooters simply cannot make that sacrifice. Not one single overwatch pro owns a ips or va monitor, it's TN across the board.


It doesn't matter competitively whether the monitor is IPS or TN (as long as the input lag is the same), except for the 240Hz available in some TN monitors.


----------



## Vipu

Scotty99 said:


> Response time does not tell the whole story. A 1ms tn vs a 4ms ips is not explained away by 3ms, they feel completely different. Gotta remember its different technologies, a turbo 2.0l 4 cylinder is going to feel a lot different than a 5.0 v8 even tho they produce similar horsepower numbers.
> 
> If you aren't a gamer first and foremost of course you are going to be fine with IPS or VA, but people who play fast paced shooters simply cannot make that sacrifice. Not one single overwatch pro owns a ips or va monitor, it's TN across the board.


Good thing we have these detailed tests done by some people; to me, (non-240Hz) TN seems pretty bad compared to IPS:

http://www.tftcentral.co.uk/images/asus_rog_swift_pg279q/lag.jpg
http://www.tftcentral.co.uk/images/asus_rog_swift_pg279q/pursuit_3a.jpg

http://www.tftcentral.co.uk/images/pixperan/asus_rog_swift_pg278q.jpg
http://www.tftcentral.co.uk/images/pixperan/asus_rog_swift_pg279q.jpg


----------



## Kommando Kodiak

Anybody else here care to relate their experience going from TN to IPS in terms of response times?


----------



## ToTheSun!

Kommando Kodiak said:


> anybody else here care to relate their experiences from tn to ips in terms of response times?


Keep in mind that different people have varying degrees of sensitivity to many aspects of the visual system.

With that said, I've used a CRT at 100 Hz for many years and, recently, have gone from 1440p165 TN to 1440p165 AHVA. What I can tell you is that I notice the difference in motion clarity if I go look for it. Otherwise, it's fast enough and doesn't impact my experience while gaming (or anything else, really).

I'd say I'm pickier than the average consumer, if that means anything to you.


----------



## Agent-A01

Scotty99 said:


> but people who play fast paced shooters simply cannot make that sacrifice. Not one single overwatch pro owns a ips or va monitor, it's TN across the board.


I'm a competitive gamer (Grandmaster in Overwatch), and I can tell you IPS-type panels such as the XB271HU (which I use) are hardly different from their TN equivalents.

So that blanket statement is incorrect. They are similar enough that it does not matter.


----------



## Glerox

Kommando Kodiak said:


> anybody else here care to relate their experiences from tn to ips in terms of response times?


I changed from a 165hz IPS to a 165hz TN and can definitely tell it's more responsive and has less motion blur. So I prefer it for fast shooters but the difference is really minimal.

You can see my analysis of 11 monitors here, where I talk about TN vs IPS:






Nevertheless, 4K@144Hz beats everything in my opinion


----------



## Scotty99

Agent-A01 said:


> I'm a competitive gamer (grandmaster in overwatch) and I can tell you IPS type panels such as XB271HU(which i use) is hardly different to TN equivalent.
> 
> So that blanket statement is incorrect. They are similar enough that it does not matter


Just means you are the exception and aren't perceptive enough to notice/care; IPS is noticeably slower to me.

I guess I lied: of the hundreds of pros tracked on this website, two have IPS monitors:
http://on-winning.com/overwatch-pro-sensitivity-settings-setups-monitor-mouse-keyboard-headset/


----------



## MistaSparkul

I can also see the difference between TN and IPS when both are running sample-and-hold, but the thing is, at the end of the day, neither of them comes close to a strobed monitor anyway. If motion clarity is the absolute number-one priority, then strobing beats any sample-and-hold monitor; even a 120Hz strobed IPS will mop the floor with a 240Hz sample-and-hold TN.
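The strobing point has a simple back-of-the-envelope basis: for eye-tracked motion, perceived blur width scales with how long each frame stays lit (its persistence). A sketch of that rule of thumb, assuming a 1 ms strobe pulse and a 960 px/s test speed (both assumed figures, not specs from this thread):

```python
def blur_width_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Approximate blur trail, in pixels, when the eye tracks motion:
    blur width ~= persistence * motion speed."""
    return persistence_ms / 1000.0 * speed_px_per_s

speed = 960  # px/s, a common motion-test speed
print(blur_width_px(1000.0 / 240, speed))  # 240 Hz sample-and-hold: full-frame persistence
print(blur_width_px(1.0, speed))           # 120 Hz strobed with a 1 ms visible pulse
```

The strobed case holds the image for only the pulse width, which is why it can out-resolve a faster sample-and-hold panel in motion.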


----------



## Exilon

PG278Q = 2.9ms average G2G

PG279Q = 5.2ms average G2G

Both are fast enough for 144Hz. If you consider yourself in the elite of elites, where the 2.3ms difference in pixel transitions matters... well, more power to you, I guess.
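As a rough yardstick (and only that — average G2G figures hide the per-transition spread), those averages can be compared against how long a single frame lasts at 144 Hz:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """How long each refresh lasts, in milliseconds."""
    return 1000.0 / refresh_hz

frame = frame_time_ms(144)  # ~6.94 ms
for panel, g2g_ms in [("PG278Q (TN)", 2.9), ("PG279Q (IPS)", 5.2)]:
    share = g2g_ms / frame
    print(f"{panel}: {g2g_ms} ms G2G, {share:.0%} of one 144 Hz frame")
```

Both averages complete within a single refresh, which is the sense in which both panels are "fast enough" for 144 Hz.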

Back on topic.






Why is the halo leading the mouse cursor?


----------



## Scotty99

I really am not trying to derail the thread here; it just needs to be understood that you cannot differentiate IPS and TN by their ms ratings, they FEEL different. It would be like comparing MPG on a gas car to whatever they rate electric cars in; they are different animals. If you are the type of person who started life on a TN monitor (most people did), even a crappy run-of-the-mill 5ms TN is going to feel faster than the fastest 4ms IPS.


----------



## Scotty99

I'd just like to add that, if we can go by that one video, FALD on that monitor is just as I expected: total and utter trash. When he is moving his mouse around that menu, it looks similar to what my $400 E series with TWELVE zones looks like; the Vizio P series has 128 zones and destroys these monitors as they currently stand.
https://www.youtube.com/watch?time_continue=20&v=BoOM_cI37N0

That's not great, but it looks a lot better than what the mouse did in the Destiny 2 menu from the above video.

Like I've been saying for a year, the programming behind the FALD is just as important as, or more important than, the zone count. These guys are amateurs compared to the guys programming the high-end Sony and Vizio TVs. What they have going for them is that FALD can be tweaked/fixed with firmware updates.


----------



## CallsignVega

5000:1 VA panels struggle to hide FALD zone haloing, so as expected, some crappy 1000:1 IPS panel is going to be atrocious at hiding it. Just like my testing with the 27" FALD Dell, which was IPS. Algorithms aren't going to defeat physics.


----------



## Scotty99

CallsignVega said:


> 5000:1 VA panels struggle to hide FALD zones haloing. So as expected, some crappy 1000:1 IPS panel is going to be atrocious at hiding it. Just like my testing with the 27" FALD Dell which was IPS. Algorithms aren't going to defeat physics.


Nope, the Vizio 55" is an IPS, and its FALD absolutely demolishes what we are seeing in that Destiny 2 menu (with far fewer zones on a much larger screen, btw). It has to get better with firmware, or it's possible that model isn't a release candidate; either way, no one spending $2k+ on this would call that acceptable.


----------



## Derpinheimer

Scotty99 said:


> I really am not trying to derail the thread here, it just needs to be understood that you cannot differentiate IPS and TN by with MS ratings, they FEEL different. It would be like comparing MPG on a gas car to whatever they rate electric cars in, they are different animals. If you are the type of person who started life on a TN monitor (most people did) even a crappy run of the mill 5ms TN is going to feel faster than the fastest 4ms ips.


The ms ratings on the box are, just like with most things, total junk. However, if you find a quality review that measures the response times (like, say, tftcentral) - the ms ratings ARE comparable. TN is not going to feel faster than IPS, inherently. 

There is nothing more to what we see as motion clarity than the refresh rate, response time, and input lag. All of these are measurable. Your link to what competitive gamers use is easily explained if you had paid some attention; the real trend is the lack of high resolution (>1080p) displays. Guess what all high frequency IPS monitors are? 1440p.


----------



## Scotty99

Derpinheimer said:


> The ms ratings on the box are, just like with most things, total junk. However, if you find a quality review that measures the response times (like, say, tftcentral) - the ms ratings ARE comparable. TN is not going to feel faster than IPS, inherently.
> 
> There is nothing more to what we see as motion clarity than the refresh rate, response time, and input lag. All of these are measurable. Your link to what competitive gamers use is easily explained if you had paid some attention; the real trend is the lack of high resolution (>1080p) displays. Guess what all high frequency IPS monitors are? 1440p.


We can agree to disagree on that one; a 4ms IPS is going to feel slower (and just "off") to me vs a 4ms TN, if one existed. What's funny is I've heard people say similar things about OLED: even though they know it's faster, something felt off. Not sure what you are getting at with the 1440p IPS thing; these pro OW players have more money than all of us, lol (streamers make millions per year, yet they are still rocking TN monitors).

But annnyywwaayys, I can't wait to see Rtings get ahold of this monitor to test its FALD vs the top FALD TVs.


----------



## CallsignVega

Scotty99 said:


> Nope the vizio 55" is an IPS and its FALD absolutely demolishes what we are seeing in that destiny 2 menu (with far fewer zones on a much larger screen btw). It has to get better with firmware or it's possible that model isnt a release candidate, either way no one spending 2k+ on this would call that acceptable.


Neither the 2017 P series (video you linked) or E series are IPS.


----------



## Scotty99

CallsignVega said:


> Neither the 2017 P series (video you linked) or E series are IPS.


The 55" Vizio P from 2016 and 2017 uses an IPS panel, and FALD works incredibly well on these. Trust me, the algorithm has everything to do with FALD; you will see this when Rtings gets ahold of this monitor.


----------



## MistaSparkul

According to TFTC, the non-FALD panel is planned for June production, and there are VA options planned for July and August production. So for those who don't want to pay the higher price for a crappy FALD implementation, it looks like we won't be waiting much longer after this releases.


----------



## Sancus

VA panels have their own issues for gaming, though. They're fine on TVs because movies and TV shows pan slowly, and even console gamers move their viewports slowly compared to keyboard+mouse gamers (plus TVs are designed with gaming only as a side note, anyway). The XXXtreme 30-50ms transition times on dark colors are a pretty big issue on PC. I am really skeptical that it will even be worth setting the PG35VQ (and other versions) to 200Hz as a result. Nobody has released a VA panel that looks any good at 200Hz.


That said, I'm still more interested in it than the 27" ones for sure. Assuming they don't expect to get $4000 for it or something insane like that, anyway.


----------



## NewType88

MistaSparkul said:


> According to TFTC, the non FALD panel planned for June production, and there are VA options planned for July and August production. So for those who don't want to pay the higher price for a crappy FALD implementation then looks like we won't be waiting much longer after this releases.


Is that the 32” IPS one that's supposed to come out later in the year, or something else?


----------



## CallsignVega

Scotty99 said:


> The 55" vizio P from 2016 and 2017 uses an ips panel, and fald works incredibly well on these. Trust me the algorithm has everything to do with FALD, you will see this when rtings gets ahold of this monitor.


Oh I was referring to the video of the FALD you posted of the 2017 P series that Rtings reviewed, that is a 65" VA panel. So obviously the 5000:1 VA panel is going to handle the FALD better. Have a FALD video of the 55" IPS version?

Also of note is the 2017 P series only goes to 500 nits, half as bright as this monitor which would make the haloing even worse in HDR mode.

My point is really bright backlight + IPS panel that is horrible at blocking light is going to cause a LOT of returns on this monitor.


----------



## Scotty99

The 55" IPS P series is 95% of a VA with much better off-axis viewing; imho it's the best TV Vizio makes. You really aren't losing much in terms of contrast. I can't find any tests specifically done on the 55", but it's miles ahead of the M or E series, which are VA panels.

Also a forbes article speculating prices on these monitors:
https://www.forbes.com/sites/antony...launching-this-month-and-you-will-want-one/2/

They guess $3k for the Asus and $3,600 for the Acer, lol.


----------



## Glerox

Anyone here waiting for these monitors to play PUBG or Fortnite in 4K@144Hz (at low settings)?!

Can't wait!!!

HDR will be nice for single-player games and the next Battlefield.


----------



## animeowns

Glerox said:


> Anyone here waiting for these monitors to play PUBG or Fortnite in 4K@144Hz (at low settings)?!
> 
> Can't wait!!!
> 
> HDR will be nice for single-player games and the next Battlefield


Do PUBG and Fortnite support SLI? If so, you can get 144 at max settings if you have 3 or 4 1080 Tis or Titan Xps laying around to SLI with a workaround.


----------



## Sancus

animeowns said:


> does pubg and fornite support sli



AFAIK neither does. PUBG did a while ago, but it had a lot of issues and didn't offer much performance increase. Unfortunately, SLI doesn't work very well, if at all, in many modern games; developers are largely abandoning it.


----------



## Vipu

Looks like they are coming this month after all: http://www.guru3d.com/news-story/nv...a-hd-hdr-monitors-will-launch-this-month.html


----------



## Glerox

animeowns said:


> does pubg and fornite support sli ? if so you can get 144 max settings if you have 3 or 4 1080 ti's or titanxp's laying around to sli with a workaround


No, it doesn't support SLI.

But with one Titan XP you can get over 100fps at 4K low settings. I will see the enemies so far away, haha.

I hope the next Tomb Raider and Battlefield support both HDR and SLI.


----------



## kot0005

Vipu said:


> Looks like they are coming this month after all http://www.guru3d.com/news-story/nv...a-hd-hdr-monitors-will-launch-this-month.html


Guru3d and a lot of others are just posting this news for $$. The displays are made by Asus/Acer, not Nvidia, and they already made a post previously saying they would be out in April.


----------



## CallsignVega

Ya, not sure why NVIDIA keeps chiming in. Besides selling the G-Sync chip and helping with its firmware, the ball is totally in AU Optronics and Acer/ASUS's court as far as the delivery timeline goes.


----------



## Kommando Kodiak

CallsignVega said:


> Ya not sure why NVIDIA keeps chiming in. Besides selling the G-Sync chip and helping with its firmware, the ball is totally in AUOptronics and Acer/ASUS's court as far as delivery timeline.


Because Nvidia is controlling the G-Sync HDR standard; this isn't just your regular G-Sync. Since they control that certification, they set the standard the monitor makers need to meet. This is why we have had delays: upon reviewing the monitors before release, they found bugs that were unacceptable and could hurt their branding. This is in contrast to AMD and FreeSync, where there were some bad FreeSync implementations (I didn't follow FreeSync, so some of you guys chime in with the bad ones). Nvidia has to approve the implementation before it's released; it's a certification process.


*Remember how much control nvidia wanted with GPP? They have more than that level of control with the G-sync HDR branding.*


----------



## MistaSparkul

Kommando Kodiak said:


> Because Nvidia is controlling the G-Sync HDR standard, this isnt just your regular g-sync. So because they control that certification they set the standard that the monitor makes need to meet, this is why we have had delays because upon reviewing the monitors before release they found bugs that were unacceptable that would could hurt their branding. This is in contrast to AMD and Freesync, where there were some bad freesync implementations (I didnt follow freesync so some of you guys chime in with the bad ones) . Nvidia has to approve the implementation before its released its a certification process.
> 
> 
> *Remember how much control nvidia wanted with GPP? They have more than that level of control with the G-sync HDR branding.*


So apparently all the FALD blooming is acceptable to Nvidia, then.


----------



## Kommando Kodiak

Either it's unavoidable, or that's what these delays are over: getting the final code for that worked out and bug-testing it.


----------



## Scotty99

To be fair, tho, that typically only happens in menus and other random spots that usually aren't that important; the same happens on cheap FALD TVs. It's just not something I expected from a 27" monitor with nearly 400 FALD zones. This monitor in my eyes is worth $1,199.99 tops, but I'm sure they will sell out at whatever they list them at, because 120Hz 4K.


----------



## Scotty99

Also off-topic, but Samsung's 2018 LED TVs are hitting 2,200 nits in a 10% window. If Samsung could ever get a good FALD onto one of their TVs, I actually think OLED is in trouble.
https://www.rtings.com/tv/reviews/samsung/q7fn


----------



## boredgunner

Scotty99 said:


> Also off topic but samsung's 2018 LED tv's are hitting 2200 nits in a 10% window, if samsung could ever get a good FALD on one of their TV's i actually think OLED is in trouble.
> https://www.rtings.com/tv/reviews/samsung/q7fn


OLED still wouldn't be in trouble, because FALD can never feasibly compare to self-emitting pixels, and LCD motion performance can also never compare. OLED will always be desirable to those who can afford it.


----------



## white owl

And those who can't lol


----------



## boredgunner

white owl said:


> And those who can't lol


Ain't that the truth.


----------



## Scotty99

boredgunner said:


> OLED still wouldn't be in trouble because FALD can never feasibly compare to self-emitting pixels, and LCD motion performance can also never compare. OLED will always be desirable by those who can afford it.


We really don't know yet, is the thing. HDR is still in its baby stages, and we haven't seen what real HDR means, because that stuff is meant for 10,000+ nit displays. Right now I'll take an OLED over that Samsung I linked, but who knows in a few years.


----------



## l88bastar

The bizarre insanity of anti-OLED peeps is mind-boggling.

OLED is so far and away superior to LCD tech that it gives my brain a permanent ice-cream headache.


----------



## boredgunner

Scotty99 said:


> We really dont know yet is the thing, HDR is still in its baby stages and we havent seen what real HDR means because that stuff is meant for 10,000+ nit displays. Right now ill take an OLED over that samsung i linked, but who knows in a few years.


I would say we do know. It is the simple limitation of using hundreds of LEDs to illuminate a screen vs using one light source for every single pixel (8,294,400 on 4K). HDR is not magic that can bypass this. In theory a 4K LCD screen would need 8,294,400 dimming zones to be competitive in static picture quality, but that is not feasible (they don't even have 1,000). And LCD can never be competitive for motion clarity.
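The zone-count gap being described is easy to put numbers on (384 is the backlight zone count reported for this panel, referred to upthread as "nearly 400"):

```python
pixels = 3840 * 2160   # independently lit "zones" on a 4K self-emissive display
fald_zones = 384       # reported backlight zone count for this monitor
print(pixels)                # total 4K pixels
print(pixels // fald_zones)  # pixels sharing each single dimming zone
```

Every dimming zone has to pick one brightness for all the pixels it covers, which is where the haloing discussed above comes from.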


----------



## Scotty99

l88bastar said:


> The bizarre sanity of Anti-OLED peeps is mind boggling.
> 
> OLED is so far and away superior than LCD tech that it gives my brain a permanent Ice Cream headache.


I love OLED; I'm just being realistic here about its limits. It's not even hitting half the peak brightness of mid-range LCD TVs, which is very important for the future of HDR. Again, no consumer has seen real HDR in action; that is being mastered on displays that no one can buy and that are much brighter than anything on the market.


----------



## boredgunner

Scotty99 said:


> I love oled im just being realistic here about its limits, its not even hitting half the peak brightness of mid range LCD tv's which is very important for the future of HDR. Again no consumer has seen real HDR in action, that is being mastered on displays that no one can buy and are much brighter than anything on the market.


I don't think those brightness levels are that important. 1,000 nits is already blinding; why do you need double that?


----------



## l88bastar

Scotty99 said:


> I love oled im just being realistic here about its limits, its not even hitting half the peak brightness of mid range LCD tv's which is very important for the future of HDR. Again no consumer has seen real HDR in action, that is being mastered on displays that no one can buy and are much brighter than anything on the market.


I have a C7; it's gorgeous and plenty bright... geez, I think I run the thing at 50% brightness, too. The C8s raised the bar even further, and the C9s should probably hit 1,000 nits and usher in 4K120.

How bright do you people need these displays? You want them to sear your eyes, lmao.


----------



## Scotty99

Don't confuse full-screen brightness with what a display can achieve in a small window; feel free to look up the HDR standards if you are confused about why super-bright displays are so important.


----------



## Scotty99

boredgunner said:


> I don't think those brightness levels are that important. 1000 lumens is already blinding, why do you need double that?


Dolby Vision is mastered at up to 10,000 nits; again, it's not about entire-screen brightness, just tiny windows for the contrasting highlights.
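The 10,000-nit figure comes from the SMPTE ST 2084 "PQ" transfer curve that both HDR10 and Dolby Vision are built on; a sketch of its EOTF shows how signal values map to absolute luminance:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as defined in the standard.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Map a normalized PQ signal value in [0, 1] to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_to_nits(1.0)))   # the curve's absolute peak: 10,000 nits
print(round(pq_to_nits(0.75)))  # roughly where today's ~1,000-nit highlights sit
```

The curve is steep at the top: the last quarter of the signal range covers everything from about 1,000 nits to 10,000, which is the headroom current displays can't reach yet.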


----------



## boredgunner

Scotty99 said:


> Dolby vision is mastered at 10,000 nits, again its not about the entire screen brightness just tiny windows for the contrasting highlights.


I'm aware of that, so my question still stands. A 1,000 nit small window is still a really bright window. Either way, HDR demands better contrast performance, which LCD can never deliver due to backlighting limitations. HDR doesn't bypass these hardware limits of LCD.


----------



## Scotty99

boredgunner said:


> I'm aware of that, so my question still stands. A 1,000 nit small window is still a really bright window. Either way, HDR demands better contrast performance, which LCD can never deliver due to backlighting limitations. HDR doesn't bypass these hardware limits of LCD.


All I know is Samsung sells more TVs than the other manufacturers combined, and by a large margin. If they thought OLED was the future, they would have invested billions into the tech like LG has, but they are still on the LCD train. I don't know how good HDR can get; the best example I've seen is on a TV that has similar nit output to a 2018 OLED, and I wasn't that impressed. Again, what I'm talking about is years down the line, when we will be comparing 1,500-nit OLEDs vs 5,000-nit LCD TVs. Who knows how big that difference is going to be, and whether it's going to be enough to make up for LCD's inadequacies; no one can answer that, because no one has seen real HDR content yet.


----------



## boredgunner

Scotty99 said:


> All i know is samsung sells more TV's than any other manufacturer combined, and by a large margin. If they thought OLED was the future they would have invested billions into the tech like LG has, but they are still on the LCD train. I dont know how good HDR can get the best example ive seen is on a TV that has similar NIT output as a 2018 OLED and i wasnt that impressed. Again what im talking about is years down the line when we will be comparing 1500 nit OLED's vs 5000 nit LCD tv's who knows how big of a difference that is going to be and if its going to be enough to make up for LCD's inadequacies, no one can answer that because no one has seen real HDR content yet.


You can't compare an LCD to an OLED just because you saw an LCD with similar brightness levels, lol. That doesn't make sense on any level.

You keep indicating that HDR will magically circumvent LCD's hardware limitations, but there is no magic, so such a thing is not possible. So yes, some of this we can answer based on these simple facts.










HDR cannot make up for the haloing caused by FALD, because it's a hardware limitation; nor can it make up for LCD's inability to display blacks, because that's a hardware limitation too. End of that story. Also, the brightness levels you mention would be for content in which the camera is looking directly at the sun... not terribly important, and there is obviously a point of diminishing returns, which is far below 5,000 nits. Nits are for sure going to be the new selling point to the ignorant.

As for indicating that Samsung's industry-leading popularity means they can make no mistakes or can never get eclipsed by someone like LG: let's not pretend any of that is true in the slightest. Not that I think Samsung is making a mistake, because they believe microLED is the future, which I also believe is the case. It should beat OLED in every way.


----------



## Scotty99

You read my comment entirely wrong; I wasn't comparing LCD to OLED, merely noting that the set I watched it on had a brightness level similar to LG's 2018 sets. HDR has an opportunity to be a real game changer in the future, but with current sets that isn't possible; how you are not understanding what is being discussed is kind of baffling, to be honest. OLED is amazing, but it has limitations; unless a major breakthrough happens, it's not going to take off like everyone (including me) wants it to. Samsung is truly in charge here: we aren't going to get affordable OLED TV sets until Samsung starts mass-producing them.


----------



## boredgunner

Scotty99 said:


> You read my comment entirely wrong, i wasnt comparing LCD to OLED merely that the set i watched it on had a similar brightness level to lg's 2018 sets. HDR has an opportunity to be a real game changer in the future but with current sets that isnt possible, how you are not understanding what is being discussed is kind of baffling to be honest. OLED is amazing but it has limitations, unless a major breakthrough happens its not going to take off like everyone (including me) wants it to. Samsung is truly in charge here, we arent going to get affordable OLED tv sets until samsung starts mass producing them.


I've demonstrated full understanding of everything being discussed by addressing every point you made. OLED has limitations, but LCD has far more. But yes, OLED needs one breakthrough before it can become mainstream, and that is lower costs so someone like Samsung manufacturing panels would be the ticket to that happening. Seems they are working towards microLED though which is fine by me.


----------



## Scotty99

Not trying to be a dink, but no, you really haven't. When you make comments like "these TVs are already bright enough" you kind of show your hand on how you don't fully grasp the potential of HDR. This is why Samsung hasn't invested fully into OLED: they looked into their crystal ball and came to the conclusion OLED won't be suitable for upcoming HDR standards. If you watch reaction videos of people's first time watching a well done HDR movie (The Revenant comes to mind), they aren't blown away, they just know it looks different. I think Samsung knew HDR will be more impactful down the road and decided to focus their efforts on LCD regardless of its limitations. I used to say Samsung will have to cave to OLED eventually, but if year after year LCD keeps increasing its brightness lead, I'm not so sure anymore.


----------



## Excession

Scotty99 said:


> When you make comments like "these tv's are already bright enough" you kind of show your hand on how you dont fully grasp the potential of HDR.


They are perfectly capable of reaching brightnesses which are actively uncomfortable to look at. I'm not sure why they need to go much further unless the potential that you're talking about is for televisions that can do double-duty as floodlights.


----------



## MistaSparkul

Scotty99 said:


> Not trying be a dink but no, you really havent. When you make comments like "these tv's are already bright enough" you kind of show your hand on how you dont fully grasp the potential of HDR. This is why samsung hasnt invested fully into OLED, they looked into their crystal ball and came to the conclusion OLED wont be suitable for upcoming HDR standards. If you watch reaction videos of people's first time watching a well done HDR movie (The Revenant comes to mind) they arent blown away, they just know it looks different. I think samsung knew HDR will be more impactful down the road and decided to focus their efforts on LCD regardless of its limitations. I used to say samsung will have to cave to OLED eventually but if year after year after year LCD increases its brightness lead im not so sure anymore.


If I already can't stand the HDR brightness on my OLED TV and phone, then how is having even more brightness supposed to give me a better experience? Can you honestly tell me that watching, say, YouTube HDR on a phone like a Galaxy S9+ doesn't sear your eyeballs when viewing in a dark room?


----------



## Scotty99

Local dimming simply needs to get better. I know what you guys are saying, even my entry level E series can get bright enough to be uncomfortable in a dark room, but an advanced dimming system will be able to control and fine tune the display a lot better than what's currently out there on the market.


----------



## Jbravo33

This the Samsung thread?


----------



## Sancus

Samsung bailed on OLED because they couldn't make their OLEDs cost effective at TV sizes, and couldn't duplicate LG's architecture due to patents. No other reason. They've been playing catch-up in the TV market ever since.


----------



## CallsignVega

Scotty99 said:


> You read my comment entirely wrong, i wasnt comparing LCD to OLED merely that the set i watched it on had a similar brightness level to lg's 2018 sets. HDR has an opportunity to be a real game changer in the future but with current sets that isnt possible, how you are not understanding what is being discussed is kind of baffling to be honest. OLED is amazing but it has limitations, unless a major breakthrough happens its not going to take off like everyone (including me) wants it to. Samsung is truly in charge here, we arent going to get affordable OLED tv sets until samsung starts mass producing them.


Samsung in charge? lol they are scrambling:

https://www.theverge.com/2018/4/26/17283994/lg-oled-tv-sales-q1-2018-earnings

Some industry insiders think OLED will have 90% of the premium TV market within 3 years. In 2017, Samsung went completely edge-lit, which ended up being a disaster. They are in total scramble mode, putting all of their efforts into marketing and also mLED, which is still 5+ years away.

I think anyone who buys a Samsung LCD over an OLED is clueless. Their top set (Q9FN) is in EIGHTH place on Rtings' picture quality chart. That's pretty low.


----------



## xrodney

CallsignVega said:


> Samsung in charge? lol they are scrambling:
> 
> https://www.theverge.com/2018/4/26/17283994/lg-oled-tv-sales-q1-2018-earnings
> 
> Some industry insiders think OLED will have 90% of the premium TV market within 3 years. In 2017, Samsung went completely edge lit which ended up being a disaster. They are in total scramble mode and they put all of their efforts into marketing and also mLED which is still 5+ years away.
> 
> I think anyone who buys a Samsung LCD over an OLED is clueless. Their top set (Q9FN) is EIGHTH place on Rtings picture quality chart. That's pretty low.


It's still 1st place among non-OLED TVs.
To be honest, OLED still has a long way to go to be as reliable as other tech. It may look a bit better, but the burn-in issues and the color shift as it ages are enough to keep me away from OLEDs.


----------



## moonbogg

https://youtu.be/0CYPtBvE_9Q?t=242

Fast forward to 4:02. Look at the backlight bleed issues around the mouse cursor over a dark background. I thought these things were supposed to have SUPER EPIC CONTRAST! This thing sucks and I'd pass it up if the price was $300. Serious.


----------



## boredgunner

Excession said:


> They are perfectly capable of reaching brightnesses which are actively uncomfortable to look at. I'm not sure why they need to go much further unless the potential that you're talking about is for televisions that can do double-duty as floodlights.





MistaSparkul said:


> If I already can't stand the HDR brightness on my OLED TV and phone, then how is having even more brightness suppose to give me a better experience? Can you honestly tell me that watching say, youtube HDR on a phone like a Galaxy S9+ doesn't sear your eyeballs when viewing in a dark room?


Yup, people just love to obsess over arbitrary, pointless numbers. Let's look at some real numbers instead, such as infinity:1 contrast ratios. Again, there is no magic in this equation; 1,000 nits is blinding no matter what, and HDR doesn't change that. Also, as I said, nits are really becoming the selling point for the ignorant (because it is the only number LCD can win on besides price).
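The arithmetic behind those contrast figures is trivial to sketch; the nit values below are illustrative assumptions, not measurements of any particular panel:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Static contrast ratio: peak white luminance over black luminance.
    A self-emissive display with true black divides by zero -> infinite."""
    if black_nits == 0:
        return float("inf")
    return white_nits / black_nits

# Illustrative values (assumed): a ~1000:1 IPS panel vs. an OLED.
ips = contrast_ratio(white_nits=500, black_nits=0.5)   # 1000.0
oled = contrast_ratio(white_nits=700, black_nits=0.0)  # inf
print(ips, oled)
```

No amount of extra peak brightness changes the denominator, which is the whole point being made here.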



Scotty99 said:


> Local dimming simply needs to get better. I know what you guys are saying, even my entry level E series can get bright enough to be uncomfortable in a dark room but an advanced dimming system will be able to control and fine tune the display a lot better than whats currently out there on the market.


And it may not be able to feasibly improve to the point where it competes with OLED or microLED. It'd essentially need one LED per pixel, but that would be outrageous to manufacture (at that point OLED is cheaper and still better). The only other alternative I can see is some technology previously shown by... I don't even remember who or the exact details, but basically every pixel had something to modulate light, supposedly coming close to the performance of self-emitting pixels.



Sancus said:


> Samsung bailed on OLED because they couldn't make their OLEDs cost effective at TV sizes, and couldn't duplicate LG's architecture due to patents. No other reason. They've been playing catch-up in the TV market ever since.


They'd probably need to invest more into OLED production like LG did, but decided not to. 



xrodney said:


> Its still 1st place from non OLED TVs.
> To be honnest OLED have still a lot to go to be as reliable as other tech. It may looks a bit better, but burn in issues and collor shift as it ages is something to keep me away from OLEDs.


Not sure if color shift is still a problem for more recent models. Burn-in is, which is why I'd like to see adjustable strobing on them and see how much that remedies the issue (it'll also improve the heck out of motion clarity obviously).


----------



## kot0005

acer preorder for $1999 https://www.newegg.com/Product/Product.aspx?Item=N82E16824011229


----------



## CaliLife17

Put in a pre-order for the Acer one on Newegg, but I'm hoping either the Acer or Asus one shows up on Amazon soon so I can cancel Newegg and order from Amazon. If I can, I would like to avoid having to deal with Newegg support or their return policy in case any issues come up.

Also, there seems to be no mention of the Tobii eye-tracking stuff on the Newegg page, so hopefully they dropped it and are not including it. I didn't want to spend extra for something I wouldn't use.

Curious to see if there ends up being any difference between the Acer model and the Asus model. I would think Nvidia wants these things to be as close as possible when it comes to the screen.


----------



## Kommando Kodiak

I put in a preorder too; however, I'd like to purchase it elsewhere as well. Don't want to end up in a Finalmouse next-batch-release waiting game.


----------



## Kommando Kodiak

this is a good read https://www.blurbusters.com/4k-144hz-g-sync-hdr-gaming-monitor-finally-arriving-soon-at-a-price/ You can tell the author has a total -- is very excited for this monitor


----------



## Jbravo33

Preordered a couple. Looks like Linus was right on price a year ago with $1999. Where did $3k even come from? Lol


----------



## CallsignVega

Very interesting that ULMB is included. With the FALD backlight that could mean some pretty impressive motion clarity.


----------



## gypsygib

Scotty99 said:


> Samsung is truly in charge here, we arent going to get affordable OLED tv sets until samsung starts mass producing them.


So much wrong in that statement. Sorry, Samsung dropped the ball in the premium TV department. They were in the lead, then lost it. No current LCD comes close to OLED for IQ. FALD helps but it's not enough. MicroLED is promising but far off from mainstream. Also, I wouldn't want to be an early adopter of microLED, given that if a few of the millions of lights break, burn out or are faulty, then you're stuck with completely dead pixels. I would imagine it's hard to QC millions of individually lit pixels when it's not a chemical process.


----------



## MistaSparkul

Jbravo33 said:


> Preordered a couple. Looks like Linus was right on price a year ago with $1999. Where did $3k even come from? Lol


From EU pre-order prices. Bah, if this was 32 inches I would pull the trigger too. Just don't feel like going back down to a measly 27.


----------



## kot0005

apparently Acer and Asus have different panel specs:

https://www.displayspecifications.com/en/model/6291c96

https://www.displayspecifications.com/en/model/2e5b126f

x27
96%P3
no quantum dot
true 10 bit 
4ms
350nits native

Pg27uq
90%P3
quantum dot
8bit +FRC
5ms
400nits native


----------



## boredgunner

CallsignVega said:


> Very interesting that ULMB is included. With the FALD backlight that could me some pretty impressive motion clarity.


I'm going to wait for the 3440 x 1440 200 Hz VA versions (or just cave in on a 55" OLED), since I expect FALD to be useless on IPS, leaving you stuck with the same 1000:1 contrast and IPS glow. Even if FALD turns out to be useless on the VA model, I'd still have at least 2500:1 static contrast.

Did you preorder the Acer?


----------



## CaliLife17

kot0005 said:


> apparently Acer and Asus have different panel specs:
> 
> https://www.displayspecifications.com/en/model/6291c96
> 
> https://www.displayspecifications.com/en/model/2e5b126f
> 
> x27
> 96%P3
> no quantum dot
> true 10 bit
> 4ms
> 350nits native
> 
> Pg27uq
> 90%P3
> quantum dot
> 8bit +FRC
> 5ms
> 400nits native



Acer's own website shows it has QD = https://www.acer.com/ac/en/GB/content/predator-series/predatorx27


Ever since Nvidia announced them at CES 2017, all the way up to the preview event this past week, they have shown both monitors as having the same specs. I think that display site has bad info, and as always Newegg also has some bad info. I fully expect these monitors to be the exact same panel with the same specs.




boredgunner said:


> I'm going to wait for the 3440 x 1440 200 Hz VA versions (or just cave in on a 55" OLED), since I expect FALD to be useless on IPS thus you'd be stuck with the same 1000:1 contrast and IPS glow. While even if FALD is useless on the VA model, I'd still have at least 2500:1 static contrast.
> 
> Did you preorder the Acer?



I was also holding off for the 21:9 HDR screen, but I have a feeling that Nvidia put all their time into this 27" one for now, then will move over to the 21:9 monitor. I caved and pre-ordered this one, knowing I will probably just end up moving to the 21:9 monitor when it comes out later. My only issue with the 65" C8 is max 60 Hz and no VRR/G-Sync. If it had that, I would be all in. I have a 77" C8 coming at the start of next month, so that might be enough to sway me to wait for the 21:9.


----------



## CallsignVega

kot0005 said:


> apparently Acer and Asus have different panel specs:
> 
> https://www.displayspecifications.com/en/model/6291c96
> 
> https://www.displayspecifications.com/en/model/2e5b126f
> 
> x27
> 96%P3
> no quantum dot
> true 10 bit
> 4ms
> 350nits native
> 
> Pg27uq
> 90%P3
> quantum dot
> 8bit +FRC
> 5ms
> 400nits native



Nah, that doesn't mean anything. They are the same panel.


----------



## sblantipodi

kot0005 said:


> apparently Acer and Asus have different panel specs:
> 
> https://www.displayspecifications.com/en/model/6291c96
> 
> https://www.displayspecifications.com/en/model/2e5b126f
> 
> x27
> 96%P3
> no quantum dot
> true 10 bit
> 4ms
> 350nits native
> 
> Pg27uq
> 90%P3
> quantum dot
> 8bit +FRC
> 5ms
> 400nits native


That site is simply wrong; I really doubt they will have different panels.


----------



## Kommando Kodiak

No the site is correct

" Quantum Dot screen coating then provides a boost to the colour gamut of the screen, allowing it to extend beyond the typical sRGB and achieve >90% DCI-P3 (125% sRGB coverage). An Asus rep has confirmed that the 1.07b colour depth is (not surprisingly) achieved through an 8-bit+FRC panel. Not that this will really make much real difference in practice for the target uses."

Source: http://www.tftcentral.co.uk/news_archive/39.htm#asus_pa27uq_update

and his sourcing for that post #39: https://rog.asus.com/forum/showthread.php?99007-PG27UQ-release-May/page4
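For anyone wondering what "8-bit+FRC" actually means mechanically, it's temporal dithering, which is easy to illustrate; this is a minimal sketch (the 4-frame pattern and the helper name are my own illustration, not anything Asus has published):

```python
def frc_frames(value_10bit: int, n_frames: int = 4) -> list[int]:
    """Emulate a 10-bit level on an 8-bit panel via temporal dithering (FRC):
    alternate between the two nearest 8-bit levels so the time-average over
    n_frames matches the 10-bit target."""
    lo, frac = divmod(value_10bit, 4)   # 10-bit is 4x finer than 8-bit
    hi = min(lo + 1, 255)
    # show the higher level on `frac` out of every n_frames frames
    return [hi] * frac + [lo] * (n_frames - frac)

frames = frc_frames(513)                 # -> [129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)     # time-average recovers 513.0
```

That time-averaged flicker is how an 8-bit+FRC panel can advertise 1.07 billion (1024^3) colours, and why the difference from true 10-bit rarely matters in practice, as the quote says.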


----------



## bee144

Kommando Kodiak said:


> No the site is correct
> 
> " Quantum Dot screen coating then provides a boost to the colour gamut of the screen, allowing it to extend beyond the typical sRGB and achieve >90% DCI-P3 (125% sRGB coverage). An Asus rep has confirmed that the 1.07b colour depth is (not surprisingly) achieved through an 8-bit+FRC panel. Not that this will really make much real difference in practice for the target uses."
> 
> Source: http://www.tftcentral.co.uk/news_archive/39.htm#asus_pa27uq_update
> 
> and his sourcing for that post #39: https://rog.asus.com/forum/showthread.php?99007-PG27UQ-release-May/page4


No, the site is incorrect about the X27. It's the exact same panel in both models. The X27 has an 8-bit+FRC panel as well.


----------



## Kommando Kodiak

Acer is advertising it as 10-bit and as having higher Adobe RGB coverage. So where's that difference coming from, other than 10-bit? *edit* is it possible they have a better QDOT film????


----------



## bee144

Kommando Kodiak said:


> Acer is advertising it as 10 bit and as having higher Adobe RGB coverage. So wheres that differential coming from other than 10 bit? *edit* is it possible they have a better QDOT film????


ASUS is also advertising the PG27UQ as 10-bit*.

* 8-bit+FRC.

Acer has only officially revealed its monitor as having 99% Adobe RGB. They have not provided an official DCI-P3 %.

Asus has only officially revealed its monitor as having 93% DCI-P3. They have not provided an official Adobe RGB %.

We need more info. All we know is that stupid site everyone keeps linking to knows nothing. It's not the monitor bible, considering ASUS and Acer haven't officially released anything, nor have they even released final specs.
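For context on how those "% of gamut" figures come about: they're comparisons of the gamut triangles drawn from the primaries' chromaticity coordinates. A rough sketch using the standard published primaries (note this computes area ratios, which marketing often uses; true coverage requires intersecting the triangles, and the exact number also depends on which chromaticity diagram you use):

```python
# CIE 1931 xy chromaticities of the primaries (standard published values).
SRGB   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def tri_area(prims):
    """Area of the gamut triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = prims
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

ratio = tri_area(DCI_P3) / tri_area(SRGB)
print(f"DCI-P3 area vs sRGB in xy space: {ratio:.0%}")
```

A panel quoted at ~90% DCI-P3 covers about 90% of that larger triangle, which is roughly how the same panel can also be marketed as "125% sRGB"; the two numbers describe one gamut against two different references.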


----------



## sblantipodi

bee144 said:


> ASUS is also advertising PG27UQ as 10-bit*.
> 
> * 8-bit+FRC.
> 
> Acer has only officially revealed its monitor as having 99% Adobe RGB. They have not provided an official DCI-P3 %.
> 
> Asus has only officially revealed it's monitor as having 93% DCI-P3. They have not provided an official Adobe RGB %.
> 
> We need more info. All we know is that stupid site everyone keeps linking to knows nothing. It's not the monitor bible considering ASUS and Acer haven't officially released anything nor have they even released final specs.


It's pretty strange; those specs seem so high end, so why is there no hardware calibration?

And how is such a wide gamut handled in non-color-managed software like games and many other programs?


----------



## bee144

sblantipodi said:


> bee144 said:
> 
> 
> 
> ASUS is also advertising PG27UQ as 10-bit*.
> 
> * 8-bit+FRC.
> 
> Acer has only officially revealed its monitor as having 99% Adobe RGB. They have not provided an official DCI-P3 %.
> 
> Asus has only officially revealed it's monitor as having 93% DCI-P3. They have not provided an official Adobe RGB %.
> 
> We need more info. All we know is that stupid site everyone keeps linking to knows nothing. It's not the monitor bible considering ASUS and Acer haven't officially released anything nor have they even released final specs.
> 
> 
> 
> it pretty strange, those specs seems so high end, why there is no hardware calibration?
> 
> how such a wide gamut is handled on non color managed software like games and many others?

ASUS has said in their forums that the PG27UQ is factory calibrated and comes with a calibration certificate.


----------



## sblantipodi

bee144 said:


> ASUS has said in their forums that the PG27UQ is factory calibrated and comes with a calibration certificate.


And how is this related to what I said?
I am talking about wide gamut and non-color-managed software; this can't be that good without proper hardware calibration.

Leaving aside the fact that a factory calibration only lasts some months, since a natural cyclic deviation is normal after some months of use.


----------



## bee144

sblantipodi said:


> and how this is related to what I have said?
> I am talking about wide gamut and non color managed software, this can't be that good without proper hardware calibration.
> 
> leave alone the fact that a factory calibration lasts some months because a cyclic natural deviation is normal after some months of use.


You said there was no hardware calibration and I was sharing with you that ASUS is hardware calibrating the monitor directly from the factory.
Whether or not a factory calibration is still valid after a few months was not stated in your original comment.


----------



## KGPrime

bee144 said:


> You said there was no hardware calibration and I was sharing with you that ASUS is hardware calibrating the monitor directly from the factory.
> Whether or not a factory calibration is still valid after a few months was not stated in your original comment.


Because you misunderstand what he means: whether it can be calibrated directly to the monitor's internal LUT by the end user, like high-end professional monitors from Eizo, NEC, etc. Otherwise the end user can only use an .icc profile, which alters the LUT on your video card after the fact; that is _software_ calibration, and they are not the same thing. So if you cannot hardware calibrate it, then you deal with .icc profiles, which is undesirable and can be problematic to get working in games etc.

One would also hope they have a standard sRGB setting, which I'm sure they must. It would be unbelievable if they didn't.

----------



## MistaSparkul

KGPrime said:


> Because you misunderstand what he means. Meaning if it can be calibrated directly to monitors internal LUT by the end user like high end professional monitors, Eizo, NEC ect. Else then the end user can only use an .icc profile which alters the LUT on your video card after the fact, which is _software_ calibration, they are not the same thing. If you cannot hardware calibrate it, then you deal with .icc profiles which is undesirable, and can be problematic trying to get it to work in games ect.
> 
> One would also hope they also have a standard sRGB setting, which i'm sure they must. It would be unbelievable if they didn't.


Given that the monitor is advertised as a gaming monitor, I am doubtful it has a hardware LUT feature. It would be very useful though, since you cannot use ICC profiles on a console, so anybody looking to hook up an Xbox One X could get calibrated colors.


----------



## bee144

KGPrime said:


> bee144 said:
> 
> 
> 
> You said there was no hardware calibration and I was sharing with you that ASUS is hardware calibrating the monitor directly from the factory.
> Whether or not a factory calibration is still valid after a few months was not stated in your original comment.
> 
> 
> 
> Because you misunderstand what he means. Meaning if it can be calibrated directly to monitors internal LUT by the end user like high end professional monitors, Eizo, NEC ect. Else then the end user can only use an .icc profile which alters the LUT on your video card after the fact, which is _software_ calibration, they are not the same thing. So if you cannot hardware calibrate it, then you deal with .icc profiles which is undesirable, and can be problematic trying to get it to work in games ect.
> 
> One would also hope they also have a standard sRGB setting, which i'm sure they must. It would be unbelievable if they didn't.

These are gaming monitors. I wouldn't anticipate it. Marshal from ASUS said on the ROG forums, every time someone compared the PG27UQ to a graphic design monitor, that the PG was clearly a gaming monitor. ASUS at least isn't trying to make their monitor both a gaming monitor AND a graphic artist display.


----------



## KGPrime

bee144 said:


> These are gaming monitors. I wouldn’t anticipate it. Marshal from ASUS said on the ROG forums every time someone compared the PG27UQ to a graphic design monitor that the PG was clearly a gaming monitor. ASUS as least isn’t trying to make their monitor a gaming monitor AND a graphic artist display.



I don't count on it. I knew all along they likely wouldn't. My response was only to your response, where you misunderstood what *sblantipodi* was saying. But I agree with him: since they went through all this trouble to create the ultimate gaming monitor, they should have actually added hardware calibration, because it's been a fairly huge issue forever, one that has caused people to write programs that try to force .icc profiles and in turn cause all kinds of havoc. It's good that it will be calibrated from the factory, though; that, and LCDs don't really need to be calibrated as often, since they don't drift as much as CRTs once did.

But we can surely expect to see posts and threads about this in the future, as there have been in the past. They should just add it in and charge the extra 500 dollars or whatever, and actually make the "ultimate" monitor. But they have to segregate the market so they can sell their high-end color monitors to creative professionals, else who would buy those monitors instead of one of these? I wouldn't; even digital painting at 144Hz is noticeably better than at 60Hz.


----------



## CallsignVega

Plasma Copper + Armor Titanium, LOL

https://www.newegg.com/Product/Product.aspx?Item=N82E16824236885


----------



## Scotty99

CallsignVega said:


> Plasma Copper + Armor Titanium, LOL
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16824236885


Don't forget HDR 20,000:1 and eye care 300. Much terms, very spec.


----------



## bee144

CallsignVega said:


> Plasma Copper + Armor Titanium, LOL
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16824236885


Was this in stock when you posted it 3.5 hours ago?


----------



## animeowns

bee144 said:


> Was this in stock when you posted it 3.5 hours ago?


The page was just created tonight. If this is anything like the Predator posting, it should go live between midnight and 2am; that was around the time I pre-ordered my Predator X27. It's interesting that it says the Acer panel is full 10-bit while the Asus one is only 8-bit+FRC.


----------



## l88bastar

CallsignVega said:


> Plasma Copper + Armor Titanium, LOL
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16824236885


----------



## degenn

Good these are finally coming out but that price is bonkers imo & it's gonna take a 32" variant to entice me. C'mon Acer/Asus.... make it happen!


----------



## moonbogg

All you plebs making fun of this monitor are just jelly 'cause you can't afford one. That's all this is about. Me on the other hand, I can EASILY afford one and I'd buy one too, but I needed to use my money for a new coffee table. Like, all my money for a new coffee table. Just get rich and buy the monitor already. Jesus, people.


----------



## l88bastar

moonbogg said:


> All you plebs making fun of this monitor are just jelly 'cause you can't afford one. That's all this is about. Me on the other hand, I can EASILY afford one and I'd buy one too, but I needed to use my money for a new coffee table. Like, all my money for a new coffee table. Just get rich and buy the monitor already. Jesus, people.


----------



## Clukos

moonbogg said:


> All you plebs making fun of this monitor are just jelly 'cause you can't afford one. That's all this is about. Me on the other hand, I can EASILY afford one and I'd buy one too, but I needed to use my money for a new coffee table. Like, all my money for a new coffee table. Just get rich and buy the monitor already. Jesus, people.


You think you are special, huh!? I'll buy enough of those monitors to have a desk made out of them


----------



## Barefooter

moonbogg said:


> All you plebs making fun of this monitor are just jelly 'cause you can't afford one. That's all this is about. Me on the other hand, I can EASILY afford one and I'd buy one too, but I needed to use my money for a new coffee table. Like, all my money for a new coffee table. Just get rich and buy the monitor already. Jesus, people.


Can I borrow your table... and you don't mind if I return a shorter version of it, do you?


----------



## Glerox

So now that it seems that both will be the same price, Acer or Asus?


----------



## boredgunner

Glerox said:


> So now that it seems that both will be the same price, Acer or Asus?


Whichever one has fewer defects. Other than that, they should be the same. It's the XB271HU vs PG279Q all over again; with that competition, the XB271HU definitely had the better launch.


----------



## kot0005

well.. was posted on rog forums https://www.computerbase.de/2018-05/auo-mini-led-uhd-gaming-monitor/?amp=1

TADA : https://www.ple.com.au/Products/632...27-4K-G-SYNC-144Hz-4MS-IPS-LED-Gaming-Monitor

$4499 AU, roughly $3404US LOL

only $1404 more expensive.


----------



## VelocityMicroVA

The PG27UQ is live on our site now for order. I'm happy to answer any questions.

http://www.velocitymicro.com/asus-rog-swift-pg27uq.php


----------



## animeowns

Asus PG27UQ pre order is live now via newegg.com 

https://www.newegg.com/Product/Prod...6885&cm_re=asus_pg27uq-_-24-236-885-_-Product


----------



## Exilon

kot0005 said:


> well.. was posted on rog forums https://www.computerbase.de/2018-05/auo-mini-led-uhd-gaming-monitor/?amp=1


lol. This monitor is going to be obsolete within a year.


----------



## animeowns

Exilon said:


> lol. This monitor is going to be obsolete within a year.


That's still 4K. Unless we are going to see 5K and 8K versions with 100+ Hz refresh rates, these monitors will still be relevant a year from now.


----------



## Scotty99

animeowns said:


> that's still 4k unless we are going to see 5k and 8k versions with 100+hz refresh rate these monitors will still be relevant within a year from now.


Did you read the article lol? Purported 1,000 dimming zones and mini LED.

Local dimming is going to take off big in the next few years because of HDR; I would pass on this first-gen monitor if I were anyone browsing this thread.


----------



## Glerox

Scotty99 said:


> Did you read the article lol? Purported 1000 dimming zones and micro led.
> 
> Local dimming is going to take off big in the next few years because of HDR, i would pass on this first gen monitor if i was anyone browsing this thread.


Have you seen how long it takes to develop monitors? It will take at least 2 years before we have a microLED consumer model available.

And when it releases there will be something newer and better coming, so you'll never buy anything lol.


----------



## moonbogg

Glerox said:


> Have you seen how long it takes to develop monitors? It will take at least 2 years before we have a microled consumer model available.
> 
> And when it releases there will be something newer and better coming so you'll never buy something lol.



Generally I agree with buying what you want or need for today rather than waiting forever. There are exceptions though, and this monitor is one of them. I think these panels are a first attempt at this technology and big enhancements are coming soon after. If you can afford these things without worrying about spending the money, then that's different; go ahead and enjoy. Most of us don't want to spend $2-3000 on a panel that basically amounts to a rough draft of this new backlight technology. I know TVs have had it, but this is borderline defective for PC use due to the halo effects. It will irritate the absolute hell out of people. I can't wait to watch the absolute parade of RMAs in the future owners thread on this forum. Mark my words: there will be a parade of returns due to the price of this thing. Any issue, either real or imagined, will result in instant multiple RMAs, one after another after another. I can't wait.


----------



## Ferreal

Lol, I'm not waiting anymore. Preordered.


----------



## animeowns

Ferreal said:


> Lol, I'm not waiting anymore. Preordered.


If you play the waiting game there will always be something new on the way; by the time those displays even reach the market we might be talking about something completely different lol. Yeah, I preordered the X27. It's 4K 144Hz, I will live with that.


----------



## Ferreal

animeowns said:


> if you play the waiting game there will always be something new on the way by the time those displays even reach market we might be talking about something completely different lol ya preordered X27 its 4k 144hz I will live with that.


I purchased the Asus PG278Q when it first came out, I'm still using that monitor for FPS games. I've tried monitors that came out after it, nothing can beat it, IMO. This is the replacement I've been waiting for. 4k HDR 144hz.


----------



## animeowns

Ferreal said:


> I purchased the Asus PG278Q when it first came out. I'm still using that monitor for FPS games. I've tried monitors that came out after it, nothing can beat it, IMO. This is the replacement I've been waiting for. 4k HDR 144hz.


I hear ya. The only way I'm upgrading from this 4K 144Hz is if we get 5K 144Hz or 8K 144Hz at a similar price.


----------



## moonbogg

Ferreal said:


> Lol, I'm not waiting anymore. Preordered.


How long before you get it? I want to know if it's good or if it sucks, because if I lose my mind over these it will be the ultrawide version that I buy. When do you get this thing?


----------



## Ferreal

moonbogg said:


> How long before you get it? I want to know if its good or if it sucks, because If I lose my mind over these it will be the ultra wide version that I buy. When do you get this thing?


The release date is 6/22/18. Hopefully one month from now I'll have it in my possession.


----------



## CallsignVega

If the dates hold true, looks like the Acer will beat the ASUS to market by about three weeks.


----------



## Lass3

animeowns said:


> that's still 4k unless we are going to see 5k and 8k versions with 100+hz refresh rate these monitors will still be relevant within a year from now.



Relevant how? It's LCD.

I'd never pay $2000 for a 27 inch LCD monitor. It's a complete ripoff.

I paid almost the same for my 65 inch 4K/UHD OLED with HDR and 120 Hz native... The PC monitor market is really a joke. We get overpriced garbage panels / obsolete tech. Micro LED, where are you... Meanwhile I'll be at 1440p/165Hz AHVA. HDR is a joke on LCD anyway, even with FALD. HDR looks stunning on OLED/MLED (yeah, I've seen The Wall).


----------



## Ferreal

CallsignVega said:


> If the dates hold true, looks like the Acer will beat the ASUS to market by about three weeks.


There are websites reporting that the Acer is releasing on June 1st. I think the stand alone is worth the wait imo.


----------



## Ferreal

Lass3 said:


> Relevant how? It's LCD.
> 
> I'd never pay $2000 for a 27 inch LCD monitor. It's a complete ripoff.
> 
> I paid almost the same for my 65 inch 4K/UHD OLED w. HDR and 120 Hz native ... PC monitor market is really a joke. We get overpriced garbage panels / obsolete tech. Micro LED where are you... Meanwhile I'll be at 1440p/165Hz AHVA. HDR is a joke on LCD anyway, even with FALD. HDR looks stunning on OLED/MLED (Yeah, I've seen The Wall).



That's 120Hz @ 1080p. There is no 4K 144Hz display yet; that's why we are paying the premium. We'll see how it looks, I'll be comparing it to my 65" Sony OLED.


----------



## animeowns

If the dates are correct I should have my 4K 144Hz by 6/2/2018; I picked next-business-day delivery.


----------



## NewType88

Ferreal said:


> There are websites reporting that the Acer is releasing on June 1st. I think the stand alone is worth the wait imo.


I saw on the Newegg page that it's June 1st or the 2nd, before the preorders sold out. Now it just says out of stock.

I guess Amazon will get them to show up when they officially release?


----------



## Shardnax

moonbogg said:


> Generally I agree with buying what you want or need today rather than waiting forever. There are exceptions, though, and this monitor is one of them. I think these panels are a first attempt at this technology and big enhancements are coming soon after. If you can afford these things without worrying about spending the money, that's different; go ahead and enjoy. Most of us don't want to spend $2-3,000 on a panel that basically amounts to a rough draft of this new backlight technology. I know TVs have had it, but this is borderline defective for PC use due to the halo effects. It will irritate the absolute hell out of people. I can't wait to watch the absolute *parade of RMAs in the future owners thread on this forum.* Mark my words: there will be a parade of returns due to the price of this thing. Any issue, real or imagined, will result in instant multiple RMAs. One after another after another. I can't wait.


I'll be surprised if that doesn't happen.

At people that have pre-ordered this, per Blur Busters:
"I have reconfirmed, and for 4:4:4 10-bit HDR your refresh rate does fall to 98 Hz.

No word of yet if there’s a 4:4:4 8-bit non-HDR mode (e.g. for ULMB mode) to keep 4K 120Hz or 144Hz — but it would be interesting to find out if there are other bit modes as a tradeoff."
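The arithmetic behind that 98 Hz figure is easy to sketch. A minimal back-of-the-envelope check, assuming DisplayPort 1.4's ~25.92 Gbit/s usable HBR3 payload (32.4 Gbit/s raw minus 8b/10b encoding overhead) and counting active pixels only, so the real limits with blanking are slightly tighter:

```python
# Rough check of why 4K 144 Hz 10-bit RGB exceeds DisplayPort 1.4.
# Assumptions: ~25.92 Gbit/s usable HBR3 payload; blanking overhead ignored.

DP14_PAYLOAD_GBPS = 25.92

def bandwidth_gbps(width, height, hz, bits_per_channel, channels=3):
    """Approximate uncompressed video bandwidth in Gbit/s (active pixels only)."""
    return width * height * hz * bits_per_channel * channels / 1e9

for hz, bpc in [(144, 10), (144, 8), (120, 8), (98, 10)]:
    need = bandwidth_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"4K {hz} Hz {bpc}-bit RGB: {need:.1f} Gbit/s -> {verdict}")
```

Even ignoring blanking, 4K 144 Hz needs ~35.8 Gbit/s at 10-bit (and ~28.7 at 8-bit), while ~98 Hz 10-bit and ~120 Hz 8-bit squeak under the payload, which lines up with the Blur Busters report.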


----------



## Glerox

I preordered it on amazon.ca for $2,570 CAD.
Now it's $2,799 CAD.

Once it ships, do you guys know if they'll charge me the original price or the new price?


----------



## Lass3

Ferreal said:


> That's 120hz @ 1080p. There is no display 4k 144hz, that's why we are paying the premium. We'll see how it looks, I'll be comparing it to my 65" Sony OLED.


It does support 2160p/120Hz too, HFR, 2018 model. Just not from external sources, yet. But this means that the panel can handle it.

I simply don't see the point of 2160p on a 27 inch. Last time I tried that combo, I needed a lot of scaling even with 20/20 vision. It should have been 32 inch; that size works much better for this res.


----------



## Scotty99

Can anyone else not see new posts to this thread ?


----------



## kx11

VelocityMicroVA said:


> The PG27UQ is live on our site now for order. I'm happy to answer any questions.
> 
> http://www.velocitymicro.com/asus-rog-swift-pg27uq.php



I ordered one. When will I receive it?!


----------



## kx11

Overclock.net is broken again?!


----------



## Agent-A01

Scotty99 said:


> We can agree to disagree on that one, a 4ms IPS is going to feel slower (and just "off") to me vs a 4ms TN if one existed. What's funny is I've heard people say similar things about OLED, even tho they know it's faster, something felt off. Not sure what you are getting at with the 1440p IPS thing, these pro OW players have more money than all of us lol. (Streamers make millions per year, yet they are still rocking TN monitors.)
> 
> But annnyywwaayys, I can't wait to see RTINGS get ahold of this monitor to test its FALD vs the top FALD TVs.


I don't think you know the difference between IPS and IPS-type.

TN and IPS-type panels like my monitor have very similar color transition times.

TN is marginally better in most cases, being a few ms faster on color shifts at opposite ends of the spectrum.

True IPS is totally different and is nowhere near comparable to IPS-type or TN.


----------



## CallsignVega

This thread not updating or something?


----------



## Glerox

This thread is broken. Can't access the latest updates.


----------



## kx11

VelocityMicroVA said:


> The PG27UQ is live on our site now for order. I'm happy to answer any questions.
> 
> http://www.velocitymicro.com/asus-rog-swift-pg27uq.php



Ordered one from you guys.

ETA?


----------



## kx11

If you want to see updates: from your CP, look for this thread; under it there's a *reply* link. Click it and scroll down!!


----------



## kot0005

kx11 said:


> If you want to see updates: from your CP, look for this thread; under it there's a *reply* link. Click it and scroll down!!



NANI ?!?!


----------



## kx11

kot0005 said:


> NANI ?!?!



Alright, they fixed it.


----------



## sblantipodi

It arrived in Italy at 3,000 euros.
LOL! Asus, you are completely stupid.


----------



## andre02

Glerox said:


> I preordered it on amazon.ca for $2,570 CAD.
> Now it's $2,799 CAD.
> 
> Once it ships, do you guys know if they'll charge me the original price or the new price?


It should be the price you ordered it at, hopefully.


----------



## Monstieur

Ferreal said:


> I purchased the Asus PG278Q when it first came out, I'm still using that monitor for FPS games. I've tried monitors that came out after it, nothing can beat it, IMO. This is the replacement I've been waiting for. 4k HDR 144hz.


Same. No monitor has been able to outperform my PG278Q - it's even more colour accurate than IPS panels. I bought the PG258Q - 240 Hz is visibly smoother, but the pixel density is crap on 1080p 24" and the colours are also crap. I bought the PG27VQ because it has a HDMI port - it's the same performance wise but the colours are crap again. The PG279Q looks much better colour wise but the input lag and blurring on IPS is visibly inferior to the TN panel on the PG278Q.

These new 4K 144 Hz IPS panels will probably have the same input lag and blurring as the PG279Q. It may not be able to completely replace the PG278Q. The IPS models don't have ULMB and 3D Vision 2 either.


----------



## Clabby94

Monstieur said:


> Same. No monitor has been able to outperform my PG278Q - it's even more colour accurate than IPS panels. I bought the PG258Q - 240 Hz is visibly smoother, but the pixel density is crap on 1080p 24" and the colours are also crap. I bought the PG27VQ because it has a HDMI port - it's the same performance wise but the colours are crap again. The PG279Q looks much better colour wise but the input lag and blurring on IPS is visibly inferior to the TN panel on the PG278Q.
> 
> These new 4K 144 Hz IPS panels will probably have the same input lag and blurring as the PG279Q. It may not be able to completely replace the PG278Q. The IPS models don't have ULMB and 3D Vision 2 either.


Does the PG278Q still have the pixel inversion issue? I remember going through a few when they first came out. Ended up returning them all because of the odd artifacts when there was movement on the screen. Curious if a new revision cured it.


----------



## Malinkadink

Monstieur said:


> Same. No monitor has been able to outperform my PG278Q - it's even more colour accurate than IPS panels. I bought the PG258Q - 240 Hz is visibly smoother, but the pixel density is crap on 1080p 24" and the colours are also crap. I bought the PG27VQ because it has a HDMI port - it's the same performance wise but the colours are crap again. The PG279Q looks much better colour wise but the input lag and blurring on IPS is visibly inferior to the TN panel on the PG278Q.
> 
> These new 4K 144 Hz IPS panels will probably have the same input lag and blurring as the PG279Q. It may not be able to completely replace the PG278Q. The IPS models don't have ULMB and 3D Vision 2 either.


The input lag of the IPS monitors is negligible and for all intents and purposes is, or can be, equal to TN. It's the pixel response that's weaker, which is responsible for more blur. As for ULMB, that's a useless feature to me personally, as strobing means no variable refresh plus flickering, which doesn't help your eyes any. Also, do you really care if your monitor has 3D Vision? 3D is dead.


I'd really like a 32 inch 4K 144Hz monitor with G-Sync/FreeSync, no HDR, no local dimming, just a barebones display that's some IPS variant for $1,000 or less. When we're swimming in HDR content in 10 years, then I'll want that spec in my display, but for now there is too much inconsistency with HDR being implemented right, or wrong.


----------



## Monstieur

Clabby94 said:


> Does the PG278Q still have the pixel inversion issue? I remember going through a few when they first came out. Ended up returning them all because of the odd artifacts when there was movement on the screen. Curious if a new revision cured it


I've owned the PG278Q, PG278QR, PG258, and PG27VQ. They all look exactly the same except for the factory colour calibration and gamma curve. When the PG258Q was brand new, it had some kind of alternate pixel flickering on certain images but it stopped in a few days. I bought all of these at launch so they're all likely the first revision. Every single one of them flickers on a grey checkerboard test pattern, but this is not present in any actual content.


----------



## Monstieur

Malinkadink said:


> As for ULMB that's a useless feature to me personally as strobing means no variable refresh + flickering which doesn't help your eyes any. Also do you really care if your monitor has 3D Vision? 3D is dead.


ULMB is useless to me, but I use 3D Vision all the time for movies and any game that supports it. Tomb Raider, GTA V, Max Payne 3, etc. all support it natively and it's glorious. On the PG258Q, the panel has a native brightness of 600 nits and it's a super bright 300 nits in 3D mode.


----------



## Ferreal

Monstieur said:


> Same. No monitor has been able to outperform my PG278Q - it's even more colour accurate than IPS panels. I bought the PG258Q - 240 Hz is visibly smoother, but the pixel density is crap on 1080p 24" and the colours are also crap. I bought the PG27VQ because it has a HDMI port - it's the same performance wise but the colours are crap again. The PG279Q looks much better colour wise but the input lag and blurring on IPS is visibly inferior to the TN panel on the PG278Q.
> 
> These new 4K 144 Hz IPS panels will probably have the same input lag and blurring as the PG279Q. It may not be able to completely replace the PG278Q. The IPS models don't have ULMB and 3D Vision 2 either.


You are right about blurring on an IPS panel. I tried the PG279Q and could not stand it so I had to send it back. I'm hoping the higher resolution and HDR on these new panels is enough for me to ignore the blurring.


----------



## Monstieur

Ferreal said:


> You are right about blurring on an IPS panel. I tried the PG279Q and could not stand it so I had to send it back. I'm hoping the higher resolution and HDR on these new panels is enough for me to ignore the blurring.


I'm conflicted between getting a single PG27UQ for everything, and sticking to a TN PG27VQ for response time and 3D Vision (because my PG278Q has dimmed and developed dead pixels over 4 years) and getting an LG OLED for HDR gaming / movies / PS4.


----------



## Glerox

First PG27UQ pricing available in Canada:

http://www.canadacomputers.com/search_results.php?search_in=&keywords=pg27uq

I pre-ordered the X27, but I still hesitate between the two. We have more details on the Asus model and it seems more polished than the X27.
However, I prefer how the X27 looks...

Tough choices... first world problems

Edit : (Considering I CAN'T wait for reviews)


----------



## boredgunner

Malinkadink said:


> As for ULMB that's a useless feature to me personally as strobing means no variable refresh + flickering which doesn't help your eyes any.


Who needs variable refresh rate when your refresh rate is maxed out and the FPS is the same or higher? That's when strobing becomes the best option. But obviously we lack the GPU horsepower for this at 4k.


----------



## Malinkadink

boredgunner said:


> Who needs variable refresh rate when your refresh rate is maxed out and the FPS is the same or higher? That's when strobing becomes the best option. But obviously we lack the GPU horsepower for this at 4k.


It can still cause eye fatigue/strain, and it does add a frame of input lag.


----------



## MistaSparkul

boredgunner said:


> Who needs variable refresh rate when your refresh rate is maxed out and the FPS is the same or higher? That's when strobing becomes the best option. But obviously we lack the GPU horsepower for this at 4k.


Not even 4k. It can be impossible to maintain 120fps even at 1080p depending on the game. Try to never dip below 120fps in AC Origins or Far Cry 5. Good luck with that.


----------



## KGPrime

This would be a clear time for restraint, as after this there will be more (hopefully decently executed) options: non-FALD high-refresh 4K, micro/mini/nano LEDs, and the great divider, 240Hz 1440p TN.


----------



## toncij

Ferreal said:


> You are right about blurring on an IPS panel. I tried the PG279Q and could not stand it so I had to send it back. I'm hoping the higher resolution and HDR on these new panels is enough for me to ignore the blurring.


I don't see any blurring. Neither does TFTCentral; the difference is insignificant, as you can see in the attached pic. The forum does not allow attachments so check here: https://imgur.com/a/WthLoog


I had both and can confirm TFTCentral's findings. Also, while color accuracy is extremely good for a TN (on par with ****ty IPSes) and horizontal angles are good, the vertical viewing angles are horrible, and for all practical purposes you can't use it for any text, web or whatever.


----------



## Monstieur

toncij said:


> I don't see any blurring. Neither does TFTCentral; the difference is insignificant, as you can see in the attached pic. The forum does not allow attachments so check here: https://imgur.com/a/WthLoog
> 
> 
> I had both and can confirm TFTCentral's findings. Also, while color accuracy is extremely good for a TN (on par with ****ty IPSes) and horizontal angles are good, the vertical viewing angles are horrible, and for all practical purposes you can't use it for any text, web or whatever.


The "Average Lag Comparison" chart is incomplete and only a half truth. The minimum response time of the PG279Q is 4x as long as the PG278Q and the average response time is almost 2x as long. See the attached image. If you went straight to the PG279Q it would seem very smooth. But if you're coming from the PG278Q you'd instantly see the lag.

If the PG279Q was even comparable to the PG278Q, it would have 3D Vision 2 certification like the latter. It doesn't because it's too blurry to display alternating frames for each eye that have completely different images that shouldn't interfere. There is not a single high refresh rate G-SYNC IPS panel with 3D Vision 2 certification, but every single G-SYNC TN panel has it.


----------



## Clukos

Into the trash it goes: https://www.youtube.com/watch?v=8ZCGRjb3fQc

The wait for non-IPS HDR PC monitors begins...


----------



## Monstieur

I expected blooming but not comet trails... I just ordered an LG B7T.

The PA32UC-K looks much worse, though. It also has a 384-zone FALD but none of the G-SYNC HDR optimizations. I wonder if the local dimming implementation differs between Acer and Asus.


----------



## MistaSparkul

The Acer thread is about to be flooded with blooming complaints in a few days with this one to follow in the coming weeks.


----------



## KGPrime

Le poop.


----------



## inedenimadam

Yikes.

Too bad for the guys that were waiting with wallet in hand.


----------



## Exilon

Monstieur said:


> The "Average Lag Comparison" chart is incomplete and only a half truth. The minimum response time of the PG279Q is 4x as long as the PG278Q and the average response time is almost 2x as long. See the attached image. If you went straight to the PG279Q it would seem very smooth. But if you're coming from the PG278Q you'd instantly see the lag.
> 
> If the PG279Q was even comparable to the PG278Q, it would have 3D Vision 2 certification like the latter. It doesn't because it's too blurry to display alternating frames for each eye that have completely different images that shouldn't interfere. There is not a single high refresh rate G-SYNC IPS panel with 3D Vision 2 certification, but every single G-SYNC TN panel has it.


This is also a half-truth. Blur from persistence is just as strong an observed effect, if not stronger. A 4x longer (but still very small) pixel response will not translate to 4x as much blur at 144Hz, because persistence blur dominates at those ranges.

Yes, the IPS panel is worse at strobing, but that doesn't translate 1:1 to motion blur in non-strobing modes.

The IPS panel has better color stability and better gamma stability between the top and bottom of the screen due to viewing angles, while the TN panel crushes the top and blows out the bottom. For those that don't care about those, then by all means go TN.
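The persistence-dominates point can be sketched with rough numbers. A minimal sketch: on a sample-and-hold panel the perceived smear is roughly pixel speed times (frame hold time + pixel response); the 960 px/s pan speed and the 1 ms / 4 ms response figures below are illustrative assumptions, not measurements:

```python
# Sketch of why a 4x pixel-response difference does not mean 4x the blur
# at 144 Hz: one full frame of sample-and-hold persistence smear is added
# on top of the pixel-response smear, and it dominates.
# All numbers are illustrative assumptions, not measurements.

def blur_px(speed_px_per_s: float, refresh_hz: float, response_ms: float) -> float:
    """Rough perceived smear in pixels for a moving object."""
    persistence_ms = 1000.0 / refresh_hz  # hold time per frame at full persistence
    return speed_px_per_s * (persistence_ms + response_ms) / 1000.0

speed = 960.0  # px/s, a common pursuit-camera pan speed
tn = blur_px(speed, 144, 1.0)   # assumed ~1 ms effective TN response
ips = blur_px(speed, 144, 4.0)  # assumed ~4 ms effective IPS response
print(f"TN:  ~{tn:.1f} px of smear")
print(f"IPS: ~{ips:.1f} px of smear ({ips / tn:.2f}x, not 4x)")
```

Under these assumptions the 4x response-time gap works out to well under 1.5x the total smear, since the ~6.9 ms of 144 Hz hold time swamps both response figures.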


----------



## Monstieur

Exilon said:


> This is also a half-truth. Blur from persistence is just as strong if not a stronger observed effect. 4x longer for a very small pixel response will not translate to 4x as much blur at 144Hz because persistence blur dominates at those ranges.
> 
> The IPS panel has better color stability and better gamma stability between the top and bottom the screen due to viewing angles while the TN panel crushes the top and blows out the bottom. For those that don't care about those, then by all means go TN.


Both monitors have 100% persistence with ULMB off, and I prefer the blurred look of ULMB off even on TN. ULMB makes the movement too discrete and hurts my eyes.

What I don't like about IPS is not the blur itself, but the colour darkening / trails that occur when you pan the camera in-game at just the right speed. Multiple colours are being smeared together because the image has moved before the last pixel transition has even completed. This does not happen on TN or OLED even with 100% persistence. I also know this effect will become invisible to your eyes in a few days. I had a Dell U2713H which literally inverted colours to their negative when scrolling due to poorly implemented overdrive - the entire browser window would turn from white to black. I could not see this extreme defect after a few days of using it!


----------



## Seyumi

Clukos said:


> https://www.youtube.com/watch?v=8ZCGRjb3fQc
> 
> Into the trash it goes
> 
> 
> The wait for non-IPS HDR PC monitors begins...


Yikes that's unacceptable but expected with all the FALD previews as of late. I play a lot of dark games so these monitors are definitely off my wish-list. Oh well at least I saved an arm & a kidney.


----------



## l88bastar

Clukos said:


> https://www.youtube.com/watch?v=8ZCGRjb3fQc
> 
> Into the trash it goes
> 
> The wait for non-IPS HDR PC monitors begins...





Looking at an IPS with FALD from the sides accentuates the bloom. You gotta look at an IPS FALD head on.


----------



## Clukos

l88bastar said:


> Looking at an IPS with FALD from the sides accentuates the bloom. You gotta look at an IPS FALD head on.


Yeah I guess that's true, I'm more worried about the FALD lag in that video tbh.


----------



## l88bastar

Clukos said:


> Yeah I guess that's true, I'm more worried about the FALD lag in that video tbh.


----------



## Aristotelian

I'm not saying that haloing or whatever is acceptable, but the very video being linked in this thread which apparently makes this monitor garbage states this in its description:

"Please note that the halo-effect seen in this video is (almost) not visible when the monitor is viewed directly head-on." They filmed it at an angle to amplify the effect.

Either way, I have no horse in this race - seems there's a whole movement out there trying to trash this monitor before it is even thoroughly reviewed, all under the guise of "too small and too expensive anyways".


----------



## profundido

Monstieur said:


> The "Average Lag Comparison" chart is incomplete and only a half truth. The minimum response time of the PG279Q is 4x as long as the PG278Q and the average response time is almost 2x as long. See the attached image. If you went straight to the PG279Q it would seem very smooth. But if you're coming from the PG278Q you'd instantly see the lag.
> 
> If the PG279Q was even comparable to the PG278Q, it would have 3D Vision 2 certification like the latter. It doesn't because it's too blurry to display alternating frames for each eye that have completely different images that shouldn't interfere. There is not a single high refresh rate G-SYNC IPS panel with 3D Vision 2 certification, but every single G-SYNC TN panel has it.



I own the Asus PG27AQ as well as the Viewsonic version of the PG279Q at the location where I live during the week, and on weekends I come home to another place where I have the PG278Q. I have no real delay complaints during the week on either IPS/VA monitor and find everything workable/playable, but I swear every single time I launch any game on the weekend on the PG278Q I get this "WOW" feeling related to the responsiveness.

Somehow that monitor feels as if it's one with my hand on the mouse, my arm and my eyes, because every action my brain decides upon (strafing, tilting, turning) immediately translates into my eyes seeing the result of that command. ZERO DELAY in my subjective feeling, totally fluent. Whenever I go back to my other monitors I feel the delay for the first 5 minutes and then I stop paying attention to it.

There is a very real and noticeable difference. I'm not savvy enough to know what it is about the screen (1ms vs 4ms input lag, pixel response, electronics processing time, ...) but the fact is that it's the screen for sure, as the PC is equal in both places.


----------



## profundido

Aristotelian said:


> I'm not saying that haloing or whatever is acceptable, but the very video being linked in this thread which apparently makes this monitor garbage states this in its description:
> 
> "Please note that the halo-effect seen in this video is (almost) not visible when the monitor is viewed directly head-on." They filmed it at an angle to amplify the effect.
> 
> Either way, I have no horse in this race - seems there's a whole movement out there trying to trash this monitor before it is even thoroughly reviewed, all under the guise of "too small and too expensive anyways".


Totally agreed. Things are being pulled out of context and proportion in some sort of blind witch hunt with the only goal of burning these brand new pieces of technology at the stake. I'm not a fan of these 2 new monitors yet, but I await proper reviews with great interest, and if they turn out to be good I might actually just get myself one. Apart from all the HDR hype I'm mainly interested in the combination of 4K@144Hz. Right now I have both features in front of me in the form of 2 separate monitors: one can do 4K@60Hz and the other 1440p@165Hz, and I want those 2 gorgeous features together in 1 display lol


----------



## Glerox

The PG27UQ is now also available for pre-order on Newegg Canada. It is $200 less expensive than the Predator X27 in Canada, so it's a no-brainer between the two unless you really prefer the Acer.


----------



## toncij

Monstieur said:


> The "Average Lag Comparison" chart is incomplete and only a half truth. The minimum response time of the PG279Q is 4x as long as the PG278Q and the average response time is almost 2x as long. See the attached image. If you went straight to the PG279Q it would seem very smooth. But if you're coming from the PG278Q you'd instantly see the lag.


I don't see where in those numbers you see a 4x difference. It's 17% average or 47% worst case, and that's not even close to 400%.



Seyumi said:


> Yikes that's unacceptable but expected with all the FALD previews as of late. I play a lot of dark games so these monitors are definitely off my wish-list. Oh well at least I saved an arm & a kidney.


That's an extreme angle nobody will ever look at it from.


----------



## KGPrime

Aristotelian said:


> I'm not saying that haloing or whatever is acceptable, but the very video being linked in this thread which apparently makes this monitor garbage states this in its description:
> 
> "Please note that the halo-effect seen in this video is (almost) not visible when the monitor is viewed directly head-on." They filmed it at an angle to amplify the effect.
> 
> Either way, I have no horse in this race - seems there's a whole movement out there trying to trash this monitor before it is even thoroughly reviewed, all under the guise of "too small and too expensive anyways".



Sure. But there are things we can already deduce from concerns we thought of long ago when it was announced, and that people have already mentioned. Now, seeing it and the other FALD releases like it in all the various videos, not just this one, we can see they are likely true.

While it certainly looks better in real life and from head-on, the haloing, particularly in this video, is enhanced by IPS glow. Meaning at its core it is just like any other IPS panel, which all suck to varying degrees.

FALD will help reduce IPS glow across the entire screen during HDR content, but on the desktop it will behave just like any regular IPS.

If you could or can use FALD on the desktop, which is almost certainly not recommended, it would be terrible in certain scenarios, like if your desktop and entire Windows UI is dark or black, for instance. The glow from desktop icons or your mouse cursor etc. would be even more annoying, particularly off-angle, like things in the corner of the screen. Or, as in games, a light source off in the corner of the screen will have more of a glowing halo, as it sits at more of an angle to your eye, just like the corner glow of current IPS.

You can of course, as always, push it back farther to alleviate it somewhat. But then, it's 4K at 27" already; you have to be able to read on it, or possibly use unrealistic scaling. I personally like my monitor close.

So at best it is a regular IPS monitor when not gaming, and a $2k+ one at that. I have no problem paying $2,500 for a top-of-the-line monitor; that's how much some top-of-the-line monitors have cost forever. But if it's an LCD, and an IPS at that, it has to be better than what this is 100% of the time. Not just a wow factor for HDR content in gaming, which I care 0 about at this time.

For those interested in all that, there's not a lot of HDR content yet, but there surely will be, and it might be worth it to them right now to play Final Fantasy on or whatever. But I just don't care about the HDR part yet, nor the 4K part either.

The 144Hz part I have, the low glow I have, the G-Sync I have, the sufficiently high DPI I have, and it cost 400 dollars. The color isn't as good, but for 400 bucks it's more than tolerable, and that's my reasoning.



So its main thing really is just 4K at 144Hz with the option of HDR, and high color, all of which are only usable _some of the time_.

As such, it doesn't offer much for the money. It's a gaming monitor, and marketed as such, sure. But a single monitor just for gaming at that cost? Well, there just aren't enough games worth spending $2k plus 60 dollars to play, imo.


We could also save TFT Central the trouble.

The pixel response time and input lag will be at best the same as the previous 144Hz IPS panels, or worse with FALD on, perhaps like an Acer X34 21:9 etc. Which, really, for what it is, isn't terrible. I'm also interested in how Prad or TFT Central is preparing to test these things.

It will have less backlight bleed and presumably better uniformity. Which is great, and one of the things I actually DO care about, but IPS glow will remain more or less under various conditions.

I've been reading TFT Central reviews since around the time they started, 12 or 13 years or whatever. Anymore I go straight to the backlight bleed and pixel response time charts, as everything else is basically the same review year after year, and generally, so are those two things for that matter. I also particularly enjoy pcmonitor.info's legendary videos on glow, and his subjective impressions as well.

The color range is beyond most people's needs, and most would want to run it in sRGB mode 90% of the time, unless again its sole purpose is just for playing HDR games.

On the subject of FALD: it would have been a good idea when LCDs first came out 15 years ago. Today it almost seems archaic and rudimentary. Too expensive to implement and not worth the payoff. Ideally there would be about 4x as many zones, or 1 LED per pixel for that matter, which is unrealistic. As such they should be on to better ideas by now, but they had been slacking off for over a decade until recently, and now they are finally trying to do different things with mini LEDs and quantum dots etc. and all that.

I also believe it's very possible that, with all the time and delays and money and R&D put into this thing, it will be a one-time release. A novelty. Just like I said about the 3k dollar Dell OLED that was announced, then became vaporware for years, then all of a sudden was for sale, and then was forgotten, and no one's heard a single thing about anything like it or plans for a new version since. We are stuck with backlit displays on the desktop for the foreseeable future. OLED is not coming to the desktop for the masses any time soon, if ever.

They also likely have to sell an ass load of these to recoup all that money, and at 2k+ a pop, that's a huge gamble. So they very likely could be a novelty. And in the near future after these are released, there will be cheaper non-FALD options, and newer tech panels will be coming.

Simply put, I wouldn't pay 1000 dollars for it, let alone 2k+, because it's not really worth it (to me) for what it offers. It'd be a pain as a single monitor, switching back and forth between different settings, etc., so I don't see it as a great candidate for an all-rounder type monitor.
I'd rather buy an 800 dollar PG279Q or the Acer equivalent, to be honest, and even then I chose not to buy one of those either.

I however reserve the right to be a totally unapologetic hypocrite and buy one if they somehow pull off a miracle.  I just don't see it happening. I absolutely hope those who buy them get their money's worth of enjoyment out of them though.


----------



## JackCY

boredgunner said:


> Who needs variable refresh rate when your refresh rate is maxed out and the FPS is the same or higher? That's when strobing becomes the best option. But obviously we lack the GPU horsepower for this at 4k.


Nor is there the CPU performance, even for some older games, because game engines often CPU bottleneck around 100-200fps.



Clukos said:


> https://www.youtube.com/watch?v=8ZCGRjb3fQc
> 
> Into the trash it goes
> 
> 
> The wait for non-IPS HDR PC monitors begins...


Hard to tell anything from a video that is low fps, shot at an angle with light reflections on the monitor, and of a game/scene that doesn't even have any big pure black areas where the backlight could actually turn off.

Most of the time local dimming on monitors lags, and IMHO this is probably caused by processing the backlight control separately on some cheap low-performance processor: while the panel can run <5ms input lag, the backlight is more around 50-150ms input lag... often coupled with a fade in/out. And that may be the real problem: even with minimal lag, an added 100ms fade in/out is too long for multimedia.

It doesn't have anything to do with IPS. IPS is still the king of LCDs, except manufacturers often bork it in ways that are not the fault of IPS technology: cost cutting, low QC, poor backlight, poor electronics, ...

Take an IPS 144Hz with an ATW polarizer and proper local dimming made of direct LEDs... good luck telling it apart from OLED/VA TVs even at an angle. LEDs have been around for ages, but endless cost cutting has always made it cheaper to use a simple LED strip with a diffuser than to pay more for direct LEDs that still need at least some diffuser. But this mini, micro, nano LED? Come on manufacturers, it's been around for ages.


----------



## bee144

Linus gives a hands on impression^

He said G-Sync HDR only works with newer 1080 Ti cards and the Titan V. Does that mean if you have an older 1080 Ti, Titan X (Pascal), Titan Xp, or lower, you won't be able to fully use the monitor?


----------



## CaliLife17

He didn't say it wouldn't work; he mentioned that older 1080 Ti cards will need firmware flashed via a tool to be released by Nvidia. Titan X (Pascal) = 1080 Ti = Titan Xp in regards to architecture. They probably just need a firmware update for the DP or HDMI controller on the card to work with the newer G-Sync module.


----------



## toncij

My only hurdle is the size. 27"... can't integer scale to anything meaningful.


----------



## Tonza

Costs "only" 2499 euros here in Finland; absolutely ridiculous price for such a tiny monitor. This better have god-like quality control without any panel defects on any sample (highly doubt it tho).


----------



## JackCY

toncij said:


> My only hurdle is the size. 27"... can't integer scale to anything meaningful.


1.0x works.



Tonza said:


> Costs only 2499 euros in here Finland, absolutely ridiculous price for such tiny monitor. This better have god like quality control without any panel defects on any sample (highly doubt it tho).


Just get a 4k OLED TV at these prices and a TN strobed 144-240-...Hz monitor 1080p/1440p.

If these 27" 4k 120Hz monitors were OLED, one could understand their price but considering they are IPS and probably even without the cheap ATW polarizer to remove glow? A massive rip off.


----------



## Lord_Bender

MistaSparkul said:


> The Acer thread is about to be flooded with blooming complaints in a few days with this one to follow in the coming weeks.


That's not even the worst part: Acer didn't get HDR certification due to a firmware issue. All the ones that sold are not HDR certified, and I don't see how they can change that unless you take it to a service center.


----------



## HyperMatrix

bee144 said:


> https://youtu.be/u-7IXNLeY3c
> 
> Linus gives a hands on impression^
> 
> He said G-Sync HDR only works with newer 1080 Ti cards and the Titan V. Does that mean if you have an older 1080 Ti, Titan X (Pascal), Titan Xp, or lower, you won't be able to fully use the monitor?






Linus is an incredibly annoying tool. And he's a marketer, not a reviewer. Don't look for real honest answers from him. He really isn't well informed when it comes to computer tech. It's all prepared speeches.

On a side note, regarding all the hate on the FALD system and blaming it on this being an IPS display: that's all hogwash. The FALD system is very responsive. But it suffers from 2 things:

1) Low resolution. A 384-zone grid is really quite low. That means each LED is responsible for roughly a 2cm x 2cm square. So if there happens to be a bright object as well as a dark object in that square, the FALD system has no option but to light up, making the blacks around it less black. This is actually a very big limitation, and that's why AU Optronics is already moving on to mini LED grids for future monitors, allowing an increase of 3-4x the current FALD resolution, which would shrink each zone from a 2cm x 2cm square to just 0.5-0.7cm x 0.5-0.7cm. That will be a HUGE improvement. But still not enough to allow OLED-level contrast. And that's why they already have micro LED grids on their roadmap, which will allow the zone size to go down to approximately 0.1 to 0.15cm. So basically:

Current generation FALD: 400 mm²
Mini LED FALD: 25-50 mm²
Micro LED FALD: 1-3 mm² <--- This is where we get OLED-ish quality IPS contrast
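For what it's worth, the zone arithmetic above can be sanity-checked with a quick sketch. Only the 27" diagonal and the 384-zone count come from the post; the 24x16 grid layout and the denser hypothetical grid are assumptions for illustration.

```python
import math

# Hedged sketch: per-zone backlight size on a 16:9 panel.
# The 24x16 layout for the 384-zone grid and the denser grid below
# are illustrative assumptions, not confirmed specs.
def zone_size_mm(diagonal_in, cols, rows, aspect=(16, 9)):
    """Width and height (mm) of one backlight zone on a uniform grid."""
    aw, ah = aspect
    diag_mm = diagonal_in * 25.4
    panel_w = diag_mm * aw / math.hypot(aw, ah)  # panel width from diagonal
    panel_h = diag_mm * ah / math.hypot(aw, ah)  # panel height from diagonal
    return panel_w / cols, panel_h / rows

for label, cols, rows in [
    ("384-zone FALD, 24x16 grid", 24, 16),
    ("hypothetical denser grid, 96x54", 96, 54),
]:
    w, h = zone_size_mm(27, cols, rows)
    print(f"{label}: {w:.1f} x {h:.1f} mm (~{w * h:.0f} mm^2 per zone)")
```

The 384-zone grid works out to roughly 25 x 21 mm per zone, the same ballpark as the ~2cm squares described above; hitting the quoted 25-50 mm² range would take on the order of a few thousand zones.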

2) Anti-reflective coating. This is the real culprit. Anti-reflective coatings on monitors work by diffusing the light that hits the monitor, to prevent an overly bright spot that outshines the brightness of the monitor, creating a reflection. The problem is that the same thing happens in reverse: when a bright light is emitted from the monitor, it also gets caught up in the light diffusion layer, which causes it to spread out. This won't be as noticeable when viewing head on. But when viewed from any other angle, you're increasing your exposure to the coating while reducing your exposure to the IPS panel itself, which makes the light diffusion far more visible.

Now I'm not sure what other coatings they are using on these displays. With Samsung TVs they use what I would best describe as a sort of tint layer over top of the panel, which prevents low-level light from shining through, allowing for the appearance of very deep and rich blacks. Samsung also uses a glossy panel. Because if you have a panel that is bright enough, a glossy tinted black display is going to be your best option. One way to reduce the perceived FALD glow on these monitors will be to remove the anti-glare/anti-reflective coating. I'm sure guides will come out for that later.

As for all the people wishing this were OLED, there's actually a very good reason why it's not. I'm about to ruffle some feathers here. But this needs to be said. OLED is inferior technology. It has limited room for growth. It's like Plasma: it was really nice, but it had limitations, couldn't evolve, and it will die out. OLED as a gaming display is actually a terrible idea. I can sense the keyboard warriors getting amped up and ready to hit that reply button to cuss me out. But I will soldier on. There are a few problems with OLED that can't be resolved:

1) Brightness. Because each pixel creates its own light, it's limited in terms of how much light it can output. So while it can maintain a perfect contrast ratio, its maximum brightness will be limited. And yes...I'm going to get a lot of would-be videophiles saying that you don't really need that much maximum brightness because the brightness of OLED is more than sufficient in a light controlled environment. Which is technically true. If you're going to set up a theater room where it's pitch black, then yes. OLED will truly shine. And because the black pixels will literally be black, and your pupils are dilated due to the near pitch black room, the current level of OLED brightness can be bright enough to burn your retina. Fun times. For anyone who's not playing in a pitch black room, however, this is less than ideal. MicroLED is the future. Not OLED. Whether we're talking about a full MicroLED panel, or simply a MicroLED backlight system behind a standard IPS type display, both can be superior to OLED. High resolution MicroLED FALD behind a high resolution IPS, behind a Quantum Dot layer, behind additional tint/blackening layers, and you can end up with a TV that has the exact same perceivable contrast level as OLED, with colors that are just as rich, and brightness that far surpasses it. This is fact.

2) Longevity. This point is short and sweet. Like Plasma and any technology that operates similarly, OLED's emissive elements deteriorate rapidly. There's nothing that can be done about this, so the display will lose brightness and color over time. I don't really need to go into any detail on this.

3) Burn-in. It's a very real problem, and became a "huge" problem from the time Plasma came out. And don't get me wrong...I love Plasma. I bought nothing but Plasma displays back then. But it meant that if I played Dragon Age for a few hours, the minimap and other static items would burn in. Modern Plasmas found ways of reducing the burn-in, and over time nearly eliminating it, but those mitigations were still problematic and only somewhat effective. So if you plan to play games which have static objects on the screen (crosshair, minimap, chatbox, general HUD items), or use your desktop with the taskbar, or anything really, you're going to have to be super careful. And one mistake is all it'll take to ruin the display.

LCD displays aren't just LCD panels. They're an advanced multi-component system now, and that's one of the flexibilities of LCD displays: you can put anything behind or in front of the panel to get the desired results. If you don't believe me, go look at one of the new Samsung QLED TVs and compare it to the LG OLED. They're very close in terms of contrast/blackness. And once higher resolution FALD with mini/micro LED becomes a reality, OLED won't stand a chance, simply because OLED would have nothing to offer. Except...believe it or not...cost effectiveness. LCD panels will surpass OLED quality in every way, but they will also become more expensive. So OLED will be the cheaper solution, and really the only solution for any type of mobile platform, whether we're talking about phones, tablets, laptops, etc. But for true high-end stationary display systems, LCD will be better.

Alright this got long enough, I think.


----------



## ttnuagmada

l88bastar said:


> Looking at an IPS with FALD from the sides accentuates the bloom. You gotta look at an IPS FALD head on.


That's VA that does that. An IPS shouldn't bloom any more from an angle than from the front.


----------



## Nidogodva

I don't know if it's been mentioned but newegg now has the release date for July 13th.


----------



## Glerox

Nidogodva said:


> I don't know if it's been mentioned but newegg now has the release date for July 13th.


I think it's maybe because the first wave for June was sold out and they are taking preorders for the second wave.

I created a thread for future PG27UQ owners :

http://www.overclock.net/forum/44-monitors-displays/1700366-asus-pg27uq-owners.html#post27486924


We can either use it or rename this thread.


----------



## subtec

HyperMatrix said:


> Now I'm not sure about what other coatings they are using on these displays. With Samsung TVs they use what I would best describe as a sort of tint layer over top of the panel, which prevents low level lights from shining through, allowing for the appearance of very deep and rich blacks. Samsung also uses glossy panel. Because if you have a panel that is bright enough, a glossy tinted black display is going to be your best option. One way to reduce the perceived FALD glow on these monitors, will be to remove the anti-glare/anti-reflective coating. I'm sure guides will come out for that later.
> 
> As for all the people wishing this were OLED, there's actually a very good reason why it's not. I'm about to ruffle some feathers here. But this needs to be said. OLED is inferior technology. It has limited room for growth. It's like Plasma. It was really nice. But had limitations. And couldn't evolve. And it will die out. OLED as a gaming display is actually a terrible idea. I can sense the keyboard warriors getting amped up and ready to hit that reply button to cuss me out. But I will soldier on. There are a few problems with OLED that can't be resolved:
> 
> 1) Brightness. Because each pixel creates its own light, it's limited in terms of how much light it can output. So while it can maintain a perfect contrast ratio, its maximum brightness will be limited. And yes...I'm going to get a lot of would-be videophiles saying that you don't really need that much maximum brightness because the brightness of OLED is more than sufficient in a light controlled environment. Which is technically true. If you're going to set up a theater room where it's pitch black, then yes. OLED will truly shine. And because the black pixels will literally be black, and your pupils are dilated due to the near pitch black room, the current level of OLED brightness can be bright enough to burn your retina. Fun times. For anyone who's not playing in a pitch black room, however, this is less than ideal. MicroLED is the future. Not OLED. Whether we're talking about a full MicroLED panel, or simply a MicroLED backlight system behind a standard IPS type display, both can be superior to OLED. High resolution MicroLED FALD behind a high resolution IPS, behind a Quantum Dot layer, behind additional tint/blackening layers, and you can end up with a TV that has the exact same perceivable contrast level as OLED, with colors that are just as rich, and brightness that far surpasses it. This is fact.
> 
> 2) Longevity. This point is short and sweet. Like any technology which operates similarly to Plasma and OLEDs, they deteriorate rapidly. There's nothing that can be done about this. So that display will lose brightness and color over time. I don't really need to go into any detail on this.
> 
> 3) Burn-in. It's a very real problem. Became a "huge" problem from the time Plasma came out. And don't get me wrong...I love Plasma. Bought nothing but Plasma displays back then. But it means if I played Dragon Age for a few hours, the minimap display and other static items would burn in. And modern Plasmas had found ways of reducing the burn in, and then over time eliminating it. But it was still problematic, and only somewhat effective. So if you plan to play games which have static objects on the screen (crosshair, minimap, chatbox, general HUD items), or on your desktop with the taskbar, or anything really, you're really going to have to be super careful. And once mistake is all it'll take to ruin the display.


All three of these issues with OLEDs can be mitigated, to the point where they could well become non-issues and on par with average product lifespan in the industry. All of them come down to producing light: OLED doesn't produce enough, and driving it harder to produce more directly accelerates ageing and burn-in.

Consider the way current white OLED (WOLED) displays (as used by LG in their TVs) operate: the OLED elements produce a uniform white by mixing blue and yellow. From that white light, the primary red, green, and blue colors are isolated by filtering out the rest of the spectrum using color filters at the subpixel level. That filtered light is what you see when viewing an LG WOLED TV.

The problem? To show something like a bright, fully saturated red, the color filter is essentially blocking two-thirds of the light energy that's actually being emitted by the OLED to get that single pure color. That means the WOLED is being driven a lot harder to get to that color than if it were directly emitting each color, as in an RGB OLED.

Unfortunately, RGB OLED is still very expensive to produce compared to WOLED, and the lifetimes of the red, green, and blue OLED materials vary, with blue being the most unstable and shortest-lived. There are solutions being worked on for improving lifespan, but at the moment it's still an issue. Likewise cost, and there we see things like inkjet printed OLED in development. So we may yet see the lifespan problems solved (or sufficiently addressed) through these developments.

But back to the only current, (relatively) cheap way of making an OLED display: WOLED with color filters. It's clear that throwing out 70%+ of the light to produce bright, pure colors is problematic, but is there anything that can be done about it?

Enter quantum dots. Yeah, QDs are a buzzword all over LCDs, but that's only one limited application of the technology (as a backlight film to enhance color purity). The real exciting breakthrough comes with quantum dot color filters (QDCF), which replace the color filters used on an LCD or OLED. What's so special about that? Well, quantum dots have this neat ability to take a photon of one wavelength, and spit out a photon of another, specific, wavelength. So they can take blue light, for example, from a backlight or OLED element, and turn it into red, green, or blue - just like a normal color filter. The difference is, they can do it with _near 100% efficiency_.

I shouldn't need to tell you what that means, but just to make it clear: with quantum dot color filters on an OLED display, the OLED only needs to be driven a fraction as hard to produce the same visible light (for bright, saturated primary colors - typically the biggest burn-in offenders in channel logos etc). As a side benefit, you get purer colors from QDs than from traditional color-filtered WOLED.
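To put rough numbers on that drive-level argument: the ~1/3 filter transmission and near-100% QD conversion come from the post above, but the target brightness and exact efficiencies in this sketch are made-up illustrative values.

```python
# Hedged sketch: how hard a white OLED must be driven to push a saturated
# primary through a conventional color filter vs. a quantum dot color filter.
# All specific numbers here are illustrative assumptions.
def required_drive(target_nits, filter_efficiency):
    """Light the OLED element must emit so target_nits survives the filter."""
    return target_nits / filter_efficiency

target = 150.0                         # assumed brightness for a pure-red patch
woled = required_drive(target, 1 / 3)  # filter discards ~2/3 of the spectrum
qdcf = required_drive(target, 0.95)    # QDs re-emit nearly every photon as red

print(f"WOLED + color filter: drive equivalent to {woled:.0f} nits emitted")
print(f"WOLED + QDCF:         drive equivalent to {qdcf:.0f} nits emitted")
print(f"QDCF lets the OLED run ~{woled / qdcf:.1f}x gentler")
```

Under these assumptions the QDCF panel drives its OLED elements almost 3x more gently for the same on-screen brightness, which is the whole ageing/burn-in argument in one number.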

So there's reason to think OLED's future is brighter (pun intended) than you might think. OLED does have some other advantages over LCD (and probably MicroLED, whenever that finally makes it to consumer displays): namely, it can be made extremely thin and flexible, allowing for things like roll-up and foldable displays, and sheet-thin displays you can put on a wall. Maybe electro-emissive QD will be a contender, but that seems even further away than MicroLED right now. The next decade plus, though, could well see OLED as the dominant technology across most display types.


----------



## HyperMatrix

subtec said:


> All three of these issues with OLEDs can be mitigated, to the point where they could well become non-issues and on par with average product lifespan in the industry. All of these have to do with producing light - not enough, but at the same time producing more will directly accelerate ageing and burn-in.
> 
> Consider the way current white OLED (WOLED) displays (as used by LG in their TVs) operate: the OLED elements product a uniform white by mixing blue and yellow. From that white light, the primary red, green, and blue colors are isolated by filtering out the rest of the spectrum using color filters at the subpixel level. That filtered light is what you see when viewing an LG WOLED TV.
> 
> The problem? To show something like a bright, fully saturated red, the color filter is essentially blocking two-thirds of the light energy that's actually being emitted by the OLED to get that single pure color. That means the WOLED is being driven a lot harder to get to that color than if it were directly emitting each color, as in an RGB OLED.
> 
> Unfortunately, RGB OLED is still very expensive to produce compared to WOLED, and the lifetimes of the red, green, and blue OLED materials vary, with blue being the most unstable and shortest-lived. There are solutions being worked on for improving lifespan, but at the moment it's still an issue. Likewise cost, and there we see things like inkjet printed OLED in development. So we may yet see the lifespan problems solved (or sufficiently addressed) through these developments.
> 
> But back to the only current, (relatively) cheap way of making an OLED display: WOLED with color filters. It's clear that throwing out 70%+ of the light to produce bright, pure colors is problematic, but is there anything that can be done about it?
> 
> Enter quantum dots. Yeah, QDs are a buzzword all over LCDs, but that's only one limited application of the technology (as a backlight film to enhance color purity). The real exciting breakthrough comes with quantum dot color filters (QDCF), which replace the color filters used on an LCD or OLED. What's so special about that? Well, quantum dots have this neat ability to take a photon of one wavelength, and spit out a photon of another, specific, wavelength. So they can take blue light, for example, from a backlight or OLED element, and turn it into red, green, or blue - just like a normal color filter. The difference is, they can do it with _near 100% efficiency_.
> 
> I shouldn't need to tell you what that means, but just to make it clear: with quantum dot color filters on an OLED display, the OLED only needs to be driven a fraction as hard to produce the same visible light (for bright, saturated primary colors - typically the biggest burn-in offenders in channel logos etc). As a side benefit, you get purer colors from QDs than from traditional color-filtered WOLED.
> 
> So there's reason to think OLED's future is brighter (pun intended) than you might think. OLED does have some other advantages over LCD (and probably MicroLED, whenever that finally makes it to consumer displays): namely, it can be made extremely thin and flexible, allowing for things like roll-up and foldable displays, and sheet-thin displays you can put on a wall. Maybe electro-emissive QD will be a contender, but that seems even further away than MicroLED right now. The next decade plus, though, could well see OLED as the dominant technology across most display types.


Yes, the thin form factor is good, and that's why I mentioned that in situations where thickness is important, it'll continue to play a part until replaced by MicroLED. But outside of that, it's not a superior technology to IPS. There are pros and cons. But the pathway to improvement for IPS is a bit more open. And if you throw enough money at the problem, you can actually get an IPS display to outperform OLED in 9 out of 10 metrics, as we're starting to see in some of the new high-end IPS TVs coming out. And this will be further enhanced as mini and micro LED backlight arrays improve. But that's just me.


----------



## toncij

JackCY said:


> 1.0x works.
> 
> 
> 
> Just get a 4k OLED TV at these prices and a TN strobed 144-240-...Hz monitor 1080p/1440p.
> 
> If these 27" 4k 120Hz monitors were OLED, one could understand their price but considering they are IPS and probably even without the cheap ATW polarizer to remove glow? A massive rip off.


1.0x is not usable on 27", it's borderline usable on 32".

And this display is still unavailable in Europe. Not one of the large shops even offers it for preorder. Might be yet another Asus "not really there / we offer 14 pieces this year"...


----------



## rvectors

That was way too civil. I was all ready to shout bun fight, and people go and have a good cogent debate.


Ultimately, at least in the PC monitor market, what tech we get will be whatever is cheapest to build with the greatest yields, whether or not it has some glaring 'defects', because they know we the consumers will continue to throw our money at them regardless.


----------



## kc5vdj

toncij said:


> 1.0x is not usable on 27", it's borderline usable on 32".
> 
> And this display is still unavailable in Europe. Not a single of the large shops even offer it for preorder. Might be yet another Asus "not really there/we offer 14 pieces this year"...


I have to agree with that. I do find, though, that I am having no problem with 1.0x on this LG 43 inch. Granted, my refresh is a fixed 60Hz, but for readability from a pretty standard chair-to-monitor distance, 43 inches seems to me to be a minimum. It still has a tighter dot pitch than the 24 inch TN that I gave to my wife (see the wife's rig below for my old monitor), which makes things a little sharper while staying readable at 1.0x; think quad 21.5 inch FHD monitors for an idea of the dot pitch I have here.

43 inches should be a working minimum for 4K for anyone doing anything other than gaming, IMHO.
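The dot-pitch comparison above checks out; a quick sketch (screen sizes taken from the posts; the formula is just the diagonal pixel count over the diagonal length):

```python
import math

# Pixel density (PPI) for the monitor sizes discussed in this thread.
def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

for name, h, v, d in [
    ('27" 4K (PG27UQ)', 3840, 2160, 27.0),
    ('43" 4K', 3840, 2160, 43.0),
    ('24" FHD', 1920, 1080, 24.0),
    ('21.5" FHD', 1920, 1080, 21.5),
]:
    print(f"{name:>17}: {ppi(h, v, d):.1f} PPI")
```

A 43" 4K panel lands at ~102 PPI, the same density as a 21.5" FHD monitor (so the quad-21.5" comparison is spot on), while 27" 4K is a much tighter ~163 PPI.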


----------



## Glerox

Video showing the dimming zones and backlight responsiveness with PG27UQ. Note that the image is overexposed a lot to emphasize the effect.


----------



## MistaSparkul

My local Microcenter will have them in stock June 25th. I'm gonna try to reserve one; hopefully I get lucky. In the Acer thread some people say the blooming isn't bad at all while others say it is, so I really don't know who to believe. Guess I'll find out for myself.


----------



## kot0005

Blooming is not bad on the fast setting. That IPS bleed on the right tho....


----------



## MistaSparkul

Microcenter went from a June 25th release date to now sold out... ***?


----------



## Sancus

For what it is, they did a good job on the FALD. "Gradual" is what you get on every other FALD ever made, pretty much. This one is the best ever made. It's just hampered by the IPS panel's inherently poor light blocking capability. The 35" VA versions are going to be very interesting if they come out next quarter as they're supposed to.


----------



## CallsignVega

My local Microcenter here in D.C. just got 10+ in stock. Going to pick one up and pit it against the X27.


----------



## tinykitten

It's available for about 2600€ in Europe/Germany now. Not sure if the price is justified to be honest. I'd be fine with 2000€ personally. 
https://geizhals.de/asus-rog-swift-pg27uq-90lm03a0-b01370-a1571346.html


----------



## CallsignVega

Battle Royale:





So after testing the ASUS versus the Acer, it is pretty clear that NVIDIA did basically all of the firmware on these displays. Almost every OSD item is identical, with slightly changed names for some of them. Both panels are pixel perfect and have no flaws. AU Optronics appears to have stepped up their game. Both displays operate in the same manner, have identical picture quality, both come with calibration reports, both have the same bandwidth, both have the same resolution and color settings, both have the same AR film, both have identical motion clarity numbers, etc. Really the only noticeable differences are the monitor case (housing), the fan setup and the stand.

The stand is nicer on the Acer. Look, feel and quality. I also prefer the simpler/more professional look of the Acer housing. 

While I stated in my Acer review that the OSD joystick on the Acer felt kinda flimsy, it is downright terrible on the ASUS. I click up and down in the menu and the joystick gets stuck up or down and I have to manually center it. It also sometimes makes an odd clicking sound. Basically I don't think the joystick is set into its mount properly. Something they should have checked before packaging it up, and which can only be fixed by taking the monitor apart. The OSD on both monitors basically functions the same, and one isn't better than the other.

The last discriminator is the fan. ASUS:






Acer:




The Acer is noticeably quieter. The Acer fan spins slower, is larger, and has a larger heatsink. The ASUS fan is smaller, spins faster, and makes a higher-frequency sound that is more noticeable. The Acer's fan is more of a lower-frequency/moving-air sound.

The Acer stays on my desk and the ASUS is boxed up for return. Please let me know if you have any questions about the differences between the two!


----------



## KGPrime

Odd. The ASUS fan is backwards, plus strangled; it would appear effectively useless.


----------



## lb_felipe

Does that monitor have cooling fan(s) like X27 does?


----------



## l88bastar

WOOOOOooooo

glad I got an X27!


----------



## kx11

interesting !!!


this person claims the Acer X27 has a loud fan !!


----------



## subtec

KGPrime said:


> Odd. The Asus fan is backwards, plus strangled, it would appear effectively useless.


It's a blower fan, so it's pushing air perpendicular to its axis, likely through a heatsink we can't see. Blowers also produce greater static pressure than axial fans, so intake restriction isn't as big of a deal.


----------



## l88bastar

kx11 said:


> interesting !!!
> 
> 
> this person claims Acer x27 got a loud fan !!
> 
> 
> 
> 
> https://www.youtube.com/watch?v=3YCldvmZ6QA


That video and reviewer have already been debunked.

He purposely or ignorantly installed the mounting bracket backwards, stifling the fan's airflow and making it ramp up louder.

Also, you can see he has an Ultra Wide display as his primary. NEVER trust a reviewer who uses an UW!


----------



## Blackvette94

Just got the Asus PG27UQ!

Time to try it out and then remove the Anti Glare filter!

More pics to follow of the taking it apart process and AG removal tonight :0


----------



## KGPrime

I wouldn't bother taking it apart, just tape off the bezel and use a sandblaster.


----------



## saltedham

Can we expect a fan on every monitor of this sort, or can passive heat exhaust be done?


----------



## CallsignVega

As connectivity bandwidth and processing speeds increase, and FALD / micro LEDs become more common, I'd say fans will be more common too. There are very real limits to what passive cooling can handle, especially in the small space at the back of a monitor housing.


----------



## Blackvette94

KGPrime said:


> I wouldn't bother taking it apart, just tape off the bezel and use a sandblaster.


I know you're kidding...


I have taken apart dozens of monitors and tvs to remove ag filters, so no worries here.


----------



## kot0005

saltedham said:


> can we expect a fan on every monitor of these sorts, or can passive heat exhausting be done.


I think they will be using fans in these for a few generations, or until Nvidia can make a better G-Sync chip.


----------



## l88bastar

Blackvette94 said:


> I know your kidding...
> 
> 
> I have taken apart dozens of monitors and tvs to remove ag filters, so no worries here.


via Imgflip Meme Generator


----------



## KGPrime

KGPrime said:


> I wouldn't bother taking it apart, just tape off the bezel and use a sandblaster.


Yeah, I'm just joking. I would want to do it too, depending on how matte it is. Actually excited to see it.


----------



## kx11

more interesting news


color compression @ 120/144hz 



http://www.guru3d.com/news-story/as...ed-to-use-color-compression-at-120144-hz.html


----------



## CallsignVega

Whoever wrote that story is an idiot. There is no "color compression" at 120 Hz. 

"This also means you pretty much need to run your Windows desktop at 60 Hz for a bit of a quality readable view. "

LOL What is this fool talking about?


----------



## sblantipodi

tinykitten said:


> It's available for about 2600€ in Europe/Germany now. Not sure if the price is justified to be honest. I'd be fine with 2000€ personally.
> https://geizhals.de/asus-rog-swift-pg27uq-90lm03a0-b01370-a1571346.html


How can it be justified?
A monitor that costs more than €1,000 should have 3D LUT and hardware calibration in its specs.

These are gaming toys sold as professional ones.


----------



## sblantipodi

Are there any professional reviews of these monitors (X27 and PG27UQ)?
I see no reviews on the net.


----------



## MistaSparkul

CallsignVega said:


> Whoever wrote that story is an idiot. There is no "color compression" at 120 Hz.
> 
> "This also means you pretty much need to run your Windows desktop at 60 Hz for a bit of a quality readable view. "
> 
> LOL What is this fool talking about?


Well, for HDR it does drop to 4:2:2 once you're above 98Hz, but in 8-bit SDR, 120Hz is full RGB 4:4:4. He should've clarified that, because the way it's written misleads people into thinking 120Hz runs at 4:2:2 no matter what.
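
The 98Hz cutoff falls straight out of bandwidth arithmetic. Here's a rough sketch (assuming DP 1.4's HBR3 payload of ~25.92 Gbit/s after 8b/10b encoding, and ignoring blanking intervals, so real limits are a bit tighter):

```python
# Back-of-the-envelope DisplayPort 1.4 bandwidth check.
# Assumed figures: HBR3 = 32.4 Gbit/s raw, 8b/10b encoding leaves ~25.92 Gbit/s
# for pixel data; blanking intervals are ignored.
DP14_PAYLOAD_GBPS = 25.92

def signal_gbps(width, height, hz, bits_per_channel, chroma="444"):
    """Approximate uncompressed video bandwidth in Gbit/s."""
    # RGB / 4:4:4 sends 3 full samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
    samples_per_pixel = {"444": 3, "422": 2, "420": 1.5}[chroma]
    return width * height * hz * bits_per_channel * samples_per_pixel / 1e9

modes = [
    ("4K 144 Hz 10-bit RGB",   signal_gbps(3840, 2160, 144, 10)),
    ("4K 144 Hz 10-bit 4:2:2", signal_gbps(3840, 2160, 144, 10, "422")),
    ("4K 120 Hz  8-bit RGB",   signal_gbps(3840, 2160, 120, 8)),
    ("4K  98 Hz 10-bit RGB",   signal_gbps(3840, 2160, 98, 10)),
]
for name, gbps in modes:
    verdict = "fits" if gbps <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"{name}: {gbps:5.1f} Gbit/s ({verdict})")
```

Only the 4K 144Hz 10-bit RGB mode blows the budget, which is why the monitor falls back to 4:2:2 for high-refresh HDR while 120Hz 8-bit RGB and 98Hz 10-bit RGB both fit.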


----------



## lumbeechief

Can anybody confirm HDR actually works in Fortnite on PC? I was messing around with the game's config file trying to get 3840x2160 to work in fullscreen mode; sadly it only works in a fullscreen window, but I found HDR listed in the settings in the process. https://s8.postimg.cc/gondinh77/fgh.png


----------



## kot0005

kx11 said:


> more interesting news
> 
> 
> color compression @ 120/144hz
> 
> 
> 
> http://www.guru3d.com/news-story/as...ed-to-use-color-compression-at-120144-hz.html


Old news... it's been there since DP 1.4. I don't know why they wouldn't use DSC instead.


----------



## kot0005

Got my UQ today. It's amazing!! I have the same crappy joystick as Vega. It's a bit flimsy. The PG279Q has a much better joystick.....
HDR in games is just amazing. I played HZD/GOW and Destiny 2!!


----------



## badjz

kot0005 said:


> Got my UQ today. It's amazing!! I have the same crappy joystick as Vega. It's a bit flimsy. The PG279Q has a much better joystick.....
> HDR in games is just amazing. I played HZD/GOW and Destiny 2!!


Also received mine today. Very impressed with HDR.

Does gears of war 4 on pc have HDR?

What preset are you using; fps, cinema, scenery, etc...


----------



## kot0005

badjz said:


> Also received mine today. Very impressed with HDR.
> 
> Does gears of war 4 on pc have HDR?
> 
> What preset are you using; fps, cinema, scenery, etc...


I didn't find any presets... I'm using Racing mode.

GoW4 doesn't have it on PC. Also, turn off Windows HDR10. Set to 10-bit in the Nvidia control panel.


----------



## kot0005

Also, I can't hear the fan unless I stick my ear to the back of the monitor. The blacks are really black with FALD. Blooming is barely visible on dark backgrounds. In-game I did not see any blooming at all, even in the menus. I haven't tried dark games though.


----------



## badjz

kot0005 said:


> I didn't find any presets... I'm using Racing mode.
> 
> GoW4 doesn't have it on PC. Also, turn off Windows HDR10. Set to 10-bit in the Nvidia control panel.


To set it to 10-bit I need to lower the refresh rate, correct?


----------



## kot0005

badjz said:


> To set it to 10-bit I need to lower the refresh rate, correct?


Nope, I still had the option to set it to 120/144Hz. I think it will just reduce the chroma and change the depth dynamically.


----------



## kot0005

https://www.youtube.com/watch?v=-1JmcOzFQ2M


AMD is getting on board too; he says more HDR PC games are coming.


----------



## Glerox

kot0005 said:


> https://www.youtube.com/watch?v=-1JmcOzFQ2M
> 
> 
> AMD is getting on board too; he says more HDR PC games are coming.


Nice! You've set it to 98Hz for 10-bit on the desktop. Do you see a difference between 10-bit and 8-bit in SDR? I was thinking of leaving mine at 8-bit 120Hz.

EDIT: Also, I see you put it in Racing mode. Don't you have to put it in sRGB mode to benefit from Asus' factory calibration?

EDIT: You like Dark Boost??


----------



## Maxxamillion

Does the fan ever stop spinning unless you unplug the power cord?


----------



## Monstieur

It's better to run most displays in 8-bit RGB mode for both HDR and SDR content. When an application renders to a 10-bit DirectX 11 surface, the NVIDIA driver automatically performs 10-bit to 8-bit dithering. This dithered 8-bit output is almost always superior to a native 10-bit signal on most displays and will even have less banding. Even when watching HDR movies with madVR, 8-bit RGB is superior. 10-bit subsampled YCbCr422 should only be used as a last resort. It's useful on the HDMI port because consoles and Blu-ray players don't support HDR with RGB.

You should leave the desktop at 8-bit RGB. HDR games should also run in this mode since they still render to a 10-bit surface. There is no benefit to 10-bit mode on the desktop the way it's currently implemented.
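
The banding-reduction claim is easy to demonstrate. The sketch below is not NVIDIA's actual algorithm (the driver likely uses spatial/temporal patterns rather than plain random noise); it just shows why adding sub-LSB noise before rounding turns hard quantisation bands into a smooth-looking ramp:

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow ramp in 8-bit units (0 .. 10 over 4096 pixels): the worst case for banding.
ideal = np.linspace(0.0, 10.0, 4096)

trunc = np.floor(ideal)                          # plain truncation to integer levels
dith = np.floor(ideal + rng.random(ideal.size))  # add sub-LSB noise, then round down

# The eye integrates over neighbouring pixels, so compare 64-pixel local averages
# of each quantised signal against the ideal ramp.
kernel = np.ones(64) / 64
def local_mean(x):
    return np.convolve(x, kernel, mode="valid")

err_trunc = np.abs(local_mean(trunc) - local_mean(ideal)).max()
err_dith = np.abs(local_mean(dith) - local_mean(ideal)).max()
print(err_trunc, err_dith)  # truncation leaves large stair-steps; dither tracks the ramp
```

The truncated signal's local average is off by up to nearly a full step (a visible band edge), while the dithered signal's local average stays close to the true value everywhere.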


----------



## thewanted

Maxxamillion said:


> Does the fan ever stop spinning unless you unplug the power cord?


Someone on Reddit said that the fan stops spinning once the monitor is cool enough (~10 minutes).

EDIT: Doh, I think they were referring to the Acer X27.


----------



## kot0005

Glerox said:


> nice! You've set it to 98hz for 10bits on desktop. do you see a difference between 10 bits and 8 bits in SDR? I was thinking letting mine at 8bits 120hz.
> 
> EDIT : Also i see you put it in Racing mode. Don't you have to put it in sRGB mode to benefit from the Asus manufacture calibration?
> 
> EDIT : You like darkboost??


Dark Boost only works in Windows 10 with HDR off. It only changes grey levels.

sRGB had a yellow tint on white colors. I think it ships with sRGB mode on, so I changed it to Racing. Windows 10 HDR will lock your settings anyway.

10-bit does make some stuff appear smoother, without banding (wallpapers etc.), if they were originally 10-bit.


----------



## Monstieur

The Windows desktop does not render in 10-bit, so the wallpaper cannot be affected.


----------



## profundido

Monstieur said:


> It's better to run most displays in 8-bit RGB mode for both HDR and SDR content. When an application renders to a 10-bit DirectX 11 surface, the NVIDIA driver automatically performs 10-bit to 8-bit dithering. This dithered 8-bit output is almost always superior to a native 10-bit signal on most displays and will even have less banding. Even when watching HDR movies with madVR, 8-bit RGB is superior. 10-bit sub sampled YCbCr422 should only be used as a last resort. It's useful on the HDMI port because consoles and Blu-ray players don't support HDR with RGB.
> 
> You should leave the desktop at 8-bit RGB. HDR games should also run in this mode since they still render to a 10-bit surface. There is no benefit to 10-bit mode on the desktop the way it's currently implemented.


insightful. thx


----------



## CallsignVega

Monstieur said:


> It's better to run most displays in 8-bit RGB mode for both HDR and SDR content. When an application renders to a 10-bit DirectX 11 surface, the NVIDIA driver automatically performs 10-bit to 8-bit dithering. This dithered 8-bit output is almost always superior to a native 10-bit signal on most displays and will even have less banding. Even when watching HDR movies with madVR, 8-bit RGB is superior. 10-bit sub sampled YCbCr422 should only be used as a last resort. It's useful on the HDMI port because consoles and Blu-ray players don't support HDR with RGB.
> 
> You should leave the desktop at 8-bit RGB. HDR games should also run in this mode since they still render to a 10-bit surface. There is no benefit to 10-bit mode on the desktop the way it's currently implemented.


Correct. You only utilize 10-bit if the entire workflow stream is 10-bit, like Photoshop. People automatically assume setting 10-bit on the desktop and games does anything, which it doesn't. 

Running HDR on these monitors at 120 Hz 4K RGB Full 8-Bit works just fine. No reason to go to 98 Hz.


----------



## profundido

CallsignVega said:


> Correct. You only utilize 10-bit if the entire workflow stream is 10-bit, like Photoshop. People automatically assume setting 10-bit on the desktop and games does anything, which it doesn't.
> 
> Running HDR on these monitors at 120 Hz 4K RGB Full 8-Bit works just fine. No reason to go to 98 Hz.



Question: when you launch an HDR-enabled game with 120 Hz 4K RGB Full 8-bit set, does that mean its 10-bit color output gets reduced to 8-bit and you don't experience the full beauty of the game?


----------






## Sichtwechsel86

CallsignVega said:


> Correct. You only utilize 10-bit if the entire workflow stream is 10-bit, like Photoshop. People automatically assume setting 10-bit on the desktop and games does anything, which it doesn't.
> 
> Running HDR on these monitors at 120 Hz 4K RGB Full 8-Bit works just fine. No reason to go to 98 Hz.


Oh, well there would be a reason - if HDR10 is used as the HDR standard...
and UHD movies are mastered with 10-bit color depth (HDR10 and HDR10+) or even in 12-bit (Dolby Vision)...

so it depends very much on the source material...

but I find it difficult to find out whether textures in games, shaders, effects, etc. are all rendered in 10-bit...
for most games we don't even know if audio files are 16-bit or 24-bit and what sampling rate they use (44.1kHz, 48kHz or higher)...
most games seem to be a mixture of low-res and high-res data, for audio files and textures alike...
so it very much depends on the whole workflow used to bring the specific game to life, and which of it is the actual data that gets installed on our systems...

does anyone even know if colors are generated in real time in games?
do they scale to the capabilities of our screens...?
what color space do games use?
I know that - if using HDR10 - they are forced to master in Rec.2020 (or DCI-P3) - and normally in sRGB...
but is it a dynamic process that the GPU does in real time, depending on what screen one uses??


----------



## Monstieur

profundido said:


> Question: when you launch an HDR enabled game while having set 120 Hz 4K RGB Full 8-Bit, does that mean that it's 10-bit colors output will get reduced to 8-bit and you do not experience the full beauty of the game ?


No, the wide gamut colour space remains the same in both 8-bit and 10-bit mode. It's only important that the source content is 10-bit, so that it does not produce banding at the source. The signal to the display does not need to be 10-bit if the driver performs dithering. A dithered 8-bit RGB signal to the display is superior to a 10-bit YCbCr422 subsampled signal in practically every case, except for Blu-rays where the content is already YCbCr422. Even with Blu-rays, the PC may do a better job at reducing banding in 8-bit dithered mode than the display in 10-bit mode.

If a game uses fullscreen exclusive mode for HDR, it will override your setting and could end up using 10-bit YCbCr.



CallsignVega said:


> Correct. You only utilize 10-bit if the entire workflow stream is 10-bit, like Photoshop. People automatically assume setting 10-bit on the desktop and games does anything, which it doesn't.
> 
> Running HDR on these monitors at 120 Hz 4K RGB Full 8-Bit works just fine. No reason to go to 98 Hz.


Even in a true 10-bit workflow, the display's rendition of a 10-bit signal is usually inferior to a dithered 8-bit signal. If you have a true 10-bit display, you should use a 12-bit workflow with dithering.



Sichtwechsel86 said:


> Oh, well there would be a reason - if HDR10 is used as the HDR standard...
> and UHD movies are mastered with 10bit color-depth...(HDR10 and HDR10+) or even in 12bit (Dolby Vision)
> 
> so it depends very much on source-material...
> 
> but i find it difficult to find out, if textures in games, shaders, effects, etc... are all rendered in 10bit...
> for most games we don't even know if audiofiles are 16bit, 24bit and what samplingrate they use... (44.1khz or 48khz or higher)
> for most games it seems to be a mixture of low-res and high-res data... for audio-files and textures...
> so it very much depends on the whole workflow used to bring the specific game to live and what of it is the actual data that will be installed on our systems...
> 
> does anyone even know if colors are generated in realtime in games?
> do they scale to the capabilities of our screens...?
> what colorspace do games use?
> i know that - if using HDR10 - they are forced to master in rec2020 (or DCI-P3) - and normally in sRGB...
> but is it a dynamical process, that the GPU does, in real-time, depending on what screen one uses??


You do not need a 10-bit signal to the display, especially when it comes at the cost of subsampling on PC content. Only the content needs to be 10-bit. The 10-bit YCbCr422 mode only exists because it prevents an unnecessary conversion when watching Blu-rays on a dedicated player. On a PC, even movies are converted to RGB at some point in the chain. Consoles follow the standards set by the movie industry and also use 10-bit YCbCr in HDR mode even though they are capable of RGB in SDR mode.


----------



## Monstieur

The increased brightness from HDR causes the banding in content to become more visible. That's why HDR content is mastered in 10-bit - it's not due to the colour space. With Dolby Vision's 10,000 nit brightness even 10-bit is insufficient to prevent banding, so it requires 12-bit. This doesn't mean that the signal to the display must be 10-bit to prevent banding - a dithered 8-bit signal usually looks better than an undithered 10-bit signal.
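
To make that concrete, here is the PQ (SMPTE ST 2084) EOTF that HDR10 uses, with the constants from the spec, showing how many nits the single top quantisation step spans at each bit depth - the step shrinks as bits are added, which is exactly why bright content needs 10 or 12 bits:

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(v, peak=10000.0):
    """Map a normalised PQ code value v in [0, 1] to luminance in nits."""
    p = v ** (1 / M2)
    return peak * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Luminance covered by the single top quantisation step at each bit depth:
for bits in (8, 10, 12):
    top = 2 ** bits - 1
    step_nits = pq_eotf(1.0) - pq_eotf((top - 1) / top)
    print(f"{bits}-bit: top code step spans ~{step_nits:.0f} nits")
```

With an 8-bit signal each code step near peak covers a much larger slice of the 10,000-nit range than with 10 or 12 bits, so adjacent codes become visibly distinct bands at high brightness.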


----------



## tinykitten

I received a PG27UQ a few hours ago. I love the image quality; it's definitely a big step up compared to what was available before. I went through 3 PG348Qs and 4 PG279Qs, and all of them had issues (mostly severe backlight bleed, as widely known, plus a few units with dead pixels). Comparing my current PG279Q, which is OK overall (a lucky unit), to this PG27UQ is like night and day in games like, say, RE7. I might be cursed, but this PG27UQ has two groups of two dead pixels each, so unfortunately I will have to return it.


----------



## Glerox

Thanks Monstieur! (Did the rep point feature disappear?)

Edit: Been waiting 2 years (Computex 2016) for one of these. Now every time I read that someone got theirs, it feels like waiting another month lol!

Come on Newegg Canada!!!


----------



## Maxxamillion

As soon as I plug in the power cord the fan starts spinning and does not seem to stop unless I unplug the power cord. Is this normal?


----------



## CallsignVega

Sichtwechsel86 said:


> Oh, well there would be a reason - if HDR10 is used as the HDR standard...
> and UHD movies are mastered with 10bit color-depth...(HDR10 and HDR10+) or even in 12bit (Dolby Vision)


And UHD movies are also only 4:2:0 chroma. This monitor can do HDR all the way up to 144 Hz at 10-bit with 4:2:2 chroma, so the point is fairly moot. The reduction to 4:2:2 chroma in games is completely insignificant. Only on the desktop do you have to drop down to 120 Hz in order to run RGB Full.


----------



## kot0005

Just upgraded Windows 10 to the 1803 build from the old 1709. You can now control SDR brightness in Windows while having HDR on. I had to set it to 25 in my room with a Philips 1400-lumen LED and 16 without. Cranking it all the way up is too bright... I think it goes to like 400 nits?!?! Someone has to test it...


----------



## kot0005

Maxxamillion said:


> As soon as I plug in the power cord the fan starts spinning and does not seem to stop unless I unplug the power cord. Is this normal?



I recommend not unplugging the cord to make the fan stop... it's spinning for a reason.

If your fan is too loud you might have to get a replacement.

It will stop spinning after you shut down your PC; give it a few minutes to cool down whatever it's trying to cool.


----------



## MistaSparkul

Maxxamillion said:


> As soon as I plug in the power cord the fan starts spinning and does not seem to stop unless I unplug the power cord. Is this normal?


Acer X27 does the same thing. It will keep running until the monitor is cool enough. Usually about 5-10 minutes after shutting down your pc.


----------



## CallsignVega

kot0005 said:


> Just upgraded Windows 10 to the 1803 build from the old 1709. You can now control SDR brightness in Windows while having HDR on. I had to set it to 25 in my room with a Philips 1400-lumen LED and 16 without. Cranking it all the way up is too bright... I think it goes to like 400 nits?!?! Someone has to test it...


Do that with a white full-screen for a shock effect. It is more like 600 nits.


----------



## moonbogg

CallsignVega said:


> Correct. You only utilize 10-bit if the entire workflow stream is 10-bit, like Photoshop. People automatically assume setting 10-bit on the desktop and games does anything, which it doesn't.
> 
> Running HDR on these monitors at 120 Hz 4K RGB Full 8-Bit works just fine. No reason to go to 98 Hz.



I thought one of the main benefits of HDR is the increased color depth. If people use 8 bit, does that still mean they get more colors than an SDR screen? I'm getting the impression that the contrast, black levels, and detailed brightness are far more important than having more colors. True or not true?


----------



## Glerox

kot0005 said:


> Just upgraded Windows 10 to the 1803 build from the old 1709. You can now control SDR brightness in Windows while having HDR on. I had to set it to 25 in my room with a Philips 1400-lumen LED and 16 without. Cranking it all the way up is too bright... I think it goes to like 400 nits?!?! Someone has to test it...


I would think that HDR turned on for the desktop is now usable with the new Windows build, because you can finally control brightness.

However, after reading all the recent threads about the new monitors, my understanding is that when HDR is turned on for the desktop, Windows displays the SDR/sRGB content through an HDR10 signal, which means 10-bit 4:2:2 chroma subsampling at 120Hz, which is unusable for regular desktop work and text.

So if I understand correctly, it seems we will still have to annoyingly toggle HDR ON/OFF in Windows when working vs. gaming/watching HDR YouTube or movies, unless we stick to 98Hz?


----------



## Exilon

moonbogg said:


> I thought one of the main benefits of HDR is the increased color depth. If people use 8 bit, does that still mean they get more colors than an SDR screen? I'm getting the impression that the contrast, black levels, and detailed brightness are far more important than having more colors. True or not true?


Color on an LCD is what we see based on the brightness of the three color channels per pixel. With 8-bit depth there are 256 steps of brightness per channel; 10-bit is 1024 steps per channel. Mixing the three channels together gives the ~16.7M and ~1.07B colors quoted on the box.

Contrast and black levels are measured at the top and bottom of each channel, so dropping to 8-bit does not affect them. Similarly, color gamut is also not affected.

Monitors convert the 8/10-bit numbers to pixel brightness based on a gamma curve. HDR standards and SDR have different curve definitions. https://www.smpte.org/sites/default/files/section-files/HDR.pdf
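
The quoted figures are easy to reproduce: steps per channel is 2^bits, and the advertised colour count is that cubed.

```python
# Steps per channel and total colours for common panel bit depths.
for bits in (6, 8, 10):
    steps = 2 ** bits        # brightness levels per R/G/B channel
    colours = steps ** 3     # all combinations across the three channels
    print(f"{bits}-bit: {steps:4d} steps/channel, {colours:,} colours")
```

8-bit works out to 16,777,216 (~16.7M) colours and 10-bit to 1,073,741,824 (~1.07B), matching the box figures.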


----------



## CallsignVega

Glerox said:


> I would think that HDR turned ON in desktop is now usable with the new Windows build because you can finally control brightness.
> 
> However, after reading all the recent threads concerning the new monitors, my understanding is that when HDR is turned ON on desktop, Windows displays the SDR/sRGB content through a HDR10 standard signal, which means 10bits 4:2:2 chroma subsampling at 120hz, which means unusable for regular desktop work and text.
> 
> So if I understand well, it seems that we will still have to annoyingly toggle the HDR ON/OFF button in Windows when working VS gaming/watching HDR youtube or movies, unless you stick to 98Hz?


There is absolutely no reason to turn on HDR on the desktop unless you are viewing HDR content. You would simply get washed out colors since the color space is different. And Windows doesn't care about the color bit depth when you turn on HDR. 120 Hz RGB Full 8-bit HDR works fine.


----------



## Baasha

Just got my Asus RoG Swift PG27UQ today.

Does DSR not work on this display? I can set the desktop resolutions to whatever using DSR but in any game, I can't set the resolutions beyond the native one (even if I change the desktop resolution to something else). It's strange because my 4K OLED monitor does this without issue.

My first impressions of the monitor:

1.) No dead/stuck pixels - well built and sturdy. However, less than amazing. Maybe I'm too jaded or spoiled - the 4K OLED image looks WAY better than this monitor's.

2.) HDR mode (by turning it on in Win 10) just makes everything dimmer/darker. Turned it off and am back to SDR @ 8-bit on the desktop now.

3.) The 27" is a huge downgrade from my 30" 4K OLED. The OLED panel looks way better than this monitor - both externally and image-wise. I need to use this some more to see if that changes.

4.) Going from my 1440P 144Hz RoG Swift to this is an incremental step up in terms of experience, at least for me. The 1440P monitor is the same size and yes, the 4K image is much sharper but I'm not sure the difference is jaw-dropping.

Having said that, will play around with it some more.

Do you guys know where to get the proper driver for this monitor? The Asus Support page doesn't seem to have it (the dropdown for the OS is blank for me). Current driver is a "Generic PnP Monitor" which is ... not ideal.

Also, what brightness and contrast settings are you guys running it at? There are so many other options that I'm not sure what to use for the best experience - Dark Boost, Variable Backlight, Auto Black Level etc. What should those be set to to get the best image quality/experience?

Last but not least, the stand is hideous and thankfully that stupid light can be disabled. 

I wish Dell would make a 4K 144Hz monitor - they are hands down the best in the monitor business IME.


----------



## kx11

Raise brightness/contrast when you turn HDR on.

TVs fix that automatically, while PC monitors, like my Samsung HDR FreeSync monitor, don't.


----------



## l88bastar

Baasha said:


> Just got my Asus RoG Swift PG27UQ today.
> 
> Also, what brightness and contrast settings are you guys running it at? There are so many other options that I'm not sure what to use for the best experience - Dark Boost, Variable Backlight, Auto Black Level etc. What should those be set to to get the best image quality/experience?


Dark Boost = no
Variable Backlight = yes, God yes.
Auto Black Level = useless; does not do anything.

WAIT A SEC....YOUR BAAAAAASSHHHAAAAA!!!!! WHERE ARE YOUR OTHER TWO PG27UQs????

Ohh also, has it been established if BlackVette is teh Frahhhh.....Frauhhhhh.....FrAwwwww????


----------



## deadchip12

Baasha said:


> Just got my Asus RoG Swift PG27UQ today.
> 
> Does DSR not work on this display? I can set the desktop resolutions to whatever using DSR but in any game, I can't set the resolutions beyond the native one (even if I change the desktop resolution to something else). It's strange because my 4K OLED monitor does this without issue.
> 
> My first impressions of the monitor:
> 
> 1.) No dead/stuck pixels - well built and sturdy. However, less than amazing. May be I'm too jaded or spoiled - the 4K OLED image looks WAY better than the ones on this monitor.
> 
> 2.) HDR mode (by turning it on in Win 10) just makes everything dimmer/darker. Turned it off and am back to SDR @ 8-bit on the desktop now.
> 
> 3.) The 27" is a huge downgrade from my 30" 4K OLED. The OLED panel looks way better than this monitor - both externally and image-wise. I need to use this some more to see if that changes.
> 
> 4.) Going from my 1440P 144Hz RoG Swift to this is an incremental step up in terms of experience, at least for me. The 1440P monitor is the same size and yes, the 4K image is much sharper but I'm not sure the difference is jaw-dropping.
> 
> Having said that, will play around with it some more.
> 
> Do you guys know where to get the proper driver for this monitor? The Asus Support page doesn't seem to have it (the dropdown for the OS is blank for me). Current driver is a "Generic PnP Monitor" which is ... not ideal.
> 
> Also, what brightness and contrast settings are you guys running it at? There are so many other options that I'm not sure what to use for the best experience - Dark Boost, Variable Backlight, Auto Black Level etc. What should those be set to to get the best image quality/experience?
> 
> Last but not least, the stand is hideous and thankfully that stupid light can be disabled.
> 
> I wish Dell would make a 4K 144Hz monitor - they are hands down the best in the monitor business IME.


Hmm, I expected the monitor to be at least as good as OLED in a brightly lit room. But you say OLED looks way better...


----------



## kot0005

CallsignVega said:


> Do that with a white full-screen for a shock effect. It is more like 600 nits.


Oh wow, OK, so it does go beyond the normal 300-nit brightness. Even 15 was a bit high; I set it to 14 now in a fully lit room.



Glerox said:


> I would think that HDR turned ON in desktop is now usable with the new Windows build because you can finally control brightness.
> 
> However, after reading all the recent threads concerning the new monitors, my understanding is that when HDR is turned ON on desktop, Windows displays the SDR/sRGB content through a HDR10 standard signal, which means 10bits 4:2:2 chroma subsampling at 120hz, which means unusable for regular desktop work and text.
> 
> So if I understand well, it seems that we will still have to annoyingly toggle the HDR ON/OFF button in Windows when working VS gaming/watching HDR youtube or movies, unless you stick to 98Hz?



I just switched it to 120Hz in NVCP and the color depth auto-changed to 8-bit full chroma, according to my monitor info.

This is with HDR on in Windows.


----------



## kot0005

Hmm, Asus gives you 4 spacers (12mm, I think) for the VESA mount, but this guy isn't using them...

People don't install it right and then complain about noise...


----------



## Vlada011

Sempre said:


> Finally. Wish it was 32" though.



That's a must-have. Nothing without 32".
32-34" is the perfect gaming size.
They will launch them only later, and at a higher price, 100%.


----------



## Glerox

kot0005 said:


> I just switched it to 120Hz in NVCP and the color depth auto changed to 8bit full chroma according to my monitor info.
> 
> this is with HDR on in windows


Oh OK, good. I thought HDR in Windows automatically changes to 10-bit, but I guess it's just games and movies that send the 10-bit signal in HDR.

But yeah, anyway, like Vega said, HDR makes no sense for desktop use, so we'll have to keep toggling it ON/OFF when watching HDR on YouTube, which is annoying.


----------



## kot0005

Just tried SWBF2 arcade with HDR. I think this game has a good HDR implementation, but some maps might not be playable because of so many bright objects. The map I got had too many, even reflections on the floor...


----------



## kot0005

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1529578205

HDR10+


----------



## Baasha

l88bastar said:


> Dark Boost = no
> Variable Backlight = yes, God yes.
> Auto Black Level = useless does not do anything.
> 
> WAIT A SEC....YOUR BAAAAAASSHHHAAAAA!!!!! WHERE ARE YOUR OTHER TWO PG27UQs????
> 
> Ohh also, has it been established if BlackVette is teh Frahhhh.....Frauhhhhh.....FrAwwwww????


 I'm done with Surround. Plus Shadowplay doesn't work in Surround, borders shmorders meh.. I'm good with one (for now anyway). 



deadchip12 said:


> hmm I expect the monitor to be as good as OLED at least in brightly lit room. But you say OLED looks way better...


Yea, I too expected it to be so. HDR works on BF1 and SWBF 2 - but the OLED image looks way better (especially next to each other).

The other main thing that concerns me is that SLI doesn't work as well (3 or 4 way) with this monitor. I have a feeling it has to do with the G-Sync module.

EDIT: obligatory pic










If I disable G-Sync, I can enable DSR resolutions in game but not otherwise.

Scaling on the GPUs sucks compared to the OLED (or any other monitor I've used). I'm using the same DP 1.4 cable (that I used on my 8K monitor last year).

Sigh... more testing/impressions to come but as they say, first impression is the best impression and so far, this monitor has been absolutely mediocre IME.


----------



## kx11

Baasha said:


> If I disable G-Sync, I can enable DSR resolutions in game but not otherwise.
> 
> Scaling on the GPUs suck compared to the OLED (or any other monitor I've used). I'm using the same DP 1.4 cable (that I used to use on my 8K monitor last year).
> 
> Sigh... more testing/impressions to come but as they say, first impression is the best impression and so far, this monitor has been absolutely mediocre IME.





Use CRU to break the resolution wall: just add 3840x2160 in the app and NVCP will pretty much allow tons of resolutions.

DL:
https://www.monitortests.com/end/cru-1.3.99-p1.zip


----------



## l88bastar

Baasha said:


> Sigh... more testing/impressions to come but as they say, first impression is the best impression and so far, this monitor has been absolutely mediocre IME.


I LUB ME SO OLED....butt.....
Different strokes for different folks....I had that 30" Dell OLED for less than a day before it got boxed up and sent back. I could not take its double strobing @ 60hz, man that drove my eyes NUTS! Plus it has moon input lag....definitely prefer my C7 OLED betttaaaahhhhhhh

I prefer the X27 as 120hz 4k is awesome for work productivity AND I don't have to worry about the BRAZZERS logo burning in my screen!


----------



## Baasha

l88bastar said:


> I LUB ME SO OLED....butt.....
> Different strokes for different folks....I had that 30" Dell OLED for less than a day before it got boxed up and sent back. I could not take its double strobing @ 60hz, man that drove my eyes NUTS! Plus it has moon input lag....defiantly prefer my C7 OLED betttaaaahhhhhhh
> 
> I prefer the X27 as 120hz 4k is awesome for work productivity AND I don't have to worry about the BRAZZERS logo burning in my screen!


LOOOOOOL

True.. I think 4K 120Hz OLED would be the best of both worlds.

You keep it at 120hz on desktop? Also, do you turn on HDR in Win 10 or just leave it at SDR and let the program/game do its thing with HDR (like BF1 etc.)?


----------



## Baasha

kx11 said:


> use C.R.U to break the resolution wall , just add 3840x2160 in this app and pretty much NVCP will allow tons of resolutions
> 
> 
> 
> DL
> https://www.monitortests.com/end/cru-1.3.99-p1.zip


thanks but I got DSR to work if I turn off G-Sync.. mainly wanted to check GPU scaling but it's still much worse than the other monitor. Seems weird. hmm....


----------



## kot0005

hmm I just noticed that the monitor doesn't shut down completely, the underlight stays on and so does the fan. Woke up this morning and they were both on. I disabled fast startup in Windows but nothing changed.

anyone else had this happen ?


----------



## kot0005

nvm, it fixed itself somehow. Played SWBF2 and god, HDR is just on a whole new level.. the lighting, the colors are so much more lively.... Just forget about haloing... you will not notice it during gameplay at all..


----------



## Monstieur

Baasha said:


> Yea, I too expected it to be so. HDR works on BF1 and SWBF 2 - but the OLED image looks way better (especially next to each other).


I just got an LG OLED. Would you recommend spending another $3000 for the PG27UQ just for HDR + G-SYNC, or should I just play HDR games on my OLED without G-SYNC?


----------



## Kommando Kodiak

once you G-Sync you can't go back, tearing is gone


----------



## Monstieur

Kommando Kodiak said:


> once you g-sync you cant go back tearing is gone


I have multiple G-SYNC monitors like the PG278Q, PG27VQ, and PG258Q. I just don't have HDR + G-SYNC which is why I want the PG27UQ.

Is the HDR on the PG27UQ better than the LG OLED? Even if it's not, does HDR + G-SYNC make up for it? When I watch HDR movies on the OLED and see fine detail like the lightning on Thor being perfectly illuminated on OLED, I find it hard to imagine I'd be satisfied with the big FALD zones.


----------



## MistaSparkul

Monstieur said:


> I have multiple G-SYNC monitors like the PG278Q, PG27VQ, and PG258Q. I just don't have HDR + G-SYNC which is why I want the PG27UQ.
> 
> Is the HDR on the PG27UQ better than the LG OLED? Even if it's not, does HDR + G-SYNC make up for it? When I watch HDR movies on the OLED and see fine detail like the lightning on Thor being perfectly illuminated on OLED, I find it hard to imagine I'd be satisfied with the big FALD zones.


That depends. It can get brighter so I guess bright HDR scenes might win over OLED. Dark HDR scenes though? Forget it.


----------



## kot0005

Kommando Kodiak said:


> once you g-sync you cant go back tearing is gone


This. Even with fast sync/vsync it's just not as good as G-Sync.


----------



## Vipu

People really thought this was gonna have as good an image as OLEDs?


----------



## Sichtwechsel86

I got an interesting phone call from a retailer... (alternate.de)

He said they have plenty of PG27UQ in stock - but ASUS blocked them from selling these, as they would need a firmware update...

Also they had to write to people who already received their PG27UQ to send it back for the update process - as it is impossible for end users to update the firmware themselves!

And it would take time until the new update is ready (it's in development right now!) and flashed onto the units!
The alternate.de service told me they plan on selling the monitors again at the end of July - when the process is finished!
They are waiting for the update from Asus and then have to flash the firmware for every single unit!

And I also checked the site - the PG27UQ has vanished completely from their site - 
but hours ago it was there, in stock and sold for 2599€

So it seems that Asus knows about some problems and some users were right when saying:
early adopters are beta testers! 

Have any of you received an email to send your monitors back for updating the firmware??


----------



## HyperMatrix

Sichtwechsel86 said:


> I got an interesting phone call from a retailer... (alternate.de)
> 
> He said they have plenty of PG27UQ in stock - but ASUS blocked them to sell these, as they would need a firmware-update...
> 
> Also they had to write to people who already received their PG27UQ to send it back for the update-process - as it is impossible for endusers to update the firmwae themselves!
> 
> And it would take time until the new update is ready (its in developement right now!) and flashed onto the units!
> The alternate.de service told me they plan on selling the monitors again at the end of july - when the process is finished!
> They are waiting for the update from Asus and then have to flash firmware for every single unit!
> 
> And i also checked the site - the PG27UQ has vanished completely from their site -
> but hours ago it was there, in stock and sold for 2599€
> 
> So it seems, that Asus knows about some problems and some users were right when saying:
> early adopters are beta-testers!
> 
> Has anyone of you received an email to send your monitors back for updating the firmware??


It seems odd that alternate.de would be cancelling the orders before Newegg or BestBuy. I switched my preorder from Newegg to BestBuy a few hours ago so at least if there's a problem I can just bring it back to the store and do a swap. I've also never understood why these high end monitors don't come with some type of PC connectivity, whether for firmware updates, to be able to upload your own crosshair graphics to use with the OSD, or to have the monitor directly feed screen color data to devices like LightPacks/AmbiBox without having to use a software screencapture method.


----------



## kx11

Sichtwechsel86 said:


> I got an interesting phone call from a retailer... (alternate.de)
> 
> He said they have plenty of PG27UQ in stock - but ASUS blocked them to sell these, as they would need a firmware-update...
> 
> Also they had to write to people who already received their PG27UQ to send it back for the update-process - as it is impossible for endusers to update the firmwae themselves!
> 
> And it would take time until the new update is ready (its in developement right now!) and flashed onto the units!
> The alternate.de service told me they plan on selling the monitors again at the end of july - when the process is finished!
> They are waiting for the update from Asus and then have to flash firmware for every single unit!
> 
> And i also checked the site - the PG27UQ has vanished completely from their site -
> but hours ago it was there, in stock and sold for 2599€
> 
> So it seems, that Asus knows about some problems and some users were right when saying:
> early adopters are beta-testers!
> 
> Has anyone of you received an email to send your monitors back for updating the firmware??





well i'm glad mine won't ship before the 30th from velocitymicro.com


----------



## Glerox

Sichtwechsel86 said:


> I got an interesting phone call from a retailer... (alternate.de)
> 
> He said they have plenty of PG27UQ in stock - but ASUS blocked them to sell these, as they would need a firmware-update...
> 
> Also they had to write to people who already received their PG27UQ to send it back for the update-process - as it is impossible for endusers to update the firmwae themselves!
> 
> And it would take time until the new update is ready (its in developement right now!) and flashed onto the units!
> The alternate.de service told me they plan on selling the monitors again at the end of july - when the process is finished!
> They are waiting for the update from Asus and then have to flash firmware for every single unit!
> 
> And i also checked the site - the PG27UQ has vanished completely from their site -
> but hours ago it was there, in stock and sold for 2599€
> 
> So it seems, that Asus knows about some problems and some users were right when saying:
> early adopters are beta-testers!
> 
> Has anyone of you received an email to send your monitors back for updating the firmware??


European (90LM03A0-B01370) and American (90LM03A0-B013B0) models are not the same. I wonder if the firmware is different, because American retailers continue to sell it. Mine shipped yesterday.


----------



## kot0005

Sichtwechsel86 said:


> I got an interesting phone call from a retailer... (alternate.de)
> 
> He said they have plenty of PG27UQ in stock - but ASUS blocked them to sell these, as they would need a firmware-update...
> 
> Also they had to write to people who already received their PG27UQ to send it back for the update-process - as it is impossible for endusers to update the firmwae themselves!
> 
> And it would take time until the new update is ready (its in developement right now!) and flashed onto the units!
> The alternate.de service told me they plan on selling the monitors again at the end of july - when the process is finished!
> They are waiting for the update from Asus and then have to flash firmware for every single unit!
> 
> And i also checked the site - the PG27UQ has vanished completely from their site -
> but hours ago it was there, in stock and sold for 2599€
> 
> So it seems, that Asus knows about some problems and some users were right when saying:
> early adopters are beta-testers!
> 
> Has anyone of you received an email to send your monitors back for updating the firmware??


No recall here..in Australia


----------



## Sichtwechsel86

Glerox said:


> European (90LM03A0-B01370) and American (90LM03A0-B013B0) models are not the same. I wonder if firmware is different because American retailers continue to sell it. Mine has shipped yesterday.


hm... 
I just thought the language would be different...
but as the service man told me - Asus is in the middle of working on this new firmware - and they have to wait for it - and until then no unit can leave their stock

also I don't think the features are different on the two models...

question is: is the European model more buggy than the US model?
and what bugs or tweaks are they working on...

I don't know if models already sold in the US have these upcoming fw tweaks implemented - no matter if they are different from their European counterparts...

and also: would Asus US even tell customers about the firmware flaws...?
sometimes I think Germany is a bit special when it comes to customer protection and so on...
maybe Asus Germany had no other choice than to stop selling buggy units and offer fw updates officially...
or they just did it because they know how complicated it can get for them law-wise, with many people complaining about faulty products...
or they have just learned their lesson after the PG279Q catastrophe, when hundreds of units had to be taken back because of backlight bleed

but that's just guessing of course...

but I'm kind of skeptical about why they were forced to stop selling in-stock units in Germany for a fw update - while in all other countries they don't even communicate that they are working on a fw update!


----------



## tinykitten

Could be to fix the Light in Motion option affecting the Batman ROG light also, while the Batman light option having no effect whatsoever on the light it's supposed to control. Or maybe that was just a fault within my unit. Who knows at this point.


----------



## kot0005

Horraaaay, we got another guy not using standoffs to mount his VESA arm and instead covering the holes...lul


----------



## Sichtwechsel86

kot0005 said:


> Horraaaay we got another guy not using standoff's to mount his VESA arm and instead covers the holes...lul
> 
> https://youtu.be/-abe56aSEzg


exactly my thought...


----------



## kot0005

These people ruining these monitors and then complaining about it being $$$... I mean if it's so expensive you gotta take care of it, right?! But no, they just treat it like a $200 monitor.


----------



## Ferreal

Monstieur said:


> I have multiple G-SYNC monitors like the PG278Q, PG27VQ, and PG258Q. I just don't have HDR + G-SYNC which is why I want the PG27UQ.
> 
> Is the HDR on the PG27UQ better than the LG OLED? Even if it's not, does HDR + G-SYNC make up for it? When I watch HDR movies on the OLED and see fine detail like the lightning on Thor being perfectly illuminated on OLED, I find it hard to imagine I'd be satisfied with the big FALD zones.


It was worth it for me. I have the Sony OLED for single player games and have been using my PG278Q for FPS/online multiplayer games. I now own the Acer X27, and the HDR on this monitor takes it to a new level. So clear and bright. There are flaws, for example, SDR games don't look as good as on the OLED because of black levels. I'm happy with it so far, I mostly play BF1 or Battlefront 2, and BF V when it comes out. I will still use my Sony OLED for some single player games, TV shows, and movies. 

You will need a beefy PC for it though. At least a 1080 Ti, and be willing to upgrade as soon as the next gen Nvidia GPU comes out. If not, just forget about it.


----------



## Kommando Kodiak

kot is the vesa standoff in the box?


----------



## CallsignVega

Kommando Kodiak said:


> kot is the vesa standoff in the box?


Yes, I saw them.


----------



## Kommando Kodiak

*forehead wipe* whew, I was hoping they did, mine comes today. I'm legit camping out front so the FedEx guy doesn't chuck mine at the door, like that one vid from years back of the FedEx guy throwing the TV over the fence in anger


----------



## Baasha

Is anyone running more than 2-Way SLI with this monitor? 

This monitor is SST correct? I am not sure why I'm getting this message in NVCP - it kills the scaling and ruins the performance. My 4K OLED monitor does not have this issue and scaling in 4-Way is near perfect in many games.


----------



## badjz

tinykitten said:


> Could be to fix the Light in Motion option affecting the Batman ROG light also, while the Batman light option having no effect whatsoever on the light it's supposed to control. Or maybe that was just a fault within my unit. Who knows at this point.


Yes mine also is not working. It lights up but fails to display anything on the wall/roof.

Anyone else have this issue?


----------



## kot0005

Kommando Kodiak said:


> kot is the vesa standoff in the box?


yes, they look like motherboard standoffs but are a lot thicker and longer.


----------



## kot0005

Baasha said:


> Is anyone running more than 2-Way SLI with this monitor?
> 
> This monitor is SST correct? I am not sure why I'm getting this message in NVCP - it kills the scaling and ruins the performance. My 4K OLED monitor does not have this issue and scaling in 4-Way is near perfect in many games.


lol most people don't even run 2-way, let alone 4-way SLI



badjz said:


> Yes mine also is not working. It lights up but fails to display anything on the wall/roof.
> 
> Anyone else have this issue?


did you adjust the knob under it? it's like a scroll wheel


----------



## jesyjames

kot0005 said:


> yes they look like motherboard mount but are a lot thicker and longer.


Uh, do you know where in the box it is? I am not sure mine came with one. I found the little bag of standoff screws, but I've gone through everything and do not see the mount.


----------



## badjz

kot0005 said:


> did you adjust the knob under it? it's like a scroll wheel

Yep that did it! Thanks mate


----------



## kot0005

jesyjames said:


> Uh, do you know where in the box it is? I am not sure mine came with one. I found the little bag of stand of screws, but I've gone through everything and do not see the mount.


I never said mount lol.. the standoffs are what you use.. you screw them into the monitor and put a VESA mount on top..


----------



## kot0005

badjz said:


> Yep that did it! Thanks mate


I personally turned mine off haha... the red ruins my green/black build..


----------



## kot0005

I also sold my PG279Q, got a decent $600 AU for it.

I also made a God of War logo for the light in motion projection. Just use 0.5mm thick and <4mm wide double-sided tape.

some pics.. https://imgur.com/a/xUl1RDZ


----------



## badjz

kot0005 said:


> I also sold my PG279Q, got a decent $600 AU for it.
> 
> I also made a God of war logo for the light in motion projection. Just use a .5mm thick and <4mm wide double sided tape .


Can you share a photo of what this looks like please?


----------



## Blackvette94

Here are the pics of removing the anti glare on the PG27UQ! By far the most difficult monitor or TV I have ever taken apart, and I have done maybe 25-30 now. 

The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂

Benefits of this mod:

Significant clarity due to high ppi 4k at 27 inches
Significant increase in brightness
Contrast increase is substantial 
Blacks look liquid now and picture overall looks like looking out a window :0

You need special plastic tools to open the case without damaging it, and the AG filter is so fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!

I would be glad to do this mod for others on the X27 and PG27UQ, but I won't be doing it for free due to the high degree of difficulty. This mod makes this display look next gen, and shame on Asus for not at least giving us the option of glossy vs matte 😞

Shame on Vega and l88bastard for not believing me that I did this :’(

Now onto the pics:


----------



## l88bastar

Blackvette94 said:


> Here are the pics of removing the anti glare on the pg27uq! By far the most difficult monitor or tv I have ever taken apart and I have done may 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


GOOOOOD GAAAAWWWD MAN!!! Please PM me how much to do my X27


----------



## Blackvette94

l88bastar said:


> GOOOOOD GAAAAWWWD MAN!!! Please PM me how much to do my X27

You should have believed me 😛 

Sending PM 🙂


----------



## l88bastar




----------



## HyperMatrix

Blackvette94 said:


> Here are the pics of removing the anti glare on the pg27uq! By far the most difficult monitor or tv I have ever taken apart and I have done may 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:



My God. That looks amazing. Glad to see it's doable. Mine gets here on Tuesday. Then I'll definitely be opening her up. As a side note, if you enjoy doing this, and you have a local seller available, consider selling directly on eBay as pre-glossed for like a $500 markup. You'd be surprised how many takers you'll have.


----------



## mmms

See this link, it will help you:


https://imgur.com/gallery/ybZ7X


----------



## Titanmode

Can someone please post pics of the backlight bleed with the FALD turned off? I want to know if the QC is still **** like with the last 2 Asus monitors. As well as pics of haloing with a black screen


----------



## kot0005

Blackvette94 said:


> Here are the pics of removing the anti glare on the pg27uq! By far the most difficult monitor or tv I have ever taken apart and I have done may 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


sheet, thanks for doin this!!




badjz said:


> Can you share a photo of what this looks like please?


cant post photos here, stupid website upgrade...


https://www.thingiverse.com/thing:2973855

OCN admin = Fail.


----------



## kot0005

Titanmode said:


> Can someone please post pics of the Blacklight bleed with the Fald turned off. I want to know if the QC is still **** like with the last 2 asus monitors. As well as pics with haloing with a black screen


No point in using the monitor with FALD off..

I am really starting to hate the OCN admin.. new site and it doesn't even auto-combine my new post with my latest..




mmms said:


> See this link will help you :-
> 
> 
> https://imgur.com/gallery/ybZ7X


lol poor guy..puts so much effort into making a guide and all he gets is 3 toxic comments.,.


----------



## Vipu

Damn removing that coating seems like amazing thing to do, I really want to do it for my current monitor but seems like a bit too high risk of damaging expensive monitor since I have never done it.

My screen is viewsonic xg2703-gs, does the wet paper towel thing work for that too?
Also how do you make the paper wet but not wet enough to make water go everywhere and ruin the whole screen? Or is it sealed enough that the water cant get where it shouldnt?

Can lightbleed be fixed too while you open the screen? I have a bit but if I could fix that too at same time that would be nice.


----------



## kot0005

Vipu said:


> Damn removing that coating seems like amazing thing to do, I really want to do it for my current monitor but seems like a bit too high risk of damaging expensive monitor since I have never done it.
> 
> My screen is viewsonic xg2703-gs, does the wet paper towel thing work for that too?
> Also how do you make the paper wet but not wet enough to make water go everywhere and ruin the whole screen? Or is it sealed enough that the water cant get where it shouldnt?


use this with distilled water https://www.google.com.au/search?q=spray+bottle&ie=&oe=

spray the paper towels
and lay them on the screen


----------



## Leopardi

Vipu said:


> Damn removing that coating seems like amazing thing to do, I really want to do it for my current monitor but seems like a bit too high risk of damaging expensive monitor since I have never done it.
> 
> My screen is viewsonic xg2703-gs, does the wet paper towel thing work for that too?
> Also how do you make the paper wet but not wet enough to make water go everywhere and ruin the whole screen? Or is it sealed enough that the water cant get where it shouldnt?
> 
> Can lightbleed be fixed too while you open the screen? I have a bit but if I could fix that too at same time that would be nice.







Just taking off the panel frame reduces lightbleed


----------



## Vipu

kot0005 said:


> use this with Distilkled water https://www.google.com.au/search?q=spray+bottle&ie=&oe=
> 
> spraythe paper towels
> and lay them on the screen


Have many people here done it to their screen, and is the success rate high?
I'm not sure if I'd heard about removing this coating before today, but I haven't seen anyone talk about it before.


----------



## badjz

kot0005 said:


> cant post photos here, stupid website upgrade...
> 
> 
> https://www.thingiverse.com/thing:2973855
> 
> OCN admin = Fail.

Looks great. What material/paper did you use to make the silhouette?


----------



## kot0005

badjz said:


> Looks great. What material/paper did you use to make the silhouette?


3D printed it with ABS, took like 40 mins to print it.


----------



## HyperMatrix

New review up: https://www.pcper.com/reviews/Graph...144Hz-G-SYNC-Monitor-True-HDR-Arrives-Desktop

Important takeaways: These monitors use a $2600 Altera FPGA module. Not a typo. Check here: https://www.mouser.com/ProductDetai...oXUAKC9nfNM2xfTloEqocAIMvY8Nr6a3AA3td6MoL6w==

Also has 3GB of DDR4 memory on board. 

Please go back to telling me how this $2k monitor with FALD, G-SYNC, quantum dot, 4K, and 144Hz is overpriced. Or how FreeSync/VRR is just as good as G-SYNC, which is overpriced only because Nvidia is greedy and AMD are white knights.


----------



## Blackvette94

HyperMatrix said:


> My God. That looks amazing. Glad to see it's doable. Mine gets here on Tuesday. Then I'll definitely be opening her up. As a side note, if you enjoy doing this, and you have a local seller available, consider selling directly on eBay as pre-glossed for like a $500 markup. You'd be surprised how many takers you'll have.

Honestly, I would not charge someone $500 more for the monitor after doing this mod. I will say that, since I HATE anti glare with a passion, if I had been unable to do this mod I would probably have paid $2500 for this monitor with no anti glare. It looks that much better without it!




Vipu said:


> Damn removing that coating seems like amazing thing to do, I really want to do it for my current monitor but seems like a bit too high risk of damaging expensive monitor since I have never done it.
> 
> My screen is viewsonic xg2703-gs, does the wet paper towel thing work for that too?
> Also how do you make the paper wet but not wet enough to make water go everywhere and ruin the whole screen? Or is it sealed enough that the water cant get where it shouldnt?
> 
> Can lightbleed be fixed too while you open the screen? I have a bit but if I could fix that too at same time that would be nice.



It is high risk if you have never done it. On a scale of 1-10 for difficulty, this is a 10 due to how it is all put together. 




Vipu said:


> Have many people here done it to their screen, and is the success rate high?
> I'm not sure if I'd heard about removing this coating before today, but I haven't seen anyone talk about it before.

Here is the thing, most people make the mistake of putting too much water on it. The key is barely damp towels, you need ring out as much water as possible and keep it damp the whole time. Some displays need 6-7 hrs to remove it with no glue residue left behind, others like this pg27uq only need 3hrs. 

Of the 25-30 displays that I have removed the ag filter on, I have only ruined one display, it was a Sony 43 x720e tv. That actually hade the polarizer on the top and was bonded with an anti glare. Now I can look at a display and know if that is the case without even taking it apart. You can tell by when light hits the tv it creates a rainbow effect.

Overall this mod makes this display look like a reference studio monitor now. The difference is dramatic.

Below you can see there is no orange peel or texture on the screen, which some displays have after removing the AG filter. This was a pleasant surprise, because some displays I have removed the AG filter on have that LG OLED texture look that you can see when the display is off and you're looking at the reflection. The PG27UQ does not, so it looks even more high end when it is off! Also no glue residue, which means the screen is not sticky and you can still wipe it down with a microfiber without worry.

Like I said before, if people would like me to do this to their display, I would be happy to. Pay for shipping to me and back, and I will charge $100 to do it. PM me if you would like to do that.

The pic below was taken this morning; behind the PG27UQ is the Samsung 65" Q9. This monitor now looks like a mini Q9! Except it has much better viewing angles due to the IPS (AHVA) panel, and because the monitor has FALD, I see no IPS glow looking at the display at an angle, which is amazing. With the AG filter removed, this monitor reminds me of my old Sony FW900, which is a good thing!!


----------



## CallsignVega

My trick is to cover the damp paper towels with saran wrap, that way the water doesn't evaporate. Works great. 

Interesting about 3 hours and no glue residue left behind. They must have changed the glue formula.

Also interesting about the quality of the components that went into the new HDR G-sync module. That is some serious hardware for a monitor and makes the price not seem so crazy anymore. But everyone wants everything on the cheap these days...

And the ASUS has two fans, which is probably why it was louder than the Acer, with its single larger fan, in my side-by-side comparison.


----------



## Blackvette94

CallsignVega said:


> My trick is to cover the damp paper towels with saran wrap, that way the water doesn't evaporate. Works great.
> 
> Interesting about 3 hours and no glue residue left behind. They must have changed the glue formula.


They must have; this is the first display I have done that only took 3 hrs of damp towels. You are going to be in love with this display with the AG filter removed! Before I removed it, I was not blown away by the monitor because of the AG filter. I could never be blown away by a display that has this over a pristine glass screen:


----------



## Vipu

Thanks for all the answers; still one slightly off-topic question.
How similar are screens usually? Can I assume almost every screen has this coating applied the same way and removable the same way?
I want to try it first on my 2nd screen and then decide if I want to try it on my main screen; my 2nd screen is an LG W2600HP-BF.
Also, how do you know how long is enough before you try to remove it so the glue doesn't stay? What if some of it stays? Just clean it with water?
And lastly, how do you clean the screen after removing the coating? Just like normal, with a barely damp cloth?


----------



## kot0005

https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea

*** these monitors have 3GB of DDR4-2400 RAM


----------



## Glerox

Blackvette94 said:


> Here are the pics of removing the anti glare on the PG27UQ! By far the most difficult monitor or TV I have ever taken apart, and I have done maybe 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


It looks awesome! I would like to do it but don't have the courage to void a 3-year warranty :S


----------



## ToTheSun!

HyperMatrix said:


> New review up: https://www.pcper.com/reviews/Graph...144Hz-G-SYNC-Monitor-True-HDR-Arrives-Desktop
> 
> Important takeaways: These monitors use a $2600 Altera FPGA module. Not a typo. Check here: https://www.mouser.com/ProductDetai...oXUAKC9nfNM2xfTloEqocAIMvY8Nr6a3AA3td6MoL6w==
> 
> Also has 3GB ddr4 memory on board.
> 
> Please go back to telling me how this $2k monitor with FALD, GSYNC, Quantum Dot, 4K, and 144Hz is overpriced again. Or how FreeSync/VRR is just as good as GSYNC which is overpriced only because Nvidia are greedy and AMD are white knights.


It is true that they must have bought them at a lower price, bulk and all, but it is still surprising to me. I didn't expect the bulk of the cost to be in that single component. The price makes much more sense now. It also says something about the possibility of the tech's cost decreasing in future models, which is encouraging for the slightly more disposable-income-challenged enthusiasts.


----------



## zhazha

I got mine 2 days ago, and I've used an LG OLED for more than a year. So far my feeling is that the FALD panel is better than OLED for HDR when it comes to the bright parts of the image; for the darker parts, though the PG27UQ is way better than any LCD I have seen, it still can't match OLED at all. Since the difference in dark detail is less eye-catching to me than the brightness when gaming, I'd rather game on the LCD panel and watch Netflix on my LG OLED.


----------



## Baasha

Can anyone confirm whether this monitor is SST or MST? 

I was damn sure it was SST since it does 4K @ 144Hz with one DP 1.4 cable but I'm not sure why I'm getting the NVCP warning message. 

PLEASE HELP!


----------



## Exilon

What message?


----------



## CallsignVega

These monitors are single stream transport over a single DP 1.4 connection that is maxed out. SLI has nothing to do with how the monitor communicates to the GPU, so not really sure what the question is?


----------



## Baasha

Exilon said:


> What message?





CallsignVega said:


> These monitors are single stream transport over a single DP 1.4 connection that is maxed out. SLI has nothing to do with how the monitor communicates to the GPU, so not really sure what the question is?


Ever since Pascal came out, Nvidia gimped SLI beyond 2-way. To get around that hurdle, you can tweak the settings (driver/Inspector) and use another SLI bridge to make 3- or 4-way SLI work (with Pascal GPUs). I've been running 4-way on Pascal since Aug. 2016 on Titan X, Titan Xp, and 1080 Ti.

If NVCP shows the message (see pic) "A higher performance SLI bridge can improve your experience," it means that either the monitor is MST or something else is wrong - this prevents 3- or 4-way SLI from working well in games.

The 4K OLED monitor and every other SST monitor I've used (Dell P2715Q, or even the 1440p 144Hz ROG Swift PG278Q) do NOT show this message in NVCP, and so scaling in 3- or 4-way SLI works really well.

There is some setting in this monitor (my hunch is the HDR module?) that's causing this to happen if it's an SST monitor. There should be no reason 3- or 4-way SLI shouldn't work well (see pics).

PG27UQ - NVCP MESSAGE

P2715Q - NO NVCP MESSAGE

Scaling in 4 Way SLI:


----------



## Aussiejuggalo

So... how many of you are actually going to buy these things?


----------



## CallsignVega

Baasha said:


> Ever since Pascal came out Nvidia gimped SLI beyond 2-Way. To get around that hurdle, you can tweak the settings (driver/inspector) and use another SLI bridge to make 3 or 4 way SLI work (with Pascal GPUs). I've been running 4-Way on Pascal since Aug. 2016 on both Titan X, Titan Xp, and 1080 Ti.
> 
> If NVCP shows the message (see pic) "A higher performance SLI bridge can improve your experience," it means that either the monitor is MST or something else is wrong - this prevents 3 or 4 Way SLI to work well in games.
> 
> The 4K OLED monitor or any other SST monitor I've used (Dell P2715Q or even the 1440P 144Hz RoG Swift PG278Q) do NOT have this message in NVCP and so scaling in 3 or 4 way SLI works really well.
> 
> There is some setting in this monitor (my hunch is the HDR module ?) that's causing this to happen if it's an SST monitor. There should be no reason 3 or 4 way SLI shouldn't work well (see pics).


It's because the high-bandwidth SLI bridges only support 2-way SLI. You are running the old/slower bridges in 3/4-way SLI, which has to send more information over the PCI-E bus. Not to mention you are limited to the slowest card's PCI-E bus speed, which is 8x 3.0 in a 4-way setup. You are literally double gimped. Running 4K at 144 Hz is 1,195 million pixels per second; the other monitors you mention are only around 530 million pixels per second.

My suggestion is to get a 2-way high-bandwidth SLI bridge and run two cards in their own 16x PCI-E slots for the optimal experience. 3/4-way SLI is virtually worthless on a 4K 144 Hz monitor with G-Sync in today's games.
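Those pixel-rate figures can be sanity-checked with a quick back-of-the-envelope calculation. A minimal sketch (active pixels only; real display timings add blanking overhead on top):

```python
# Rough active-pixel throughput comparison for the modes discussed above.
# Blanking intervals are ignored, so these are active-pixel rates only.
def pixel_rate(width, height, refresh_hz):
    """Pixels per second for a given display mode."""
    return width * height * refresh_hz

uhd_144 = pixel_rate(3840, 2160, 144)   # PG27UQ at 144 Hz
qhd_144 = pixel_rate(2560, 1440, 144)   # e.g. PG278Q at 144 Hz

print(f"4K @ 144 Hz   : {uhd_144 / 1e6:.0f} Mpx/s")   # ~1194 Mpx/s
print(f"1440p @ 144 Hz: {qhd_144 / 1e6:.0f} Mpx/s")   # ~531 Mpx/s
```

That matches the roughly 1,195 vs. 530 million pixels per second quoted above, i.e. the PG27UQ pushes more than twice the pixel throughput of a 1440p 144 Hz panel.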


----------



## Baasha

CallsignVega said:


> It's because the high bandwidth SLI bridges are only 2 way SLI. You are running the old/slower bridges in 3/4 way SLI that has to send more information over the PCI-E bus. Not to mention you are limited to the slowest cards PCI-E bus speed, which is 8x 3.0 in a 4-way setup. You are literally double gimped. Running 4K at 144 Hz is 1,195 Million pixels per second, the other monitors you mention are only 530 Million pixels per second.
> 
> My suggestion is to get a 2-way high bandwidth SLI bridge and run two cards in their own 16x PCI-E slots for the optimal experience. *3/4 way SLI is virtually worthless on a 4K 144 Hz monitor with G-Sync in today's games.*


Yes, this is what I'm seeing as well after playing around with the settings for the past couple of days.

Sigh... the only issue is even 2 cards don't seem to hit 144Hz in 4K in most games!

GPU usage is only around 70 - 80% with both GPUs and the FPS is usually ~ 100 - 120. Shouldn't the two cards be pegged at 99% constantly since 144hz is quite hard to hit at 4K?


----------



## kot0005

I made a cyberpunk template for Light in Motion; the 3D print failed because the text was way too small, so I just printed it on a normal printer using adhesive film and here it is..

https://imgur.com/a/9hYONXp


----------



## CallsignVega

No, could be a ton of things: CPU bottlenecks, crappy SLI scaling/profiles, drivers. There are always multiple inefficiencies with SLI.

One reason I went with a Titan V and overclocked it hard: single-GPU fire and forget, and utter smoothness. Granted, two Xp's in SLI with a correctly working profile may give 4K 120 FPS, but I'd wager the smoothness of a single GPU giving 4K 90 FPS makes up for a lot of it. On top of that, seeing GPU usage on multiple cards doesn't actually mean you are getting great SLI scaling or even higher FPS at all. You have to go into each game with SLI active and without and crunch the numbers.
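"Crunching the numbers" on SLI scaling boils down to a simple ratio of measured frame rates. A minimal sketch; the function name and the FPS figures are hypothetical, not from any real benchmark run:

```python
# Compare average FPS from the same benchmark scene with SLI off vs. on.
def sli_scaling(fps_single, fps_sli):
    """Effective speedup contributed by the second GPU (1.0 = no gain)."""
    return fps_sli / fps_single

# Hypothetical measurements:
print(sli_scaling(60, 105))  # 1.75 -> decent scaling
print(sli_scaling(60, 66))   # 1.1  -> second GPU barely helping
```

Anything near 1.0 means the second card is mostly adding heat and microstutter rather than frames, which is exactly the case the post above warns about: high GPU usage on both cards does not guarantee a ratio much above 1.0.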


----------



## Glerox

Baasha said:


> Yes, this is what I'm seeing as well after playing around with the settings for the past couple of days.
> 
> Sigh... the only issue is even 2 cards don't seem to hit 144Hz in 4K in most games!
> 
> GPU usage is only around 70 - 80% with both GPUs and the FPS is usually ~ 100 - 120. Shouldn't the two cards be pegged at 99% constantly since 144hz is quite hard to hit at 4K?


Sorry Baasha you have no choice but to buy two Titan Vs with HB bridge


----------



## Baasha

CallsignVega said:


> No could be a ton of things. CPU bottlenecks, crappy SLI scaling/profiles, drivers. There are always multiple inefficiencies with SLI.
> 
> One reason I went with a Titan V and overclocked it hard. Single GPU fire and forget and utter smoothness. Granted two Xp's in SLI if the profile is working correctly may give 4K 120 FPS, but I'd wager the smoothness of a single GPU giving 4K 90 FPS makes up for a lot of it. On top of it, if you see GPU usage on multiple cards, it doesn't actually mean you are getting great SLI scaling or even higher FPS at all. You have to go into each game with SLI active and without and crunch the numbers.


So far the gameplay is very smooth with 2 GPUs but the OSD is almost a curse since it bothers me that it doesn't show 99% usage and I get < 144fps. Frame times are very good however (~ 5ms).

Yea I tried the Titan V for a few months but couldn't bear the low FPS since I was used to ~ 150+fps maxed out in 4K in almost every game where SLI worked. Playing at sub-100fps with a 144Hz monitor seems to defeat the purpose though.



Glerox said:


> Sorry Baasha you have no choice but to buy two Titan Vs with HB bridge


Can't SLI Titan V. Had one and sold it - DX12 games were amazing with the Volta architecture - Wolfenstein especially was fantastic.

I guess SLI is really in its death throes. Sigh... it was a good 8 years of 4 Way SLI though.


----------



## l88bastar

Baasha said:


> So far the gameplay is very smooth with 2 GPUs but the OSD is almost a curse since it bothers me that it doesn't show 99% usage and I get < 144fps. Frame times are very good however (~ 5ms).
> 
> Yea I tried the Titan V for a few months but couldn't bear the low FPS since I was used to ~ 150+fps maxed out in 4K in almost every game where SLI worked. Playing at sub-100fps with a 144Hz monitor seems to defeat the purpose though.
> 
> 
> 
> Can't SLI Titan V. Had one and sold it - DX12 games were amazing with the Volta architecture - Wolfenstein especially was fantastic.
> 
> I guess SLI is really in its death throes. Sigh... it was a good 8 years of 4 Way SLI though.


Quadfire Battlefield 3 on a 5x1 surround setup was the epitome of multi-GPU for me... it was a ton of fun to build such a beast system, but a ton of headaches too. As I have gotten older I find that I appreciate simpler setups that don't give me any hassle and let me spend precious time on actual gaming!


----------



## CallsignVega

Ya after doing about a dozen crazy setups in my life I no longer have that patience anymore. I'm lazier now as I've gotten older haha, just give me the fastest single GPU and the best gaming monitor in the world and I'm satiated these days!

Now if only I could get my hands on the leather jacket CEO edition...


----------



## MiniZaid

Blackvette94 said:


> Here are the pics of removing the anti glare on the PG27UQ! By far the most difficult monitor or TV I have ever taken apart, and I have done maybe 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


If you have any lighting in the room, wouldn't you see a reflection? I feel like that would be a bit annoying.
And is this monitor using the halfway solution, or is the film really strong?


----------



## kot0005

CallsignVega said:


> Ya after doing about a dozen crazy setups in my life I no longer have that patience anymore. I'm lazier now as I've gotten older haha, just give me the fastest single GPU and the best gaming monitor in the world and I'm satiated these days!
> 
> Now only if I could get my hands on the leather jacket CEO edition...


lol


----------



## indstri

So I have a PG27UQ incoming as well. Any verdict vs. the X27 besides Vega's take? Not sure how many folks are in a position to compare the two, or whether anything ever came of the certification the ASUS panel has vs. the Acer.


----------



## sblantipodi

What about the fan noise? Is it really audible?


----------



## bee144

Baasha said:


> CallsignVega said:
> 
> 
> 
> It's because the high bandwidth SLI bridges are only 2 way SLI. You are running the old/slower bridges in 3/4 way SLI that has to send more information over the PCI-E bus. Not to mention you are limited to the slowest cards PCI-E bus speed, which is 8x 3.0 in a 4-way setup. You are literally double gimped. Running 4K at 144 Hz is 1,195 Million pixels per second, the other monitors you mention are only 530 Million pixels per second.
> 
> My suggestion is to get a 2-way high bandwidth SLI bridge and run two cards in their own 16x PCI-E slots for the optimal experience. *3/4 way SLI is virtually worthless on a 4K 144 Hz monitor with G-Sync in today's games.*
> 
> 
> 
> Yes, this is what I'm seeing as well after playing around with the settings for the past couple of days.
> 
> Sigh... the only issue is even 2 cards don't seem to hit 144Hz in 4K in most games!
> 
> GPU usage is only around 70 - 80% with both GPUs and the FPS is usually ~ 100 - 120. Shouldn't the two cards be pegged at 99% constantly since 144hz is quite hard to hit at 4K?
Click to expand...

My dual Titan X (Pascal) cards would get bottlenecked at 65-75% usage each when playing BF1 at 2560x1440 (using the HB bridge). I was using my 4960X OC'd to 4.7GHz and only getting ~100 FPS.

I just upgraded to an 8086K, and at stock clocks with everything else the same, my GPUs never go below 92% usage each. Frame rates are now 135-144 FPS.

Thus it appears I was hitting a serious CPU bottleneck.


----------



## Baasha

l88bastar said:


> Quadfire Battlefield 3 on a 5x1 surround setup was the epoch of multi-gpu for me....was a ton of fun to build such a beast system, but also a ton of headaches too.... as I have gotten older I find that I appreciate simpler setups that don't give me any hassle and let me spend precious time on actual gaming and pronatizing!





CallsignVega said:


> Ya after doing about a dozen crazy setups in my life I no longer have that patience anymore. I'm lazier now as I've gotten older haha, just give me the fastest single GPU and the best gaming monitor in the world and I'm satiated these days!
> 
> Now only if I could get my hands on the leather jacket CEO edition...


Yup - the Titan V experience was amazing in that it's just 'plug-and-play.' No bs tweaking the settings, optimizations, black screens, etc.

Are you guys getting a silver screen when exiting certain HDR-enabled games? I'll try and take a screenshot next time but for certain games when I exit, the screen just turns silver about 3/4ths horizontally and I have to 'sign out' and log back in to get rid of it.


----------



## Baasha

bee144 said:


> My dual Titan X (Pascal) cards would get bottlenecked at 65-75% usage each when playing BF1 at 2560x1440 (using the HB bridge). I was using my 4960X OC'd to 4.7GHz and only getting ~100 FPS.
> 
> I just upgraded to an 8086K, and at stock clocks with everything else the same, my GPUs never go below 92% usage each. Frame rates are now 135-144 FPS.
> 
> Thus it appears I was hitting a serious CPU bottleneck.


Interesting. I have the 6950X @ 4.30GHz, so I'm pretty sure 2 GPUs are not being bottlenecked (?).


----------



## HyperMatrix

Baasha said:


> Interesting. I have the 6950X @ 4.30Ghz and so I'm pretty sure 2 GPUs are not being bottlenecked (?).


Depends on the game. I have a 6950X at 4.375GHz, and while it's more than enough for well-threaded games, it won't match a 5GHz+ 4/6-core CPU from a more modern generation in anything more single-threaded in design (i.e. DX11). The only way to get past DX11 draw-call limitations is very high clock rates on a few cores.


----------



## moonbogg

My old 980Ti SLI setup got around 90ish FPS in BF1. My single 1080Ti gets between 110-135ish. I think SLI just sucks. I'll never go back to it and I ran with SLI since Nvidia started using it with the Geforce 6 series. Not the 600 series, but the 6 series. I'm kind of old. SLI is dead IMO and offers few advantages.


----------



## Leopardi

moonbogg said:


> My old 980Ti SLI setup got around 90ish FPS in BF1. My single 1080Ti gets between 110-135ish. I think SLI just sucks. I'll never go back to it and I ran with SLI since Nvidia started using it with the Geforce 6 series. Not the 600 series, but the 6 series. I'm kind of old. SLI is dead IMO and offers few advantages.


I never saw the micro-stutter, driver issues, and lacking support for games appealing.


----------



## l88bastar

Leopardi said:


> I never saw the micro-stutter, driver issues, and lacking support for games appealing.


I totally agree...however, look how awesome my old 5x1 portrait + 7970 Quadfire Lightning setup looked!
Forget about practical use....just look how marvelous that looked!!!


----------



## Glerox

Can somebody share the driver and .ICM profile from the PG27UQ CD? I don't have a CD drive and can't find them on the ASUS website.


----------



## kot0005

Glerox said:


> Can somebody share the driver and .ICM profile on the PG27UQ cd? Don't have a CD player and can't find it on ASUS website.


Can't upload; complain to the OCN admins. I tried the settings and they don't change anything visually... may not be working for me.


----------



## kot0005

https://drive.google.com/open?id=1VftUSZpEE4R0wNNQ1igCZglzsDc4xM6g


----------



## Glerox

kot0005 said:


> https://drive.google.com/open?id=1VftUSZpEE4R0wNNQ1igCZglzsDc4xM6g


Thanks!!! will try it tuesday on D-day


----------



## jesyjames

Sorry if this has been discussed; I haven't seen any mention of it except Lim's Cave's X27 review, where he briefly touched on the black crush issue. If you are running 4K 144Hz (non-HDR), the monitor switches to YCbCr 4:2:2 mode because of bandwidth restrictions. Also hugely significant: because it's no longer operating in RGB mode, the output dynamic range switches to limited, severely crushing blacks. I noticed in Battlegrounds that I could barely see inside buildings. I then loaded up some test patterns, and sure enough the black crush is real and basically makes the monitor useless in 144Hz mode. Is there any workaround for this?


----------



## Sichtwechsel86

jesyjames said:


> Sorry if this has been discussed; I haven't seen any mention of it except Lim's Cave's X27 review, where he briefly touched on the black crush issue. If you are running 4K 144Hz (non-HDR), the monitor switches to YCbCr 4:2:2 mode because of bandwidth restrictions. Also hugely significant: because it's no longer operating in RGB mode, the output dynamic range switches to limited, severely crushing blacks. I noticed in Battlegrounds that I could barely see inside buildings. I then loaded up some test patterns, and sure enough the black crush is real and basically makes the monitor useless in 144Hz mode. Is there any workaround for this?


Maybe the upcoming firmware update will fix this...

In Germany and Spain, Asus has told its retailers to stop selling the PG27UQ, and it is even possible that some customers will receive an email asking them to send units back so Asus can update the firmware.

As far as I know - and some other users over in the Asus forum have confirmed it - Asus is working on that FW update right now, and it is not only a fix for Aura Sync but also something related to image quality (I don't have more specific info).

Other reports say that Asus is inspecting its current FW and trying to figure out whether the issues affect SOME or ALL units.

For Germany I can confirm that the biggest sellers were told NOT to send out their stock units and to wait for the FW update!

Asus itself has NOT officially communicated what's going on, but I recommend every owner contact Asus service and ask for further information on when and how the new FW will roll out!

Potential owners on the edge of ordering I would advise to wait until it's clarified, so you don't risk getting a unit with older FW and then having to return it for the update...

----------



## bee144

So when playing BF1 I have encountered a strange issue. With HDR off, I hover right around 98 FPS (matching the 98Hz mode) and both GPUs are nearly maxed out. The second I turn on HDR in the BF1 settings, my FPS drops to 70 (as if it's being limited) and my GPU usage drops to ~75% on both cards.

Any idea why this is happening?


----------



## Monstieur

You should really only run this monitor in 8-bit RGB mode even for HDR.


----------



## bee144

Really? I’ll have to try that as I had 10 bit selected in Nvidia control panel


----------



## profundido

Baasha said:


> Yes, this is what I'm seeing as well after playing around with the settings for the past couple of days.
> 
> Sigh... the only issue is even 2 cards don't seem to hit 144Hz in 4K in most games!
> 
> GPU usage is only around 70 - 80% with both GPUs and the FPS is usually ~ 100 - 120. Shouldn't the two cards be pegged at 99% constantly since 144hz is quite hard to hit at 4K?


No, the process of SLI involves an algorithm that puts a severe bottleneck on the CPU. I second everything the others replied; just like them and you, I ran into the limitations of this generation's poor SLI support (driver- and game-wise). I used to have a [email protected] setup too and saw a great increase when replacing it with a [email protected] + [email protected] setup. Sad as it may seem, single-thread core clock is still king in 2018 for most content, and in the newest optimized multicore content it doesn't matter, since no game saturates all 8 hyperthreads at 5+ GHz on a 4-core CPU anyway.

This CPU's ring architecture (as opposed to mesh), combined with high clock and low latency, does miracles for the FPS of both old and new games. The fact that I lost 2x16 PCIe, which became 2x8 PCIe on the new board, made zero negative impact, as demonstrated last year in Steve Burke's SLI scaling research.

And by the way: in the picture here you can still see my 2-way Titan Xp SLI. By now I have removed one and I'm running off one card only. Not only did the overhead cause low FPS gains (and even losses!!) in most games, but the microstuttering also made games look much less fluid than with one card. I will reuse the second Titan in another PC, and I've run one card at max OC since the end of last year. SLI is officially on its dying breath... we must accept it.


----------



## HyperMatrix

Sichtwechsel86 said:


> Maybe the upcoming firmware update will fix this...
> 
> In Germany and Spain, Asus has told its retailers to stop selling the PG27UQ, and it is even possible that some customers will receive an email asking them to send units back so Asus can update the firmware.
> 
> As far as I know - and some other users over in the Asus forum have confirmed it - Asus is working on that FW update right now, and it is not only a fix for Aura Sync but also something related to image quality (I don't have more specific info).
> 
> Other reports say that Asus is inspecting its current FW and trying to figure out whether the issues affect SOME or ALL units.
> 
> For Germany I can confirm that the biggest sellers were told NOT to send out their stock units and to wait for the FW update!
> 
> Asus itself has NOT officially communicated what's going on, but I recommend every owner contact Asus service and ask for further information on when and how the new FW will roll out!
> 
> Potential owners on the edge of ordering I would advise to wait until it's clarified, so you don't risk getting a unit with older FW and then having to return it for the update...


I'd rather have one to use while Asus figures out if a replacement is needed. Then simply swap it in store if you bought locally, or use Asus's rapid replacement warranty program, which covers express shipping both ways. There will be limited availability of this monitor; both Newegg and Best Buy here in Canada went from preorder status to backorder/unavailable.


----------



## Glerox

Monstieur said:


> You should really only run this monitor in 8-bit RGB mode even for HDR.


When you turn HDR on in Windows, it automatically changes to 10-bit, so I don't understand how you could use HDR with 8-bit RGB.


----------



## profundido

Glerox said:


> When you turn HDR ON in Windows, it automatically changes to 10bit so i don't understand or you could use HDR with 8bit RGB.


First select "Let Nvidia control settings", then select 120Hz (not 144!!) 8-bit RGB Full, and finally enable HDR in Windows. It will not change to 10-bit, or change at all.


----------



## Glerox

HyperMatrix said:


> I'd rather have one to use while Asus figures out if a replacement is needed. Then simply swap in store if you bought locally, or use Asus's rapid replacement warranty program that covers express shipping both ways. There will be limited availability of this monitor. Both Newegg and BestBuy here in Canada went from Preorder status to backorder/unavailable.


Do you know if the PG27UQ is covered by that rapid replacement warranty program?


----------



## Glerox

jesyjames said:


> Sorry if this has been discussed, haven't seen any mention of it except for Lim's Cave review where he briefly touched on the black crush issue in his x27 review. If you are running 4k 144hz(non-hdr), the monitor switches to YCbCr422 mode because of bandwidth restrictions. Also hugely significant: because it's no no longer operating in RGB mode, the output dynamic range switches to limited thus severely crushing blacks. I noticed in battlegrounds that I could barely see inside of buildings. I then loaded up some test patterns and sure enough the black crush is real and basically makes the monitor useless in 144hz mode. Is there any work around for this?


I agree, it's funny that you can play HDR at 144Hz without crushing blacks but not SDR...
I want to play PUBG at 144Hz!!!


----------



## CallsignVega

Ya, that is one of the downsides I noticed on both of these monitors: 144 Hz can really only be used in HDR mode, as in SDR mode the "limited" dynamic range clips blacks. Use 120 Hz for SDR.


----------



## Monstieur

Black crush will not occur if everything is running correctly. YCbCr is always "limited range", but in 10-bit mode the luma range is 64-940 (chroma runs 64-960), which will not cause increased banding the way 8-bit limited range does. In 10-bit YCbCr mode the display should treat code 64 as black and 940 as white. If there is black crush, there's a problem somewhere in the chain.
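For reference, standard narrow-range ("limited") video puts 10-bit luma between codes 64 and 940, with chroma extending to 960. A small, purely illustrative sketch of the level mapping, and of what happens numerically when one side of the chain interprets limited-range codes as full range:

```python
# Limited-range ("video levels") code values vs. normalized signal.
# Black/white codes scale with bit depth: 16/235 at 8-bit -> 64/940 at 10-bit (luma).
def limited_to_normalized(code, bits=10):
    black = 16 << (bits - 8)           # 64 in 10-bit
    white = 235 << (bits - 8)          # 940 in 10-bit
    return (code - black) / (white - black)

# A display that handles limited range correctly treats code 64 as black:
print(limited_to_normalized(64))       # 0.0
print(limited_to_normalized(940))      # 1.0

# Mismatch: if the display expects full range (0-1023), code 64 renders as
# ~6% grey and the image looks washed out; conversely, a full-range source
# feeding a limited-range display crushes every code below 64 to black.
print(round(64 / 1023, 3))             # 0.063
```

This is why the range setting has to agree at both ends of the chain: the crush reported at 144 Hz is exactly the second mismatch, not something inherent to limited-range signalling.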


----------



## Chocobo

27" is cool; 34" is just too big, get a TV instead.

DPI is the next thing for monitors, so increasing the screen size wouldn't make any sense at all.

What would you do with an 8K monitor? Increase it to 68"? Makes no sense at all, right?


----------



## CallsignVega

Monstieur said:


> Black crush will not occur if everything is running correctly. YCbCr is always "Limited Range", but in 10-bit mode it's 64-960 and will not cause increased banding like in 8-bit mode. In 10-bit YCbCr mode the display should treat 64 as black and 960 as white. If there is black crush, there's a problem somewhere in the chain.


In SDR, in any YCbCr mode (which are all "limited"), these monitors do not auto-adjust the black levels the way TVs do. It doesn't matter whether the color is set to 8- or 10-bit. Only turning on HDR fixes the problem. Any RGB "Full" setting works fine/normal for both SDR and HDR content.


----------



## Sichtwechsel86

Glerox said:


> When you turn HDR on in Windows, it automatically changes to 10-bit, so I don't understand how you could use HDR with 8-bit RGB.


No, it doesn't...

It changes the colorspace from Rec. 709/sRGB to Rec. 2020, but the bit depth stays as you set it in the Nvidia Control Panel!

If you have set 120Hz and 8-bit @ RGB/4:4:4 chroma in NCP and then turn on HDR via Windows, it doesn't automatically use 10-bit color...

The only downside so far: when watching UHD movies, the 10-bit 4:2:2 colors of the H.265 codec are reduced and adjusted to fit the 8-bit range, so you will lose color shades.

Whether that is clearly visible, I don't know...

But during my tests I could definitely see a difference between 4K 24Hz 10-bit RGB HDR and 4K 24Hz 8-bit RGB HDR, using good HDR samples like 'The Revenant', the 'Sony Camp Fire' demo, etc...

So I recommend just using 4K 120Hz 8-bit RGB for SDR and HDR in games,
and 4K 24Hz 8-bit RGB SDR (for movies) and 4K 24Hz 10-bit RGB HDR (for UHD movies).


----------



## Sichtwechsel86

CallsignVega said:


> In SDR in any YCbCr mode which are all "limited", these monitors do not auto adjust the black levels like TV's do. Doesn't matter if the color is set to 8 or 10 bit. Only turning on HDR fixes the problem. Any RGB "Full" setting works fine/normal for both SDR and HDR content.


What I don't get is how black crush with 'limited range' is even possible...

Full means 0-255,
limited 16-235.

So if source and display are both limited or both full, everything is fine...

If the source is full and the display is limited, you get crush.
If the source is limited and the display is full, you get the typical washed-out look.

Could it be that the GPU driver is outputting full range instead of limited while the display is expecting limited range? That would explain black crush...

I don't get how the display can be handling limited signals and still show black crush...

Sorry for asking, I just don't understand the background of this problem...
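The four source/display combinations above can be sketched numerically. A minimal Python illustration, assuming standard 8-bit BT.709 levels (the helper names are made up for this sketch):

```python
# Sketch of 8-bit limited/full range mapping and the mismatch cases.

def full_to_limited(v):
    # full range 0-255 -> limited range 16-235
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # limited range 16-235 -> full range 0-255, clipping out-of-range codes
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Matched source/display: near-black detail round-trips intact
assert limited_to_full(full_to_limited(20)) == 20

# Mismatch: the display expects limited but is fed full-range values.
# Every code at or below 16 collapses to the same black -> "black crush".
crushed = [limited_to_full(v) for v in range(0, 17)]
print(crushed)  # all zeros: shadow detail from codes 0-16 is gone
```

The reverse mismatch (limited signal expanded as if it were full) lifts code 16 up to visible grey, which is the washed-out look described above.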


----------



## Monstieur

Sichtwechsel86 said:


> The only downside so far:
> when watching UHD movies, the 10-bit 4:2:2 colors of the H.265 codec are reduced and adjusted to fit the 8-bit range,
> so you will lose color shades.
> 
> Whether that is clearly visible, I don't know...
> 
> But during my tests I could definitely see a difference between 4K 24Hz 10-bit RGB HDR and 4K 24Hz 8-bit RGB HDR
> 
> So I recommend just using 4K 120Hz 8-bit RGB for SDR and HDR in games...
> and 4K 24Hz 8-bit RGB SDR (for movies) and 4K 24Hz 10-bit RGB HDR (for UHD movies)


The colour will degrade if the game / application does not render to a 10-bit surface. I suspect that some software will only render to an 8-bit surface if the display is in 8-bit mode.

You can either render to a 10-bit surface and let the Nvidia driver perform dithering down to 8-bit, or use a player (MPC-HC + madVR) that renders to 8-bit but performs its own dithering. madVR's dithering with 8-bit output actually looks better than a native 10-bit output on most displays.

If you output 8-bit without dithering, it will cause banding.
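The dithering trade-off can be shown in a few lines. This is not madVR's actual algorithm, just a minimal sketch with made-up helper names using simple rounding noise:

```python
import random

# Why truncating 10-bit to 8-bit without dithering causes banding,
# and how noise before truncation preserves the average level.

def truncate_10_to_8(v10):
    return v10 >> 2  # drop two LSBs: 4 adjacent 10-bit codes merge into one

def dither_10_to_8(v10, rng):
    # add noise in [0, 4) before truncating so the rounding direction varies;
    # averaged over many pixels, 8-bit output approximates the 10-bit value
    return min(255, (v10 + rng.randrange(4)) >> 2)

rng = random.Random(0)
v10 = 513  # a 10-bit level sitting between 8-bit codes 128 and 129
plain = truncate_10_to_8(v10)  # always 128: the fractional part is lost
avg = sum(dither_10_to_8(v10, rng) for _ in range(10000)) / 10000
print(plain, round(avg, 2))  # dithered average lands near 513/4 = 128.25
```

Without the noise, every 10-bit level in a smooth gradient snaps to the same 8-bit code over wide runs, which is exactly the banding mentioned above.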


----------



## HyperMatrix

Glerox said:


> Do you know if the PG27UQ is covered by that rapid replacement warranty program?


"ASUS Rapid Replacement 3 Year warranty with 1 year ZBD (zero bright dots) warranty and 2-way free shipping" according to Newegg.


----------



## kot0005

Wow, so apparently we have the worst quantum dot implementation... I can't wait for emissive quantum dots with microLEDs...

Good video to watch:

According to the guy, microLEDs are not even close because of the defect rate.


----------



## CallsignVega

Sichtwechsel86 said:


> what i don't get is, how blackcrushing with 'limited range' is even possible...
> 
> Full means 0-255
> limited 16-235
> 
> so if source and display are both limited or both Full - everything is fine...
> 
> if source is Full and display is limited - you get crush
> if source is limited and display is full - you get the typical washed out look
> 
> could it be, that the GPU-driver instead of outputting limited range is outputting Full range -
> and the display is expecting limited range - that would explain black-crush...
> 
> I don't get, how the display is handling limited signals to show black-crush...
> 
> sorry for asking - i just don't understand the background of this problem...




Yes, a TV has a setting called "black level", like on my OLED. It senses the dynamic range in "auto" mode and displays blacks appropriately. You can also force it with the "low" or "high" settings. More info:

https://www.reddit.com/r/OLED/comments/7oah8d/lg_oled_b7_black_level/

For whatever reason, these monitors (or the drivers) are not auto-sensing the signal in SDR. HDR works fine though.

For SDR, the only solution I've found is to run 120 Hz RGB Full 8-bit.


----------



## bee144

profundido said:


> First select "Let Nvidia control settings" and then select 120hz (not 144!!) 8-bit RGB Full settings and finally enable HDR in windows. It will not change to 10bit or change at all


Thank you, this worked.

Next question, my monitor is currently in racing mode. How do I change it to another profile?


----------



## Sichtwechsel86

CallsignVega said:


> Yes a TV has a setting called "black level" like on my OLED. It senses the dynamic range in "auto" mode and displays blacks appropriately. You can also force it with the "low" or "high" settings. More info:
> 
> https://www.reddit.com/r/OLED/comments/7oah8d/lg_oled_b7_black_level/
> 
> For whatever reason, these monitors or through the drivers are not auto sensing the signal in SDR. HDR works fine though.
> 
> For SDR, the only solution I've found is to run 120 Hz RGB Full 8-bit.


But the driver is outputting limited for sure, as I verified when connecting my laptop to my OLED and setting 4K 60Hz 10-bit 4:2:2 SDR via HDMI 2.0...

Maybe that is the bug in the firmware: the monitor auto-senses limited, but instead of mapping the values between 16 and 235 it has a raised minimum value of 20... 21... 22... or so. That would explain how black crush is even possible when fed a limited signal. If a limited signal were treated as full via auto-adjust, we would get washed-out, bleached colors; treated as limited (the right way), it would look as intended...


----------



## MiniZaid

profundido said:


> No, the process of SLI involves an algorithm that severely bottlenecks the cpu. I second everything the others replied and just like them and you I ran into the limitations of this generation's bad SLI support (driver/game wise). I used to have a [email protected] setup too and saw great increase when replacing it with a [email protected] + [email protected] setup. Sad as it may seem singlethread core clock is still king in 2018 for most content and the in the newest optimized multicore content it doesn't matter since no game bottlenecks all 8 hyperthreaded @5+ghz on a 4-core cpu anyway.
> 
> This cpu's ring architecture (as opposed to mesh) combined with high clock and low latency does miracles to the fps of both old and new games. The fact that I lost 2*16xPCIE which became 2*8xPCIE on the new board made zero negative impact as demonstrated last year in Steve Burke's SLI scaling research.
> 
> and by the way: in the picture here you still see my 2-way Titan Xp SLI. By now I have removed one and I'm running off one card only from now on. Not only did the overhead cause low FPS gain (and even loss!!) in most games, but in addition the microstuttering made games look much less fluent than with one card. I will reuse the second Titan in another PC, and I've run one card at max OC since the end of last year. SLI is officially on its dying breath... we must accept it.


I tested Battlefield 1 for SLI. Anything above 1080 SLI needs to be run x16/x16, or else you will get about 10-15% less FPS at the same clock speed. This was tested at 1440p,
in multiplayer, at the same spawn location on the same map.
Although from what I remember, 4K makes this difference less noticeable since GPU usage is drastically increased.


----------



## bee144

MiniZaid said:


> profundido said:
> 
> 
> 
> No, the process of SLI involves an algorithm that severely bottlenecks the cpu. I second everything the others replied and just like them and you I ran into the limitations of this generation's bad SLI support (driver/game wise). I used to have a [email protected] setup too and saw great increase when replacing it with a [email protected] + [email protected] setup. Sad as it may seem singlethread core clock is still king in 2018 for most content and the in the newest optimized multicore content it doesn't matter since no game bottlenecks all 8 hyperthreaded @5+ghz on a 4-core cpu anyway.
> 
> This cpu's ring architecture (as opposed to mesh) combined with high clock and low latency does miracles to the fps of both old and new games. The fact that I lost 2*16xPCIE which became 2*8xPCIE on the new board made zero negative impact as demonstrated last year in Steve Burke's SLI scaling research.
> 
> and by the way: in the picture here you still see my 2-way Titan Xp SLI. By now I have removed one and I'm running off one card only from now on. Not only did the overhead cause low FPS gain (and even loss!!) in most games, but in addition the microstuttering made games look much less fluent than with one card. I will reuse the second Titan in another PC, and I've run one card at max OC since the end of last year. SLI is officially on its dying breath... we must accept it.
> 
> 
> 
> i tested battlefield 1 for SLI. anything above a 1080 SLI needs to be ran x16/x16. Or else, you will get about 10-15% less fps at the same clock speed. This was tested at 1440p
> I tested at multiplayer at the same spawn location on the same map.
> Although from what I remember, 4k makes this difference less noticeable since GPU usage is drastically increased

This was going to be my next comment. I'm a BF die-hard, and I was testing BF1 with the PG27UQ.

Using 98Hz HDR, I was only able to hit 75 FPS with SLI (8086K stock and Titan X (Pascal) SLI) at ultra settings.

Using just one card I am able to hit 85-90 FPS at ultra. The CPU is at 30% usage the entire time.

SLI really has become a bag of hurt. Now I've got a $1,200 second GPU doing nothing...


----------



## Glerox

bee144 said:


> This was going to be my next comment. I’m a BF die hard and I was testing BF1 with the PG27UQ.
> 
> Using 98hz HDR, I was only able to hit 75 FPS with SLI (8086k stock and SLI Titan x (Pascal)) at ultra settings.
> 
> Using just one card I am able to hit 85-90 FPS with ultra. CPU is at 30% usage the entire time.
> 
> SLI really has become a bag of hurt. Now I’ve got a $1,200 second GPU doing nothing...


This is not normal, even in an x8/x8 config! I've tested BF1 multiple times with Titan X Pascal in SLI (x16/x16 on a 6850K). I'm pretty sure you have TAA on, and that crushes the SLI scaling. With everything on ultra and no AA I'm getting a stable 120 FPS at 4K. Can't wait to test on the PG27UQ, which should arrive tomorrow. FINALLY.


----------



## HyperMatrix

There appear to be availability issues. BestBuy.ca went to backordered status, even for pre-order customers, which sucks because I was supposed to have mine delivered tomorrow. I called them, and they said the manufacturer sent them fewer units than originally expected and that it will likely be another two weeks before stock becomes available; they took $100 off the price of the monitor. Newegg.ca also changed their availability date from June 22nd to July 16th. I now wish I hadn't changed my preorder from Newegg to BestBuy, because I would have received my unit by now. I wonder if any of this has to do with that whole firmware update mess that sellers in Europe were dealing with. After paying $1,880 USD for the monitor, at least the thought that the delay is due to a firmware fix makes me feel a bit better.


----------



## Glerox

FYI, it's in stock at canadacomputers if you want one now, which I'm sure you are.


----------



## bee144

Glerox said:


> This is not normal even in x8/x8 config! I've tested BF1 multiple times with TITAN X Pascal in SLI (in x16/x16 on 6850k). I'm pretty sure you have TAA ON and this crushes the SLI scaling. With everything on ultra and no AA i'm getting stable 120 FPS in 4K. Can't wait to test on the PG27UQ which should arrive tomorrow, FINALLY.


Ah! Thank you. I do believe I remember hearing about this limitation in the past. I'm at work right now but I'll be home in ~6 hours and will try then and report back.


----------



## Glerox

As already said:

"The problem is that the Asus PG27UQ is currently making a mistake with the YCbCr signal. The screen clips, and shades of dark gray are reproduced as the same black, called 'black crush'. The image gets too dark."

https://www.sweclockers.com/test/25...romisser-vid-144-hz-i-4k-upplosning/2#content

EDIT: They seem to say that black crush also happens in HDR and that you should stick to 98Hz for HDR. However, that is not what users in this thread have said, namely that you can use HDR at 144Hz for games. Strange.


----------



## CallsignVega

I'm not seeing some of the same results. HDR works fine in 120 Hz RGB Full:




This is 144 Hz 4:2:2 in SDR:




This is 144 Hz 4:2:2 in HDR:




It only crushes blacks in SDR. Crappy pics; the window blinds are open.


----------



## bee144

CallsignVega said:


> I'm not finding some of the same stuff. HDR works fine in 120 Hz RGB Full:
> 
> 
> 
> 
> This is 144 Hz 4:2:2 in SDR:
> 
> 
> 
> This is 144 Hz 4:2:2 in HDR:
> 
> 
> 
> 
> It only crushes blacks in SDR. Crappy pics, window blinds are open.


I thought we were told HDR would only work at 98Hz? I think it was Marshall himself over on the ROG forum.


----------



## CallsignVega

bee144 said:


> I thought we were told HDR would only work at 98Hz? I think it was Marshall himself over on the ROG forum.


Reference? The only reason to drop down to 98 Hz is if you want HDR at 4:4:4; everything else, including 10-bit color, is the same at HDR 144 Hz 4:2:2. It is somewhat of a moot point at the moment, though, because the HDR games out there are fairly modern and demanding, and basically no hardware will push well over 98 FPS at 4K anyway to really take advantage of it.

The thing I am disappointed in is the black crush at 144 Hz SDR, which forces you to use 120 Hz for SDR.
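The 98 Hz / 120 Hz / 144 Hz mode limits discussed in this thread fall straight out of link bandwidth. A rough back-of-the-envelope sketch, assuming DP 1.4 HBR3 with 8b/10b coding and counting active pixels only (real timings add blanking overhead, so the margins are actually tighter):

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, with 8b/10b coding -> 25.92 Gbit/s payload
HBR3_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10

def gbps(w, h, hz, bits_per_pixel):
    # active-pixel data rate in Gbit/s (no blanking)
    return w * h * hz * bits_per_pixel / 1e9

print(gbps(3840, 2160, 144, 24))  # 8-bit RGB @144Hz: ~28.7 -> doesn't fit
print(gbps(3840, 2160, 144, 20))  # 10-bit 4:2:2 @144Hz: ~23.9 -> fits
print(gbps(3840, 2160, 120, 24))  # 8-bit RGB @120Hz: ~23.9 -> fits
print(gbps(3840, 2160, 98, 30))   # 10-bit RGB 4:4:4 @98Hz: ~24.4 -> fits
```

This matches the observed behavior: full RGB tops out around 120 Hz, 144 Hz forces 4:2:2 subsampling, and 4:4:4 10-bit HDR is capped near 98 Hz.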


----------



## bee144

CallsignVega said:


> Reference? The only reason to drop down to 98 Hz is if you want HDR at 4:4:4, everything else including 10-bit color is the same at HDR 144 Hz 4:2:2. It is somewhat of a moot point though at the moment, because the HDR games out there are fairly modern and demanding and basically no hardware will push well over 98 FPS at 4K anyway to really take advantage of it.
> 
> The thing I am disappointed in is the black crush at 144 Hz SDR. Forcing you to use 120 Hz for SDR.


Ah, sorry for the confusion on my end. I meant that 4:4:4 HDR was limited to 98Hz.

Do you think there is a big difference between 4:4:4 and 4:2:2? I can tell the difference when browsing the web, but that's about it.
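Seeing the difference mostly in web browsing is what 4:2:2 predicts: chroma resolution is halved horizontally while luma stays full, which hurts single-pixel colored detail like text fringes far more than game or video content. A toy sketch (crude pair-averaging decimation model; the helper name is made up):

```python
# Toy model of 4:2:2: one chroma sample shared by each horizontal pixel pair.

def subsample_422(chroma_row):
    # average each pair of chroma samples, then duplicate the average back
    out = []
    for i in range(0, len(chroma_row), 2):
        pair = chroma_row[i:i + 2]
        avg = sum(pair) / len(pair)
        out.extend([avg] * len(pair))
    return out

# single-pixel-wide colored text edges: chroma alternates every pixel
row = [200, 50, 200, 50, 200, 50]
print(subsample_422(row))  # [125.0] * 6: the per-pixel color detail is gone
```

Luma is untouched, so brightness edges survive; only the color component smears, which is why photos and games look nearly identical while fine colored text shows fringing.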


----------



## acmilangr

Blackvette94 said:


> Here are the pics of removing the anti-glare on the PG27UQ! By far the most difficult monitor or TV I have ever taken apart, and I have done maybe 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> Significant clarity due to high ppi 4k at 27 inches
> Significant increase in brightness
> Contrast increase is substantial
> Blacks look liquid now and picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


Great work.
You have a PM, please check it.


----------



## kot0005

I can confirm what Vega said about black crush..


----------



## Blackvette94

CallsignVega said:


> I'm not finding some of the same stuff. HDR works fine in 120 Hz RGB Full:
> 
> 
> 
> 
> This is 144 Hz 4:2:2 in SDR:
> 
> 
> 
> 
> This is 144 Hz 4:2:2 in HDR:
> 
> 
> 
> 
> It only crushes blacks in SDR. Crappy pics, window blinds are open.


Dat anti-glare doe 😞

It should have come without the AG filter, since it looks nowhere near the same as with it off:


----------



## HyperMatrix

Glerox said:


> FYI, it's in stock at canadacomputers if you want one now, which I'm sure you are.


I would never buy from canada computers or anyone else who charges more for shipping the product than all other retailers, and on top of that asks you to pay $70 more to insure the shipment, or else you're SOL if the package never arrives. NCIX did the same thing. That's why I avoided them like the plague. They're only good to use for price matching with more reputable sellers.


----------



## CallsignVega

lol, I've never heard of a real retailer charging extra for insurance. That definitely sounds like a crappy retailer to avoid.


----------



## saltedham

I got mine delivered from Microcenter today. When did they start doing online orders?

I've never had a 4K display or one with HDR. It looks just like that option in old Valve games that added bloom to everything.


----------



## Sichtwechsel86

saltedham said:


> i got mine delivered from microcenter today. when did they start doing online?
> 
> ive never had a 4k display or one with hdr. it looks just like that option in old valve games that added bloom to everything.


Hm... that bloom may be due to the FALD...

But in theory it should add luminance/light and enhance the perceived contrast...

As an example: in SDR, a swordsmith puts red iron on the anvil to work it with his hammer... in HDR, the iron should look like it glows and is a light source itself...

The same goes for flames and so on...

The bloom/haloing is due to the FALD...

And of course HDR, or HDR10, must be implemented with care...

One very good movie to demonstrate HDR10 capabilities is 'Valerian', or, if you have some time, watch 'Altered Carbon' on Netflix...
The series truly falls flat on characterization, the story takes unnecessary sidesteps, and it has big problems keeping the tension due to its fragmented style...
But the atmosphere and visuals are very, very good...
I watched it on my OLED and was blown away by its Dolby Vision implementation...
But I tried it on my Acer X27 just to compare, and I would definitely say:
even on this monitor, and even though it's 'only' HDR10, it shows some very good HDR scenes...

Try it!

But be aware of the limits of this monitor...
Many night shots with small point lights will show some clouding around them due to the big FALD zones...
But the daylight scenes are even better than on my OLED thanks to this monitor's higher brightness...
and even the colors come close to OLED in daylight scenes!
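The clouding around small point lights is easy to picture with a toy model of the 384-zone FALD backlight. A sketch under stated assumptions: the zone layout (a 24x16 grid here) is hypothetical, and real controllers use smarter per-zone curves than "peak pixel drives the zone":

```python
# Toy FALD model: panel split into a grid of zones, each zone's backlight
# driven by its brightest pixel. A lone point light lifts its whole zone.

W, H, COLS, ROWS = 3840, 2160, 24, 16  # hypothetical 24x16 = 384 zones

def zone_of(x, y):
    return (x * COLS // W, y * ROWS // H)

bright_pixel = (1000, 1000)  # a single bright dot on an otherwise black frame

def backlight(x, y):
    # every pixel sharing the bright pixel's zone gets a lit backlight -> halo
    return 1.0 if zone_of(x, y) == zone_of(*bright_pixel) else 0.0

print(backlight(1000, 1000))  # the point light itself
print(backlight(1100, 1050))  # nearby black pixel in the same zone glows
print(backlight(3000, 200))   # far-away zones stay fully dark
```

With only 384 zones, each zone spans roughly 160x135 pixels, so a starfield or small caption lifts a visibly large patch of "black" around it; daylight scenes avoid the problem because the whole backlight is up anyway.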


----------



## saltedham

Are there recommended settings for this monitor? I usually use TFT Central's settings, but they don't have a review up. I turned the color temp to warm and lowered the contrast. That's the white level, right?


----------



## Sichtwechsel86

saltedham said:


> Are there recommended settings for this monitor? I usually use TFT Central's settings, but they don't have a review up. I turned the color temp to warm and lowered the contrast. That's the white level, right?


I had my Acer at:

brightness: 100 nits
contrast: 50

every picture enhancer OFF (black level enhancer, etc...)

color temp: normal or warm (whatever you like)
SDR sRGB color: ON
Local Dimming in SDR: ON
Local Dimming mode: GAMING

In HDR:

leave everything as you have set it in SDR...
Brightness is then adjusted by metadata, and greyed out!

----------



## kot0005

https://www.sweclockers.com/test/25...r-den-faktiskt-ar-som-bildskarm-och-lite-till

"The ASUS ROG Swift PG27UQ has crossed many borders and milestones, which also made it painfully obvious that the DisplayPort standard cannot keep up with developments. A fact which, moreover, is compounded by how the PG27UQ becomes dark when we run 144 Hz and 120 Hz, or HDR at 144 Hz."


----------



## Baasha

Okay, so I'm now running 2x Titan Xp with the HB SLI bridge and GPU usage is still pretty bad: 70-80% across both GPUs in BF1 maxed out (with TAA both on and off).

What is the 'best' or 'ideal' setting to use on the desktop (outside of games)? I currently have it at 120Hz RGB 'Full' 8-bit. Is it better to run with HDR through Windows or just let games/applications switch to HDR whenever necessary?

I just installed madVR and watched Alien: Covenant. It looks amazing in HDR with MPC-HC (way better than SDR).

Tried playing AC Origins maxed out with a single Titan Xp and HDR; I get around 42FPS, which is redonkulous.

It just seems pathetic that we can't get 144Hz/fps at 4K even when we have the GPU power to do so (easily). I was getting 80-100fps at 8K in BF1 maxed out (no AA) and 120fps in BF4 maxed out (no AA), also at 8K, with 4 GPUs.

A single GPU with this monitor seems entirely pointless, even with G-Sync. I mean, 60fps seems to be a tall ask in most recent games. You guys okay with getting sub-60fps in games (especially with BFV coming soon)? Turning down options (other than AA) seems to really defeat the purpose. It's like getting a fast car and being told to go 30mph.

This is super frustrating for me.

Decisions, decisions...


----------



## Baasha

Glerox said:


> This is not normal even in x8/x8 config! I've tested BF1 multiple times with TITAN X Pascal in SLI (in x16/x16 on 6850k). I'm pretty sure you have TAA ON and this crushes the SLI scaling. With everything on ultra and no AA i'm getting stable 120 FPS in 4K. Can't wait to test on the PG27UQ which should arrive tomorrow, FINALLY.


Turning TAA off didn't do a damn thing; the scaling is still garbage and I get around 80FPS with everything maxed out. Perhaps I should use the 98Hz setting in NVCP? I'm using 120Hz RGB 8-bit "Full" right now on the desktop and HDR10 in BF1. Also, what is the point of Dolby Vision HDR vs HDR10? Which is better? My eyes can't tell a difference.



profundido said:


> No, the process of SLI involves an algorithm that severely bottlenecks the cpu. I second everything the others replied and just like them and you I ran into the limitations of this generation's bad SLI support (driver/game wise). I used to have a [email protected] setup too and saw great increase when replacing it with a [email protected] + [email protected] setup. Sad as it may seem singlethread core clock is still king in 2018 for most content and the in the newest optimized multicore content it doesn't matter since no game bottlenecks all 8 hyperthreaded @5+ghz on a 4-core cpu anyway.
> 
> This cpu's ring architecture (as opposed to mesh) combined with high clock and low latency does miracles to the fps of both old and new games. The fact that I lost 2*16xPCIE which became 2*8xPCIE on the new board made zero negative impact as demonstrated last year in Steve Burke's SLI scaling research.
> 
> and by the way: in the picture here you still see my 2-way Titan xp SLI. By now I have removed 1 and I'm running off 1 and 1 card only from now on. Not only did the overhead cause low fps gain (and even loss!!) in most games but in addition the microstuttering made the game look so much less fluent than with 1 card. I will reuse the second titan in another pc and I run 1 card max OC since the end of last year. SLI is officially on it's dying breath...we must accept it


That's very interesting; however, the issue is mainly this monitor, since I can run 4-way SLI with amazing results in games that support SLI (even nominally, since I tweak settings etc.; see pics). I can run 8K (DSR) with maxed-out settings with 4 GPUs at 99% usage, which means the 6950X @ 4.30GHz has no CPU bottlenecking issues. The gameplay is very smooth.

This monitor seems to have a lot of trouble with SLI, as even 2-way doesn't scale properly. Although I agree with you that the 8700K or 8086K is better for gaming, there is no reason why 2 GPUs should bottleneck the 6950X.

I'm curious about your feedback on my earlier point (see scaling in pics).

All games below were played at 8K resolution (7680x4320), maxed out (no AA), with 4x Titan Xp in 4-way SLI and a 6950X @ 4.30GHz:

GPU usage is the 3rd column from the left (percentages).

Outlast 2:

RE7:

RoTR:

The Witcher 3:

Max Payne 3:
----------



## profundido

MiniZaid said:


> i tested battlefield 1 for SLI. anything above a 1080 SLI needs to be ran x16/x16. Or else, you will get about 10-15% less fps at the same clock speed. This was tested at 1440p
> I tested at multiplayer at the same spawn location on the same map.
> Although from what I remember, 4k makes this difference less noticeable since GPU usage is drastically increased


True, true. BF1 is already exceptional because of its really good SLI support, and the higher the resolution, the more the GPUs are taxed instead of the CPUs (relatively speaking). But my statement was about all games, not just BF1, and if you look at that, it's a disaster. BF1 is a real exception nowadays.


----------



## profundido

bee144 said:


> This was going to be my next comment. I’m a BF die hard and I was testing BF1 with the PG27UQ.
> 
> Using 98hz HDR, I was only able to hit 75 FPS with SLI (8086k stock and SLI Titan x (Pascal)) at ultra settings.
> 
> Using just one card I am able to hit 85-90 FPS with ultra. CPU is at 30% usage the entire time.
> 
> SLI really has become a bag of hurt. Now I’ve got a $1,200 second GPU doing nothing...


This sounds like wrong settings/drivers/config, because BF is known to scale a lot better than that regardless of PCIe bus speed. I don't play it, but I'm sure the people here who do can tell you the best settings to achieve good SLI scaling with stable frametimes.


----------



## Sichtwechsel86

One user over at the Asus forum got an email from his retailer...

They confirm that Asus is concerned about quality control, and the delay is due to them figuring something out...

No further information though...

User and retailer are from the UK, so the EU again...

But I really think this problem will affect all models...

Especially if black crush at 144Hz SDR is the problem, as it clearly isn't supposed to be that way!


----------



## profundido

bee144 said:


> Thank you, this worked.
> 
> Next question, my monitor is currently in racing mode. How do I change it to another profile?


I assume you're referring to a profile defined in the monitor's OSD. Since I have the Acer, they're named differently on my screen. I would assume you just use the joystick to go to settings or advanced settings and change the profile from Racing to another. See the manual for details, or perhaps someone here who has the Asus can answer this off the top of their head.


----------



## profundido

Baasha said:


> Okay so I'm now running 2x Titan Xp using HB SLI bridge the GPU usage is still pretty bad - 70 - 80% usage across both GPUs in BF1 maxed out (both with TAA on and off).
> 
> What is the 'best' or 'ideal' setting to use in desktop (outside of games)? I currently have it at 120Hz RGB 'Full' 8-bit. Is it better to run with HDR through Windows or just let the games/applications switch to HDR whenever necessary?
> 
> I just installed madVR and watched Alien Covenant - it looks amazing in HDR with MPC-HC (way better than SDR).
> 
> Tried playing AC Origins maxed out with a single Titan Xp and HDR - I get around 42FPS which is redonkulous.
> 
> It just seems pathetic that we can't get 144hz/fps in 4K even if we have the GPU power to do so in 4K (easily). I was getting 80 - 100fps in 8k in BF1 maxed out (no AA) and 120fps in BF4 maxed out (no AA) also in 8K with 4 GPUs.
> 
> Single GPU with this monitor seems to be entirely pointless, even with G-Sync. I mean, 60fps seems to be a tall ask in most recent games. You guys okay with getting sub-60fps in games (especially with BFV coming soon)? Turning down options (other than AA) seems to really defeat the purpose. It's like getting a fast car and suggesting going 30mph.
> 
> This is super frustrating for me.
> 
> Decisions decisions..


70 - 80% usage across both GPUs is considered gold nowadays. Not sure if you can squeeze much more out of it. Also keep in mind that you may see alot of usage on both gpu's nowadays and yet equal or even less fps in game. The % usage on both gpu's is no longer directly linearly related to fps in game. This is true for most games now where BF is rather an exception with it's optimized SLI scaling.

What is the 'best' or 'ideal' setting to use in desktop (outside of games)? I currently have it at 120Hz RGB 'Full' 8-bit. Is it better to run with HDR through Windows or just let the games/applications switch to HDR whenever necessary? 

=> definately HDR on in windows (use SDR slider to adjust brightness and OSD user color profile to tune beyond that) or most content just will not activate in HDR mode. Some games or software have built in HDR buttons that override (directly activate that same window setting in the background) but don't count on it. HDR default on in windows is the only way to never go wrong on content and enable all of it.

Tried playing AC Origins maxed out with a single Titan Xp and HDR - I get around 42FPS which is redonkulous. 

=> you are right, I get around 50 with 1 single titan xp and everything cranked up to full graphics

It just seems pathetic that we can't get 144hz/fps in 4K even if we have the GPU power to do so in 4K (easily). I was getting 80 - 100fps in 8k in BF1 maxed out (no AA) and 120fps in BF4 maxed out (no AA) also in 8K with 4 GPUs. 

=> I agree it seems so but in my past 25years of PC gaming it was only during the last few years in fact that the GPU for the first time in history have outperformed games. That we could crank all graphics up. I remember many many years of gaming with 25fps and no one complainted about it until the first Creative VOODOO 3DFX and Monster 3D Herculus cards arrived and caused a revolution. Even after that it was always a struggle trying to tune game graphics down to an acceptable level in order to reach 30+fps. I guess when it comes down to 4K content were back down a notch until better GPU solutions arrive 

It just seems pathetic that we can't get 144hz/fps in 4K even if we have the GPU power to do so in 4K (easily). I was getting 80 - 100fps in 8k in BF1 maxed out (no AA) and 120fps in BF4 maxed out (no AA) also in 8K with 4 GPUs. 

=> agreed. All this because SLI support for games is considered too much of hassle to maintain by the devs. And yet some premium games in the past and present have proven that it's quite possible. I guess they don't care anymore and we will be forced to adapt

Single GPU with this monitor seems to be entirely pointless, even with G-Sync. I mean, 60fps seems to be a tall ask in most recent games. You guys okay with getting sub-60fps in games (especially with BFV coming soon)? Turning down options (other than AA) seems to really defeat the purpose. It's like getting a fast car and suggesting going 30mph. 

=> the games I currently play most (SC2, ESO) hold 80-120 fps constantly on this monitor, so you will not hear me complain, but for the newer games you're absolutely right. It sucks for now. Still, SLI is really dead, dying exceptions aside. So my plan is to just wait a bit longer until the next big step-up Nvidia card comes out, then max OC it and tune settings down to reach an acceptable 80-90 fps for now. If I really, really cannot wait for specific games, I will just play them on my 1440p monitor, but I have patience. I will 'save' those graphically amazing games for my next Nvidia card upgrade for sure. 40 fps is not acceptable anymore for me. It must be 60+ with stable frametimes, or I'd rather play another game. I feel sorry for you, who had such an amazing time during the golden 4x GPU SLI years. Once you reach the top, take care, as the only way left to go is down.

Just be glad that game settings are no longer what they used to be. On 'medium' or even 'low', modern games can still look good, especially in 4K. In the past any game would be reduced to garbage, right? So yeah, back to the tweaking and tuning days I guess! =P

EDIT: I just thought about your feeling that the scaling is bad on this monitor. I don't know if you have another 4K monitor around, but you could hook it up to rule out the monitor causing extra delay. I would be surprised if that's the case, but then again the G-Sync module inside is new tech and might have an impact...


----------



## CallsignVega

This is the FPS range I'm getting on an overclocked Titan V with all game settings maxed:

1. AC - Origins: 60-80 FPS.
2. Star CItizen arena commander - 70-90 FPS.
3. BF4 - 120-140 FPS.
4. SWBF II - 80-95 FPS.

PUBG I can run maxed out 142 FPS cap as I keep graphics settings low to see people better.

So yes, this monitor requires more processing power than the fastest GPU can put out right now. Maybe the next gen Titan will be sufficient.


----------



## keikei

https://www.tweaktown.com/news/62344/nvidias-new-sync-module-4k-144hz-monitors-worth-2000/index.html


----------



## Bloodmosher

*Initial thoughts after a week*

After a week of testing, here are my thoughts on this monitor:
1. BF1 in HDR is so good, I have a hard time playing in SDR
2. In BF1, I cannot see any noticeable difference between 98hz HDR 10-bit and 120hz HDR 8-bit w/dithering, so I play at 120hz
3. With a pair of Titan XPs in SLI I can get 95-105 fps on complex multiplayer maps, HDR 120hz Ultra everything except anti-aliasing post, which I leave at FXAA High
4. On the desktop, the only way I can match white brightness levels of my other monitor (PG27AQ) is to enable HDR and crank up the reference white setting to 100 nits. In SDR it is just a little dimmer than I would like.
5. Out of the box the colors are way too yellow, but this seems to be true for every monitor I buy. 
6. With HDR on, Chrome is gray, and the only way to fix it seems to be to disable hardware acceleration in Chrome. But this makes Chrome unusable in my opinion. Chrome needs to be fixed.
7. No panel is ever perfect in my experience, and this one seems to have some consistency issues, which for me is especially noticeable on white backgrounds (most of the web, Outlook, Excel, etc.). Basically, the lower half of the monitor looks more yellow than the top half when viewing whites. I haven't yet decided whether this is annoying enough to try to return/exchange. Like I said, no panel is ever perfect and all my other monitors have some similar consistency issue.
8. My settings are: HDR on, Nits 100, Contrast 55, (r,g,b): 53,53,100, Windows SDR brightness: 46

Anyone else notice the consistency issue I see in #7? Also, if you have settings you consider ideal, I would love to know what you are using.


----------



## Vipu

CallsignVega said:


> This is the FPS range I'm getting on an overclocked Titan V with all game settings maxed:
> 
> 1. AC - Origins: 60-80 FPS.
> 2. Star CItizen arena commander - 70-90 FPS.
> 3. BF4 - 120-140 FPS.
> 4. SWBF II - 80-95 FPS.
> 
> PUBG I can run maxed out 142 FPS cap as I keep graphics settings low to see people better.
> 
> So yes, this monitor requires more processing power than the fastest GPU can put out right now. Maybe the next gen Titan will be sufficient.


Dont max everything to have easy 100fps+ then.


----------



## kot0005

CallsignVega said:


> This is the FPS range I'm getting on an overclocked Titan V with all game settings maxed:
> 
> 1. AC - Origins: 60-80 FPS.
> 2. Star CItizen arena commander - 70-90 FPS.
> 3. BF4 - 120-140 FPS.
> 4. SWBF II - 80-95 FPS.
> 
> PUBG I can run maxed out 142 FPS cap as I keep graphics settings low to see people better.
> 
> So yes, this monitor requires more processing power than the fastest GPU can put out right now. Maybe the next gen Titan will be sufficient.




There was a prototype PCB leak earlier: 12 GB of GDDR6 running at 14 Gbps on a 384-bit bus... sweet 672 GB/s of bandwidth... if this config is the xx80, the Titan will be insane lol, probably 16 GB of 16 Gbps GDDR6.
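That bandwidth figure is just per-pin data rate times bus width divided by 8; a quick sketch (the 14 Gbps / 384-bit numbers are from the leak above, the 16 Gbps config is pure speculation):

```python
# Memory bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
def mem_bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(mem_bandwidth_gbs(14, 384))  # leaked config -> 672.0 GB/s
print(mem_bandwidth_gbs(16, 384))  # speculated Titan config -> 768.0 GB/s
```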


----------



## kot0005

new driver: https://www.techpowerup.com/245515/nvidia-releases-geforce-398-36-whql-drivers


Fixed Issues in this Release:
- [Pascal GPUs][Gears of War 4][DirectX12]: Blue-screen crash may occur while playing the game. [2008731]
- [Stereoscopic 3D][NVIDIA Control Panel]: Switching the 3D display setting On and Off from the Windows display settings page has no effect on the NVIDIA Control Panel stereoscopic 3D settings page. [2045222]
- [G-SYNC]: G-SYNC may still be active after closing a game, causing the desktop to stutter.
- [Surround]: Multiple games crash when launched in Surround mode. [2181329]
- [HDR]: With HDR turned on, non-HDR full-screen video playback may cause corruption/flickering in the video. [200410646]
- [Notebook]: Blue-screen crash occurs with Driver_Power_State_Failure error upon bootup. [2168487]
- Black screen appears when booting Windows after installing the 397.93 display driver. [200418217]

Windows 10 Issues:
- [SLI][GeForce GTX 1080][G-SYNC][World of Warcraft: Legion][HDR]: Severe color corruption appears in the game after launching with the Windows HDR setting enabled. [200418344] This issue does not occur with the Windows HDR setting disabled.
- [Surround][G-SYNC]: In Surround mode, the G-SYNC link in the NVIDIA Control Panel is missing. [200425004]
- [Surround SLI][G-SYNC][Overwatch]: There is constant flickering in the game when played in a specific SLI Surround configuration (2 displays on master GPU and 1 display on slave GPU) with G-Sync mode enabled. [2130430]
- [G-SYNC]: Windowed G-Sync mode may stutter after upgrading to Windows 10 Spring Creators Update. [2097340]
- [Shadow Warrior 2][TITAN V]: After launching the game with the Windows HDR setting enabled, there is flickering corruption when hovering the mouse over the game screen. [200408410] The issue does not occur with the Windows HDR setting disabled.
- [GeForce TITAN (Kepler-based)]: The OS fails after installing the graphics card on a Threadripper-enabled motherboard. [1973303]
- [Netflix Edge Browser]: When playing a game in full-screen mode and playing a video from the Netflix Edge browser, a blue-screen crash occurs after multiple [Alt+Tab] switches between the two. [200415750] The issue does not occur when playing the Netflix video in a Chrome browser.
- [Firefox]: Cursor shows brief corruption when hovering on certain links in Firefox. [2107201]
- [NVIDIA Control Panel][Surround]: NVIDIA Surround hotkeys do not work. [200394749]
- Random DPC watchdog violation errors occur when using multiple GPUs on motherboards with PLX chips. [2079538]
- Using power monitoring in GPU monitor tools causes micro stutter. [2110289/2049879]


----------



## guttheslayer

CallsignVega said:


> This is the FPS range I'm getting on an overclocked Titan V with all game settings maxed:
> 
> 1. AC - Origins: 60-80 FPS.
> 2. Star CItizen arena commander - 70-90 FPS.
> 3. BF4 - 120-140 FPS.
> 4. SWBF II - 80-95 FPS.
> 
> PUBG I can run maxed out 142 FPS cap as I keep graphics settings low to see people better.
> 
> So yes, this monitor requires more processing power than the fastest GPU can put out right now. Maybe the next gen Titan will be sufficient.


I have a question: I heard about chroma subsampling on this monitor above 98Hz, so the only way you can get true HDR 4:4:4 color is when you use HDMI 2.1.


However, since Nvidia doesn't allow their G-Sync to be adopted on HDMI 2.1, does that mean the prospect of waiting for 4:4:4 on a 4K@144Hz G-Sync panel will never happen, at least not before 2020?


----------



## Sichtwechsel86

guttheslayer said:


> I have a question: I heard about chroma subsampling on this monitor above 98Hz, so the only way you can get true HDR 4:4:4 color is when you use HDMI 2.1.
> 
> 
> However, since Nvidia doesn't allow their G-Sync to be adopted on HDMI 2.1, does that mean the prospect of waiting for 4:4:4 on a 4K@144Hz G-Sync panel will never happen, at least not before 2020?


you can run:

4K up to 120hz in 8bit RGB with SDR/HDR ---> this is the bandwidth limit of DP1.4 for 8bit color depth!
and
4K 144hz 8bit 4:2:2 SDR/HDR

or:

4K up to 98hz in 10bit RGB SDR/HDR ---> this is the bandwidth limit of DP1.4 for 10bit color depth!
and
4K 120hz 10bit 4:2:2 SDR/HDR
4K 144hz 10bit 4:2:2 SDR/HDR

Via HDMI2.0 you can run 4K up to 60hz in 8bit RGB SDR/HDR and 4K up to 60hz in 10bit 4:2:2 SDR/HDR.

DP1.5 and HDMI2.1 will support higher bandwidth limits, and therefore I assume you won't have to use chroma subsampling like you have to do right now!

But this monitor only has HDMI2.0 and DP1.4...
because HDMI2.1 and DP1.5 were not even finalized during the production of this monitor.

I would say: we will have to wait, as you said, until DP1.5 is finalized and the first monitors and GPUs are using this new connection!
Could be 2020 -
on the other hand you could use HDMI2.1 and VRR in 2019 for 4K 144hz 8bit/10bit 4:4:4 SDR/HDR - but that would not be GSync.

So for GSync, for the next two years we most likely have to stick with the options I mentioned above!
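Those cutoffs follow from simple bandwidth arithmetic; a rough sketch in Python (assuming DP1.4 HBR3's ~25.92 Gbit/s of usable payload after 8b/10b encoding, and ignoring blanking overhead, so real limits are a bit tighter):

```python
# Approximate bandwidth check for 4K modes against DisplayPort 1.4 (HBR3).
# Assumption: ~25.92 Gbit/s usable after 8b/10b encoding; blanking ignored.
DP14_GBPS = 25.92

def required_gbps(w, h, hz, bpc, chroma="4:4:4"):
    # 4:4:4 carries 3 full components per pixel; 4:2:2 averages 2.
    components = 3 if chroma == "4:4:4" else 2
    return w * h * hz * bpc * components / 1e9

for hz, bpc, chroma in [(98, 10, "4:4:4"), (120, 8, "4:4:4"),
                        (144, 8, "4:4:4"), (144, 8, "4:2:2"),
                        (120, 10, "4:2:2"), (144, 10, "4:2:2")]:
    need = required_gbps(3840, 2160, hz, bpc, chroma)
    verdict = "fits" if need <= DP14_GBPS else "exceeds DP1.4"
    print(f"4K {hz}Hz {bpc}-bit {chroma}: {need:5.1f} Gbit/s ({verdict})")
```

It matches the list: 98Hz 10-bit RGB and 120Hz 8-bit RGB just squeeze under the limit, while 144Hz RGB at either depth does not, hence the 4:2:2 fallback.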


----------



## bee144

Glerox said:


> This is not normal even in x8/x8 config! I've tested BF1 multiple times with TITAN X Pascal in SLI (in x16/x16 on 6850k). I'm pretty sure you have TAA ON and this crushes the SLI scaling. With everything on ultra and no AA i'm getting stable 120 FPS in 4K. Can't wait to test on the PG27UQ which should arrive tomorrow, FINALLY.


Two of us are having SLI scaling issues with BF1. Please report back with your experience. Hopefully you have better luck. Turning off AA did not help.


----------



## bee144

Baasha said:


> Okay so I'm now running 2x Titan Xp using HB SLI bridge the GPU usage is still pretty bad - 70 - 80% usage across both GPUs in BF1 maxed out (both with TAA on and off).
> 
> What is the 'best' or 'ideal' setting to use in desktop (outside of games)? I currently have it at 120Hz RGB 'Full' 8-bit. Is it better to run with HDR through Windows or just let the games/applications switch to HDR whenever necessary?
> 
> I just installed madVR and watched Alien Covenant - it looks amazing in HDR with MPC-HC (way better than SDR).
> 
> Tried playing AC Origins maxed out with a single Titan Xp and HDR - I get around 42FPS which is redonkulous.
> 
> It just seems pathetic that we can't get 144hz/fps in 4K even if we have the GPU power to do so in 4K (easily). I was getting 80 - 100fps in 8k in BF1 maxed out (no AA) and 120fps in BF4 maxed out (no AA) also in 8K with 4 GPUs.
> 
> Single GPU with this monitor seems to be entirely pointless, even with G-Sync. I mean, 60fps seems to be a tall ask in most recent games. You guys okay with getting sub-60fps in games (especially with BFV coming soon)? Turning down options (other than AA) seems to really defeat the purpose. It's like getting a fast car and suggesting going 30mph.
> 
> This is super frustrating for me.
> 
> Decisions decisions..


I'm getting exactly the same results as you. I'm trying to run 98Hz 10 bit RGB HDR in BF1 and SLI scaling is horrible. I see about 75 FPS with 75% scaling on both cards. With just 1 card, I'm in the ~90FPS and 99% usage.

I understand SLI scaling might not mean a full 100% usage but single card shouldn't run better than two cards, no matter what.


----------



## CallsignVega

Vipu said:


> Dont max everything to have easy 100fps+ then.


I was actually surprised at how little FPS difference there was between Ultra settings in games and Low settings. Maybe it's a function of how my Titan V processes games. 



bee144 said:


> I'm getting exactly the same results as you. I'm trying to run 98Hz 10 bit RGB HDR in BF1 and SLI scaling is horrible. I see about 75 FPS with 75% scaling on both cards. With just 1 card, I'm in the ~90FPS and 99% usage.
> 
> I understand SLI scaling might not mean a full 100% usage but single card shouldn't run better than two cards, no matter what.


Single card will run faster than 2 cards if the SLI implementation isn't working.


----------



## bee144

ASUS Firmware update coming out later this year. Can be installed by end user with no need to ship back.

https://www.asus.com/support/FAQ/1036750

Opinion: Thank goodness. It's 2018. We should be able to install firmware updates on our own monitors without needing to send them back.


----------



## Glerox

bee144 said:


> Two of us are having SLI scaling issues with BF1. Please report back with your experience. Hopefully you have better luck. Turning off AA did not help.


Just received my PG27UQ!!! Wow it's amazing. Impressions coming.
I tested BF1, I still have good scaling >80% and 110 FPS with TAA OFF.
I have x16/x16 with [email protected] and my Titan XPs are running at 2050 MHz.

Questions for PG27UQ owners: what is your setting for brightness reference nits in HDR? I'm confused; I tried BF1, Far Cry 5, and FF XV, and it seems each game needs a different reference nits value, plus there is an HDR brightness setting in each game... needs a lot of tweaking lol.


----------



## Glerox

bee144 said:


> ASUS Firmware update coming out later this year. Can be installed by end user with no need to ship back.
> 
> https://www.asus.com/support/FAQ/1036750
> 
> Opinion: Thank goodness. It's 2018. We should be able to install firmware updates on our own monitors without needing to send them back.


Awesome news.


----------



## Sancus

CallsignVega said:


> Maybe it's a function of how my Titan V processes games.



Did they improve the drivers since release? I know I took one look at Gamers Nexus' Titan V review and decided it wasn't worth it. I'm all for throwing ridiculous money at things, but the Titan V had way lower minimums than the 1080 Ti in many titles (less than half the 0.1% lows, and pretty bad 1% lows, in Doom on Vulkan and Destiny 2), and to me minimums are the most important thing, because as soon as you drop below 60 fps you notice a clear stutter. I'd rather have a 60 fps minimum and 80 fps average than a 120 fps average with a 30 fps minimum, which is what the Titan V provided in many cases. And all that for only $2K+ more.


I figured the drivers wouldn't actually improve until Volta showed up in a consumer card.


----------



## CallsignVega

Not sure who put out that FUD. My V runs buttery smooth with no massive FPS drops. Then again, I also have it paired with a 5.2 GHz 8700K with super fast memory. A lot of people test things on potato computers. And be wary of sole-source information, especially information that could easily be outdated (release drivers, etc.).


----------



## MiniZaid

bee144 said:


> This was going to be my next comment. I’m a BF die hard and I was testing BF1 with the PG27UQ.
> 
> Using 98hz HDR, I was only able to hit 75 FPS with SLI (8086k stock and SLI Titan x (Pascal)) at ultra settings.
> 
> Using just one card I am able to hit 85-90 FPS with ultra. CPU is at 30% usage the entire time.
> 
> SLI really has become a bag of hurt. Now I’ve got a $1,200 second GPU doing nothing...


For everyone with SLI issues: if you really want more fps, you have to go X299 or even X99; you need the PCIe lanes. I've tested it. You will get more fps with SLI 1080s on a 5930K at 4.2GHz than on an 8700K at 4.8GHz, so for Titan Xs I can only imagine.

Also try to run your RAM at 3000MHz or higher. High-speed RAM helps too.

There's nothing more to be done. I managed to get between 75-85% GPU usage.

You guys are not the only ones with issues. Look at the following videos and you will see that BF1 just inherently doesn't scale well with high-end cards.


----------



## MiniZaid

HyperMatrix said:


> There appear to be availability issues. BestBuy.ca went to backordered status, even for pre-order customers. Which sucks because I was supposed to have mine delivered tomorrow. I called them and they said the manufacturer sent them fewer units than originally expected and took $100 off the price of the monitor, saying it'll likely be another 2 weeks before stock becomes available. NewEgg.ca also changed their availability date from June 22nd to July 16th. I now wish I hadn't changed my preorder from Newegg to BestBuy. Because I would have received my unit by now. I wonder if any of this has to do with that whole firmware update mess that sellers in Europe were dealing with. At least paying $1880 USD for the monitor makes me feel a bit better. Assuming the delay is due to a firmware fix.


They declined to give me $100 off. What did you say to them?


----------



## Baasha

bee144 said:


> I'm getting exactly the same results as you. I'm trying to run 98Hz 10 bit RGB HDR in BF1 and SLI scaling is horrible. I see about 75 FPS with 75% scaling on both cards. With just 1 card, I'm in the ~90FPS and 99% usage.
> 
> I understand SLI scaling might not mean a full 100% usage but single card shouldn't run better than two cards, no matter what.


I've run various modes - 10-bit 98Hz, 8-bit 120Hz, 8-bit 144Hz and none of them have good scaling - both GPUs still hover around 70 - 80% usage. FPS is around 90 - 95.

On my 4K OLED monitor I get well over 120 FPS in 4K with 2 GPUs with the same in-game graphics settings (obviously, no HDR).

It is amply evident that SLI by itself is not the issue in BF1 (or any other game that supports SLI). This monitor, in conjunction with HDR, G-Sync, or some other setting/feature, is causing the SLI scaling to be adversely affected.

In my previous post with the pics, I've shown even 4-Way SLI scales wonderfully well on a "regular" 4k monitor - my 4K OLED Dell UP3017Q. The image looks much better on that monitor than it does on the PG27UQ, even with HDR. The main advantage of this monitor was supposed to be its 144hz refresh rate - something we are not able to take advantage of.

With my week of ownership, this monitor has been a monumental disappointment. Unless someone figures out how to 'unlock' multi-GPU use on this monitor, it is essentially pointless as even the upcoming 1180 or 2080 GPU will not be able to achieve 144fps with a single GPU (with maxed out settings).

Again, playing with gimped settings defeats the purpose and is not even worth considering.


----------



## CallsignVega

Just turn off HDR and G-Sync and see if scaling changes? A monitor has nothing to do with SLI scaling. Only the drivers could affect SLI scaling.


----------



## bee144

Baasha said:


> I've run various modes - 10-bit 98Hz, 8-bit 120Hz, 8-bit 144Hz and none of them have good scaling - both GPUs still hover around 70 - 80% usage. FPS is around 90 - 95...


Can you try it with HDR off? I thought I got better performance with HDR off but I forget and I'm at work right now. I'll check myself as well later tonight.

If HDR is the issue, then it certainly points to a driver/game issue and not a scaling/X16vsX8 issue.


----------



## Baasha

CallsignVega said:


> Just turn off HDR and G-Sync and see if scaling changes? A monitor has nothing to do with SLI scaling. Only the drivers could affect SLI scaling.


DUDE, I LOVE YOU (no ****). 

It's definitely G-Sync that's causing the 'issue.'

Kept HDR on, turned G-Sync off, get 99% usage across both GPUs and get ~ 150fps maxed out (no TAA) and 125fps maxed out with TAA.

This works at both 98Hz and 120Hz settings in game. However, turning it to 144Hz setting DESTROYS GPU usage!  It DOES NOT WORK at 4K 144Hz setting!!

Also, you NEED to have HDR turned on in desktop (through Windows) for some odd reason - if you don't, GPU usage is garbage no matter what setting you use.

Just tried all these various settings so I hope it helps anyone looking to use SLI.

Also, noob question - tried taking screenshots to show you guys but I keep getting black screens (is this because of HDR)?


----------



## Baasha

bee144 said:


> Can you try it with HDR off? I thought I got better performance with HDR off but I forget and I'm at work right now. I'll check myself as well later tonight.
> 
> If HDR is the issue, then it certainly points to a driver/game issue and not a scaling/X16vsX8 issue.


Will try it with HDR off but see my post above - it works if you turn G-Sync off - kinda makes sense since G-Sync limits FPS to prevent tearing (?).

EDIT: With HDR OFF, GPU usage is amazing even at 4K 144Hz mode - 99% across both GPUs.

So the VERDICT is: 

1.) 98Hz Mode: Leave HDR ON and turn G-Sync OFF

2.) 120Hz Mode: Leave HDR ON and turn G-Sync OFF

3.) 144Hz Mode: Turn HDR and G-Sync OFF

HDR & G-Sync OFF 4K 144Hz Mode:


----------



## Titanmode

kot0005 said:


> No point in using the monitor with FALD off..


I didn't ask your opinion about the FALD. I want to see if the QC is still **** and whether it's something they are checking for, so I have the option of using it without the FALD in case blooming is an issue in certain cases.




Please, someone take pics of the backlight bleed with FALD off...


----------



## CallsignVega

From what I remember when I had SLI, G-Sync worked fine. Maybe they broke SLI G-Sync recently?


----------



## MiniZaid

bee144 said:


> This was going to be my next comment. I’m a BF die hard and I was testing BF1 with the PG27UQ.
> 
> Using 98hz HDR, I was only able to hit 75 FPS with SLI (8086k stock and SLI Titan x (Pascal)) at ultra settings.
> 
> Using just one card I am able to hit 85-90 FPS with ultra. CPU is at 30% usage the entire time.
> 
> SLI really has become a bag of hurt. Now I’ve got a $1,200 second GPU doing nothing...





Baasha said:


> DUDE, I LOVE YOU (no ****).
> 
> It's definitely G-Sync that's causing the 'issue.'
> 
> Kept HDR on, turned G-Sync off, get 99% usage across both GPUs and get ~ 150fps maxed out (no TAA) and 125fps maxed out with TAA.
> 
> This works at both 98Hz and 120Hz settings in game. However, turning it to 144Hz setting DESTROYS GPU usage!  It DOES NOT WORK at 4K 144Hz setting!!
> 
> Also, you NEED to have HDR turned on in desktop (through Windows) for some odd reason - if you don't, GPU usage is garbage no matter what setting you use.
> 
> Just tried all these various settings so I hope it helps anyone looking to use SLI.
> 
> Also, noob question - tried taking screenshots to show you guys but I keep getting black screens (is this because of HDR)?



I'm pretty sure I tested it with G-Sync off. I remember turning G-Sync off in the Nvidia Control Panel; no change in fps. Although I only tested it briefly, since it didn't work and I eliminated that variable.

But G-Sync off produces tearing. Higher fps with G-Sync off feels almost like 20% lower fps with G-Sync on. It's not that bad at 144 fps, but I can tell the difference.


----------



## bee144

CallsignVega, is it broken or is it working as intended? Could tearing occur beyond X% of usage?

Baasha, I will verify your findings when I get home.


----------



## Baasha

CallsignVega said:


> From what I remember when I had SLI, G-Sync worked fine. Maybe they broke SLI G-Sync recently?


Could definitely be that. New driver (as of today) could be a cause as well.



MiniZaid said:


> i'm pretty sure i tested it that with gsync off. I remember turning gsync off in nvidia control panel. No change in fps.
> Although i only briefly tested that. Since it didn't work so i eliminated that variable.
> 
> But gsync off produces tearing though. higher fps with gsync off is almost like 20% lower fps with gsync on.
> It's not that bad at 144fps but i can tell the difference.


Gameplay is buttery smooth for me using HDR in 120Hz mode - getting that or higher FPS makes it really fun - finally what this monitor was made for. Did you also make sure to have HDR turned ON in Windows?



bee144 said:


> CallsignVega, is it broken or is it working as intended? Could tearing occur beyond X% of usage?
> 
> Baasha, I will verify your findings when I get home.


Sounds good.

I'm going to test other games and see what's up as well.


----------



## bee144

Baasha said:


> Will try it with HDR off but see my post above - it works if you turn G-Sync off - kinda makes sense since G-Sync limits FPS to prevent tearing (?).
> 
> EDIT: With HDR OFF, GPU usage is amazing even at 4K 144Hz mode - 99% across both GPUs.
> 
> So the VERDICT is:
> 
> 1.) 98Hz Mode: Leave HDR ON and turn G-Sync OFF
> 
> 2.) 120Hz Mode: Leave HDR ON and turn G-Sync OFF
> 
> 3.) 144Hz Mode: Turn HDR and G-Sync OFF
> 
> HDR & G-Sync OFF 4K 144Hz Mode:


Good news! I had the exact same results as you. Turning off G-Sync for BF1 resulted in max FPS for each of your 3 scenarios. I did not run into any issues with the PCIe x8 limitation.

There certainly is a bug with G-Sync/BF1. How do we report this to NVIDIA?


----------



## Glerox

I can confirm the same results.

Went from 110 FPS to 130-160 FPS with g-sync OFF. 
Don't even need g-sync because it's steady over 120 FPS, using fast-sync.

I'm not surprised SLI+G-SYNC+HDR doesn't work well. It was the same with SLI at the beginning of regular G-SYNC years ago 

Definitely my last SLI setup lol.


----------



## kot0005

bee144 said:


> ASUS Firmware update coming out later this year. Can be installed by end user with no need to ship back.
> 
> https://www.asus.com/support/FAQ/1036750
> 
> Opinion: Thank goodness. It's 2018. We should be able to install firmware updates on our own monitors without needing to send them back.


I agree. When I bought the Acer X34, it had the banding issue with gradients not being smooth, so I sent it back for a firmware update. Not only did they do nothing to the firmware, they damaged my monitor: the screen had scratches, the lower bezel was coming off, and the back was full of scratches. I can still remember that nightmare!!

I returned it and bought a PG279Q.




bee144 said:


> Good news! I had the exact same results as you. Turning off G-Sync for BF1 resulted in max FPS for each of your 3 scenarios. I did not run into any issues with the PCIe x8 limitation.
> 
> There certainly is a bug with G-Sync/BF1. How do we report this to NVIDIA?



The most likely place is their forums.


----------



## HyperMatrix

MiniZaid said:


> They declined to give me $100 off. What did you say to them?


I simply said this is unacceptable as I ordered back when there were preorder units available in stock, and that I cancelled an existing Newegg preorder that I would have had in my hands by now, and now I'm out a monitor because of them. I wasn't expecting a discount. He just offered it.


----------



## ttnuagmada

Glerox said:


> I can confirm the same results.
> 
> Went from 110 FPS to 130-160 FPS with g-sync OFF.
> Don't even need g-sync because it's steady over 120 FPS, using fast-sync.
> 
> I'm not surprised SLI+G-SYNC+HDR doesn't work well. It was the same with SLI at the beginning of regular G-SYNC years ago
> 
> Definitely my last SLI setup lol.


I've run dual GPUs for over a decade (4870s, 6950s, 780s, 980 Tis, 1080 Tis) and this is the last time I'm going dual card. Support has gotten worse with every generation. Even the games with SLI "support" usually have stuttering issues that make it not worth using half the time.


----------



## profundido

CallsignVega said:


> From what I remember when I had SLI, G-Sync worked fine. Maybe they broke SLI G-Sync recently?



In the past it always worked fine with the one and only existing G-Sync module, which could handle only up to 1440p. This one was redesigned from scratch to support higher bandwidth and enable 4K@144Hz, so new growing pains are to be expected. I think we may have just stumbled upon one.


----------



## profundido

bee144 said:


> Good news! I had the exact same results as you. Turning off G-Sync for BF1 resulted in max FPS for each of your 3 scenarios. I did not run into any issues with the PCIe x8 limitation.
> 
> There certainly is a bug with G-Sync/BF1. How do we report this to NVIDIA?


Awesome news, guys. Thanks to you both for testing this. Now that at least two people have tested it, it seems clear that something in the HDR+G-Sync path is causing this. It will take Nvidia driver updates, and possibly future firmware updates to our monitors, to iron this out.

Will any of you report this or open a case with Nvidia support? You might have stumbled upon new ground. The sooner the Nvidia driver devs become aware (if they aren't already), the sooner they can start working on it, right?

By the way, PCIe x8 in the current state of gaming is NEVER a limitation for fps, even in 2018. I link below the brilliant research on this by Steve Burke that is still relevant in 2018:

https://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus

Bottlenecks do exist, but they only start occurring when combining two Titan Vs in a specific worst-case lab setup 'optimized' to find some limit. Currently only one game (Ashes of the Singularity) even sees the second Volta card to begin with.

https://www.gamersnexus.net/guides/3176-dual-titan-v-bandwidth-limit-test-x8-vs-x16

So practically, we will need to test this again once dual Volta/Turing SLI with NVLink and proper game/driver support becomes a common thing. Still a long way to go, in other words.


----------



## guttheslayer

Sichtwechsel86 said:


> you can run:
> 
> 4K 144hz 8bit 4:2:2 SDR/HDR
> 
> or:
> 
> 4K 144hz 10bit 4:2:2 SDR/HDR
> 
> I would say: we will have to wait - as you said - until DP1.5 is finalized and first monitors and GPUs are using this new connection!
> could be 2020 -
> on the otherhand you could use HDMI2.1 and VRR in 2019 for 4K 144hz 8bit/10bit 4:4:4 SDR/HDR - but that would not be GSync
> 
> So for GSync - the next two years we most likely have to stick with the options i mentioned above!


Thanks for the info, but I have to add that if you are forced into 4:2:2 subsampling, you might as well go 10-bit colour.

And oh gosh, 2 years at least for DP1.5? Seriously, Nvidia needs to ditch its $500 module and implement VRR over HDMI 2.1. I mean, come on! The module is not helping, and Intel is taking a big chunk of the profit as well! VRR needs to work on Nvidia cards over HDMI 2.1!



Baasha said:


> So the VERDICT is:
> 1.) 98Hz Mode: Leave HDR ON and turn G-Sync OFF
> 2.) 120Hz Mode: Leave HDR ON and turn G-Sync OFF
> 3.) 144Hz Mode: Turn HDR and G-Sync OFF


It's a bit sad that consumers pay $500 for that G-Sync module and yet are forced to turn it off.



Baasha said:


> I've run various modes - 10-bit 98Hz, 8-bit 120Hz, 8-bit 144Hz and none of them have good scaling - both GPUs still hover around 70 - 80% usage. FPS is around 90 - 95.
> 
> My 4K OLED monitor I get well over 120FPS in 4k with 2 GPUs with the same graphics settings in game (obviously, no HDR). It is amply evident that SLI by itself is not the issue in BF1 (or any other game that supports SLI). This monitor, in conjunction with HDR, G-Sync, or some other setting/feature, is causing the SLI scaling to be adversely affected.
> 
> Again, playing with gimped settings defeats the purpose and is not even worth considering.



I have to be blunt: at $2,000 a panel, end users shouldn't experience any of these problems. We all paid top money for the best experience with the least hassle; nowadays it seems to be the opposite.

I was hopeful for the upcoming 32" panel version, but seeing how bad it is now, I can forget it, especially as it is still stuck on DP1.4. I think OLED displays are really the way to go. It's going to take years before HDMI/DP, HDR, and VRR all work together seamlessly.


----------



## guttheslayer

Btw, what is the difference between 8-bit and 10-bit HDR? Can a panel using 8 bits count as HDR as well?


And do we set this display to 98Hz in Windows, or with the monitor's buttons?


----------



## Sichtwechsel86

guttheslayer said:


> Btw, what is the difference between 8-bit and 10-bit HDR? Can a panel using 8 bits count as HDR as well?
> 
> 
> And do we set this display to 98Hz in Windows, or with the monitor's buttons?


HDR is HDR (high dynamic contrast) -
and it doesn't matter if it's combined with 8-bit (256 shades of every RGB color) or 10-bit (1024 shades), or what colorspace...

HDR10, on the other hand, is a norm/defined standard (combining HDR, 10-bit and rec2020...)

so an 8-bit panel can give you HDR - but not HDR10 as it is intended!
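To put numbers on the bit-depth part, here's a quick sketch (my own, just basic arithmetic) of shades per channel and total colors at each common panel bit depth:

```python
# Shades per channel and total colors for common panel bit depths:
# 8-bit gives 256 levels per R/G/B channel, 10-bit gives 1024.
for bpc in (6, 8, 10, 12):
    shades = 2 ** bpc          # levels per R/G/B channel
    total = shades ** 3        # independent R, G and B channels combined
    print(f"{bpc:2d}-bit: {shades:4d} shades/channel, {total:,} colors")
```

So 10-bit has 4x the shades per channel of 8-bit, which is what helps against banding in bright gradients.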



Refresh rate and other settings are in the Nvidia Control Panel - 
you don't set them in the OSD of this monitor...

HDR ON/OFF is via Windows settings... and in-game settings...


----------



## ToTheSun!

Sichtwechsel86 said:


> HDR is HDR (high dynamic contrast) -
> and it doesn't matter if combined with 8bit (256 shades of every RGB color) or 10bit (1024 shades) or what colorspace...


And here I thought it was High Dynamic Range and that it DID pertain to color depth and color space.


----------



## acmilangr

Question please:

If we choose 4:4:4 chroma, HDR ON, 8-bit, what is the maximum refresh rate we can get?

Edit: I will answer my own question!

these are the limitations:
SDR
10bit-98hz-4:4:4
8bit-120hz-4:4:4
8bit-144hz-4:2:2

HDR
10bit-98hz-4:4:4
8bit*-120hz-4:4:4
10bit-120hz-4:2:2
10bit-144hz-4:2:2

*dithered, only with Win10 RS4
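Those cutoffs line up with a back-of-the-envelope check against DP 1.4's effective payload. This is my own rough sketch (not Asus's numbers): it counts active pixels only and ignores blanking overhead, so the real limits are slightly tighter than shown.

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s raw, 8b/10b coding -> ~25.92 Gbit/s payload.
DP14_GBPS = 25.92

def data_rate_gbps(width, height, hz, bpc, chroma="4:4:4"):
    """Approximate uncompressed video data rate in Gbit/s (active pixels only)."""
    # 4:4:4 carries 3 full samples per pixel; 4:2:2 averages 2 per pixel.
    samples_per_px = 3 if chroma == "4:4:4" else 2
    return width * height * hz * bpc * samples_per_px / 1e9

modes = [(8, 144, "4:4:4"), (8, 120, "4:4:4"), (10, 98, "4:4:4"),
         (10, 120, "4:4:4"), (10, 144, "4:2:2")]
for bpc, hz, chroma in modes:
    rate = data_rate_gbps(3840, 2160, hz, bpc, chroma)
    verdict = "fits" if rate <= DP14_GBPS else "exceeds"
    print(f"4K {hz}Hz {bpc}-bit {chroma}: {rate:5.1f} Gbit/s -> {verdict} DP 1.4")
```

Even without blanking, 8-bit 4:4:4 at 144Hz already exceeds the link, and 10-bit 4:4:4 tops out around 98Hz - exactly the combos in the list above.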

So, new question: what does 8-bit dithered mean? What is the result? How does it affect the quality?


----------



## mmms

acmilangr said:


> Question please:
> 
> If we choose 4:4:4 chroma, HDR ON, 8-bit, what is the maximum refresh rate we can get?


120Hz


----------



## Sichtwechsel86

ToTheSun! said:


> And here I thought it was High Dynamic Range and that it DID pertain to color depth and color space.


As far as I know, HDR10 combines HDR, the rec2020 color space and 10-bit color depth...

HDR itself is just ONE feature of this standard...

HDR means luminance levels from 0-1000 nits or more...
the brightness levels are adjusted via metadata in the source material...

while defining the HDR10 standard, 
the devs decided to use a wider color space than sRGB, to reduce the effect of colors appearing washed out above a certain brightness level... they also decided that 1024 shades per color (aka 10-bit color depth) would help against the even more obvious banding at higher brightness, and would give more defined black levels...

HDR itself doesn't rely on 10-bit color depth or the rec2020 color space...
but HDR10 combines all these factors as a defined standard...

also...
switching HDR ON in Windows does indeed lead to a change from sRGB to rec2020...
but again, that is just how they implemented it...

it would be perfectly doable to run HDR with sRGB and 8-bit color...
just as an added feature for more contrast...

all the other things (wider colorspace, 10-bit color depth) exist independently...
HDR10 just combines all these features!

so - to cut it short:
HDR itself does not pertain to an enhanced color space or color depth - 
the HDR10 standard does!

I found a link that explains it better than I can (in German):

https://www.eizo.de/praxiswissen/monitorwissen/hdr-im-detail-was-ist-hdr/

it's about the 5 factors that enhance picture quality (resolution, brightness, color gamut, frame rate, bit depth) and how HDR10 took 3 out of 5 to create a significant difference between 'older' displays and material ('standard' FHD TVs, Blu-rays, etc...) and newer ones (HDR TVs, UHD Blu-rays, HDR movies...)
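As a concrete illustration of the "brightness via metadata" part: HDR10 signals use the SMPTE ST 2084 "PQ" transfer function, which maps a code value to an absolute luminance from 0 to 10,000 nits. A sketch of the reference EOTF (standard published constants, my own code):

```python
# SMPTE ST 2084 (PQ) EOTF constants.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Map a normalized PQ signal value (0..1) to absolute luminance in nits."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ {code:4.2f} -> {pq_eotf(code):8.1f} nits")
```

Note how strongly it's weighted toward the dark end: half the signal range only gets you to roughly 90-ish nits, while the top quarter covers 1,000-10,000 nits.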


----------



## acmilangr

Did anyone try to compare 4:4:4 vs 4:2:2 on this monitor? Is there any difference?


----------



## Sichtwechsel86

acmilangr said:


> Did anyone try to compare 4:4:4 vs 4:2:2 on this monitor? Is there any difference?


Yes, it's visible on the desktop, but not so much while watching movies or playing games!


----------



## ToTheSun!

Sichtwechsel86 said:


> As far as i know HDR10 does combine HDR, rec2020 color space and 10bit color-depth...


I suppose that's true. I was hung up on your acronym decode.


----------



## guttheslayer

ToTheSun! said:


> I suppose that's true. I was hung up on your acronym decode.


HDR alone, from what I understand from a tech site, is the contrast difference between the blackest black and the whitest white. If a certain contrast difference is met, the HDR standard is met. Both HDR and 10-bit eat up a portion of the cable bandwidth.


HDMI 2.1 is the only connection that can support full HDR10 at 4K 144Hz, yet it doesn't support G-Sync, or any VRR on Nvidia cards. This is what makes me frustrated.



Also, didn't DP 1.4 promise DSC? (DSC is what separates DP 1.4 from DP 1.3.) What happened to it?


----------



## MiniZaid

For games, is there a difference between HDR 8-bit vs 10-bit? I thought games are coded in 8-bit colour. Are some games patched to 10-bit when HDR is on?


----------



## Sichtwechsel86

guttheslayer said:


> Also, didn't DP 1.4 promise DSC? (DSC is what separates DP 1.4 from DP 1.3.) What happened to it?


DSC just doesn't seem to be compatible with G-Sync, and it adds input lag in general...
so DSC is surely good for video content, but not so much for interactive gameplay...


----------



## Sichtwechsel86

MiniZaid said:


> For games, is there a difference between HDR 8-bit vs 10-bit? I thought games are coded in 8-bit colour. Are some games patched to 10-bit when HDR is on?


that's the question...

I really don't know if every game stays true to these specific HDR10 demands (HDR, rec2020 and 10-bit)

I asked a few posts ago 
whether bit depth is processed on the GPU (and therefore somewhat scalable, from 6-12 bit or so) 
or whether colors are instead graded during production of the game...

same question for color space...
are games graded in both sRGB AND rec2020 (DCI-P3),
or does the GPU adjust the sRGB colorspace to rec2020 - which would be an approximation of values rather than really using the wider color gamut...


----------



## bee144

Baasha said:


> Could definitely be that. New driver (as of today) could be a cause as well.
> 
> 
> 
> Gameplay is buttery smooth for me using HDR in 120Hz mode - getting that or higher FPS makes it really fun - finally what this monitor was made for. Did you also make sure to have HDR turned ON in Windows?
> 
> 
> 
> Sounds good.
> 
> I'm going to test other games and see what's up as well.





Glerox said:


> I can confirm the same results.
> 
> Went from 110 FPS to 130-160 FPS with g-sync OFF.
> Don't even need g-sync because it's steady over 120 FPS, using fast-sync.
> 
> I'm not surprised SLI+G-SYNC+HDR doesn't work well. It was the same with SLI at the beginning of regular G-SYNC years ago
> 
> Definitely my last SLI setup lol.


I contacted NVIDIA support. They had me 'manage 3d settings' for BF1 in the control panel. No luck with their settings. They then had me run DDU and again no luck.


I'm running the latest version of Windows (1803) with all patches, including yesterday's cumulative update.


They agreed to submit this as a bug and their team is looking into it now. They'll email me back with their findings. Let's cross our fingers. It would also probably help if they received other similar reports from both of you. Having worked for Microsoft support in the past, the more reports of an issue that we received, the more attention it got from the product group. http://www.nvidia.com/object/support.html

Also, I got an invite to the BFV Closed Alpha. Starting at 1:00 AM PDT tonight, we'll see if the issue persists in the new game as well. BFV is NVIDIA-sponsored whereas BF1 was not??? We'll see...


----------



## Baasha

bee144 said:


> I contacted NVIDIA support. They had me 'manage 3d settings' for BF1 in the control panel. No luck with their settings. They then had me run DDU and again no luck.
> 
> 
> I'm running the latest version of Windows (1803) with all patches, including yesterday's cumulative update.
> 
> 
> They agreed to submit this as a bug and their team is looking into it now. They'll email me back with their findings. Let's cross our fingers. It would also probably help if they received other similar reports from both of you. Having worked for Microsoft support in the past, the more reports of an issue that we received, the more attention it got from the product group. http://www.nvidia.com/object/support.html
> 
> Also, I got an invite to the BFV Closed Alpha. Starting at 1:00 AM PDT tonight, we'll see if the issue persists in the new game as well. BFV is NVIDIA-sponsored whereas BF1 was not??? We'll see...


Interesting... I'll submit a report as well.

How'd you manage to get an invite to the BFV Closed Alpha? Was there something you had to do (fill out a form etc.)?


----------



## bee144

Baasha said:


> bee144 said:
> 
> 
> 
> I contacted NVIDIA support. They had me 'manage 3d settings' for BF1 in the control panel. No luck with their settings. They then had me run DDU and again no luck.
> 
> 
> I'm running the latest version of Windows (1803) with all patches, including yesterday's cumulative update.
> 
> 
> They agreed to submit this as a bug and their team is looking into it now. They'll email me back with their findings. Let's cross our fingers. It would also probably help if they received other similar reports from both of you. Having worked for Microsoft support in the past, the more reports of an issue that we received, the more attention it got from the product group. http://www.nvidia.com/object/support.html
> 
> Also, I got an invite to the BFV Closed Alpha. Starting at 1:00 AM PDT tonight, we'll see if the issue persists in the new game as well. BFV is NVIDIA-sponsored whereas BF1 was not??? We'll see...
> 
> 
> 
> Interesting... I'll submit a report as well.
> 
> How'd you manage to get an invite to the BFV Closed Alpha? Was there something you had to do (fill out a form etc.)?

I think you just needed an EA account with the ability to receive emails from EA turned on. According to this dev tweet at least: https://twitter.com/jaqubajmal/status/1012005539642036224?s=21


----------



## Glerox

Anybody able to play Netflix in 4K HDR with this monitor?

I tried Netflix app and Edge browser, stuck to HD.

With DP cable, 6850K, Titan XP and latest drivers/updates.


----------



## bee144

Glerox said:


> Anybody able to play Netflix in 4K HDR with this monitor?
> 
> I tried Netflix app and Edge browser, stuck to HD.
> 
> With DP cable, 6850K, Titan XP and latest drivers/updates.


I can only get 1080p to play in Edge or the Windows Store app. Hmmm, do we need to use HDMI?


----------



## kx11

Glerox said:


> Anybody able to play Netflix in 4K HDR with this monitor?
> 
> I tried Netflix app and Edge browser, stuck to HD.
> 
> With DP cable, 6850K, Titan XP and latest drivers/updates.





Did you try the test pattern videos?!


----------



## HyperMatrix

Posted this in the X27 thread, but possibly important enough to post here. Can anyone test my theorycrafting since I don't have my monitor yet?

Wouldn't it make the most sense to have the desktop setting stay at 98Hz 10-bit 4:4:4 HDR, and just set in-game settings to 144Hz 4:2:2? Or would there be an issue with that? I wouldn't want to have to change settings each time. And under those conditions, I'd actually be very happy with the limitations of this monitor. Because for gaming, realistically, I doubt I'd be able to notice much of a difference dropping from 4:4:4 to 4:2:2.

Also... considering I'm unlikely to be able to max out any game at 4K 144Hz right now, would I not be able to maintain 144Hz 4:4:4 HDR if I dropped down to either 2560x1440 or 3200x1800? I know Vega said that due to the pixel density of 4K on a 27" screen, 2560x1440 looked superb to him. So why not go up a tad? 4:2:2 uses about 1/3 less bandwidth than 4:4:4. Or, another way to look at it: 4:4:4 requires 50% more bandwidth than 4:2:2. Which means 2560x1440 4:4:4 @144Hz is the same bandwidth as 3840x2160 4:4:4 @64Hz, and 3200x1800 4:4:4 @144Hz is the same bandwidth as 3840x2160 4:4:4 @100Hz.

Now, assuming we have a little wiggle room, especially with the use of reduced blanks, we should be able to squeeze 100Hz worth of 4k,4:4:4 HDR bandwidth. Or worst case scenario, do 141Hz refresh rate. But that way, we can have a constant desktop/gaming setting profile that allows full access to HDR, FALD, Quantum Dot, with 10-bit color, at 56.25% higher density than current 1440P displays most of us are probably using right now, while still getting 144Hz-ish refresh rates.
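The arithmetic above checks out: at the same chroma format and bit depth, bandwidth scales with pixels per second, so each lower resolution at 144Hz maps to an equivalent 4K refresh rate. A quick sanity check (my own sketch):

```python
# At fixed chroma/bit depth, link bandwidth scales with pixels per second,
# so we can convert any resolution+refresh into an "equivalent 4K Hz".
UHD_PIXELS = 3840 * 2160

def equivalent_uhd_hz(width, height, hz):
    """Refresh rate at 3840x2160 that carries the same pixel rate."""
    return width * height * hz / UHD_PIXELS

print(equivalent_uhd_hz(2560, 1440, 144))  # 64.0
print(equivalent_uhd_hz(3200, 1800, 144))  # 100.0
```

So 3200x1800 @144Hz 4:4:4 really does cost exactly as much bandwidth as 4K 4:4:4 @100Hz.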



UPDATE: Bestbuy.ca has now removed the pg27uq from their website completely.


----------



## saltedham

Glerox said:


> Anybody able to play Netflix in 4K HDR with this monitor?
> 
> I tried Netflix app and Edge browser, stuck to HD.
> 
> With DP cable, 6850K, Titan XP and latest drivers/updates.


Are you on the UHD Netflix plan or the regular 1080p plan? Did you turn on HDR in Windows first, before opening the Netflix app?


----------



## Sichtwechsel86

HyperMatrix said:


> Posted this in the X27 thread, but possibly important enough to post here. Can anyone test my theorycrafting since I don't have my monitor yet?
> 
> Wouldn't it make the most sense to have the desktop setting stay at 98Hz 10-bit 4:4:4 HDR, and just set in-game settings to 144Hz 4:2:2? Or would there be an issue with that? I wouldn't want to have to change settings each time. And under those conditions, I'd actually be very happy with the limitations of this monitor. Because for gaming, realistically, I doubt I'd be able to notice much of a difference dropping from 4:4:4 to 4:2:2.


I like the enhanced smoothness of 120Hz in desktop use...
also, there is no real reason to run the Windows desktop in 10-bit... 
because I am not a content creator, and nearly all my sources (except UHD movies) are 8-bit anyway...

I stick with 4K 120Hz 8-bit RGB rec709/sRGB for desktop and SDR gaming...
and switch to 4K 98Hz 10-bit RGB rec2020 for HDR gaming...

also: if HDR is ON in Windows, it affects all SDR material in a bad way (washed-out look, different colors, because the source is sRGB while Windows and the monitor now use rec2020)
yes, you can raise the brightness for SDR content via a slider in Windows, but that doesn't change the now-used rec2020 color space - which has an effect on sRGB colors...
(sRGB colors in the rec2020 colorspace look overblown, oversaturated and just wrong... faces get red, skies get more than blue - it really isn't what you want... and not true to the source)

my advice is:
stay true to the source!

Desktop and SDR gaming: 4K 120Hz 8-bit RGB rec709/sRGB
HDR sources: 4K 98Hz 10-bit RGB rec2020 for HDR gaming AND 4K 24Hz 10-bit RGB rec2020 for HDR movies


----------



## HyperMatrix

Sichtwechsel86 said:


> also: if HDR is ON in Windows, it affects all SDR material in a bad way (washed-out look, different colors, because the source is sRGB while Windows and the monitor now use rec2020)
> yes, you can raise the brightness for SDR content via a slider in Windows, but that doesn't change the now-used rec2020 color space - which has an effect on sRGB colors...
> (sRGB colors in the rec2020 colorspace look overblown, oversaturated and just wrong... faces get red, skies get more than blue - it really isn't what you want... and not true to the source)


This is a bug and is fixed in the new firmware for all monitors shipping in a couple weeks.


----------



## Sichtwechsel86

Glerox said:


> Anybody able to play Netflix in 4K HDR with this monitor?
> 
> I tried Netflix app and Edge browser, stuck to HD.
> 
> With DP cable, 6850K, Titan XP and latest drivers/updates.


You need the Netflix UHD subscription, 

for HDR you have to switch HDR ON in Windows,
then open the Netflix Windows app, 
and watch HDR content like 'Altered Carbon', etc... 

for me it worked perfectly...

newest Windows, 
newest GPU drivers, 

i7 5820K
16 GB DDR4 RAM
GTX 1080 Amp Extreme connected to the X27 via DP1.4

do YouTube HDR videos work for you???

same procedure here: first switch HDR on in Windows, then open Chrome and watch a YouTube HDR video... 
I recommend the 'TechnoZen Gaming' channel! 
Lots of HDR Let's Plays on his channel... even his intro video is HDR-graded!

does anyone have an idea how to record HDR gameplay videos, and in which format to upload them to YouTube?
I couldn't find out... the only thing I found was an external Atom Fire capture card, which works via HDMI and costs around 1200€ (which is way too much)


----------



## Sichtwechsel86

HyperMatrix said:


> This is a bug and is fixed in the new firmware for all monitors shipping in a couple weeks.


no, that's not the bug they are fixing...

they are fixing the mismatched sRGB YUV gamma curve, which leads to black crushing when using chroma subsampling in 4K 144Hz 8-bit 4:2:2 SDR

also, sRGB looking oversaturated and wrong in rec2020 is not a bug!
it's just how it works...
auto-converting sRGB into rec2020 leads to the following effect:

every maxed-out sRGB primary is shown as the maxed-out rec2020 primary...
but sRGB's max reds, blues and greens were never meant to be the same max that rec2020 shows...

the values of sRGB colors just don't map to the same colors in rec2020!

that's one reason to stay true to the source...
and one reason why professional monitors have separate options for different color spaces (sRGB, AdobeRGB, DCI-P3, rec2020)
because displaying sRGB in, let's say, the DCI-P3 colorspace just looks wrong... 
with some exceptions, the different colorspaces all use different values for the exact same color...
you would have to emulate the sRGB colorspace within DCI-P3 to adjust for it - but that is not what happens!
therefore - if you want to see colors as they are intended - use the right colorspace for the right source!
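To make that concrete, here are the CIE xy chromaticities of the two gamuts' primaries (standard published values, collected in my own sketch): a "fully saturated" sRGB red sent unconverted into a Rec.2020 pipeline gets displayed as the much more saturated Rec.2020 red.

```python
# CIE xy chromaticity coordinates of the RGB primaries for each color space.
# Sending sRGB code values straight into a Rec.2020 pipeline stretches every
# channel to the wider primaries, which is the oversaturation described above.
PRIMARIES = {
    "sRGB/Rec.709": {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)},
    "Rec.2020":     {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)},
}

for space, prim in PRIMARIES.items():
    coords = ", ".join(f"{ch}=({x:.3f}, {y:.3f})" for ch, (x, y) in prim.items())
    print(f"{space}: {coords}")
```

Correct handling would remap the sRGB values into the wider gamut (gamut mapping), so that "sRGB red" still lands on the same physical color - which is exactly what a plain passthrough skips.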


----------



## HyperMatrix

Sichtwechsel86 said:


> no, that's not the bug they are fixing...
> 
> they are fixing the mismatched sRGB YUV gamma curve, which leads to black crushing when using chroma subsampling in 4K 144Hz 8-bit 4:2:2 SDR


Sorry, can you explain that to me? HDR itself can run properly with 12-bit RGB or YCbCr444, for example. You can output it to a TV that handles the input and it looks great. But if I'm not mistaken, the HDR10 spec doesn't get triggered unless there is accompanying metadata present. So if the content you're watching isn't providing said metadata, then you shouldn't have the issue of switching to rec2020 while viewing SDR content. Trying to learn here, so please fill me in on what I'm missing.

edit: I think the reason I didn't run into this issue when hooked up to my TV is that I never activated Windows' HDR mode. I just enabled HDR inside games that support it, which seems a lot more logical. But how come my TV has the option to use the native color space or automatically adjust the color space even in HDR mode (which makes a noticeable difference)? Is that something my TV can do but this monitor is incapable of?


----------



## Sichtwechsel86

HyperMatrix said:


> Sorry, can you explain that to me? HDR itself can run properly with 12-bit RGB or YCbCr444, for example. You can output it to a TV that handles the input and it looks great. But if I'm not mistaken, the HDR10 spec doesn't get triggered unless there is accompanying metadata present. So if the content you're watching isn't providing said metadata, then you shouldn't have the issue of switching to rec2020 while viewing SDR content. Trying to learn here, so please fill me in on what I'm missing.
> 
> edit: I think the reason I didn't run into this issue when hooked up to my TV is because I never activated the windows HDR mode. I just enabled HDR inside games that support it. That seems a lot more logical.


it seems that you're mixing a few things up with each other... (if you know what I mean?)

first: HDR, as you said correctly, is high dynamic contrast and needs metadata...
second: rec709 and rec2020 are colorspaces...

third: HDR itself could run with up to 16-bit RGB if bandwidth were not a problem! for now, HDMI 2.0 goes up to 4K 60Hz 12-bit 4:2:2 HDR and DP1.4 up to 4K 144Hz 12-bit 4:2:2 HDR

the thing is: 
switching HDR ON in Windows changes the colorspace at the same time!
so yes - the Windows slider affects the general brightness of SDR content while HDR is switched ON... no metadata, no brightness fluctuation 
BUT: because Windows also changes the colorspace with this very switch, from sRGB to rec2020, colors do look mismatched - for the reasons I explained above...

if Windows didn't change the colorspace (simply by adding an extra switch for changing colorspaces...)
you could do what you described... given that you adjust the Windows brightness slider for SDR content in the right way!
but as it is now... you really have to switch your settings every time to match your source material...

of course you can run 4K 98Hz 10-bit RGB all the time...
and then just switch HDR on/off via Windows - 
but you will miss out on the extra smoothness of a possibly higher framerate...
and using 10-bit for sources that are 98% 8-bit is just unnecessary

and yes, TVs have these options 
my LG B7 OLED also has three colorspace options: 
Auto = normal = rec709 (a bug in the firmware they never fixed, but communicated! - it really does not switch colorspaces automatically!), 
enhanced (in between sRGB and rec2020), 
wide = native = rec2020

this monitor just uses sRGB for SDR content automatically...
but if Windows is switched to HDR ON, the monitor receives rec2020 colorspace information and therefore changes to rec2020...

in-game HDR settings sometimes work in a way where they just toggle HDR itself, leaving colorspace and color bit depth untouched...
BUT
if the HDR option in the game is intended to run with rec2020 and 10-bit - and was graded to HDR10 standards...
then you miss out on color shades and get inversely mismatched colors (rec2020 color values are just taken as if they were values inside sRGB, which leads to other errors)
some games do switch HDR and colorspace via in-game settings too...
but most games stick to SDR if HDR is not enabled in Windows...

the best thing would be if games in exclusive fullscreen automatically switched to 4K 144Hz 10-bit 4:2:2 HDR when it's enabled - and the monitor adjusted itself to match!
but then they would have to override the GPU drivers...
and for these specific gaming displays and GPUs, HDR grading happens within the GPU driver - not within the monitor, as it would on a TV getting an HDR10 stream from a console...
the reason for that is that HDR10 processing adds input lag - and Nvidia and AMD decided that in the PC ecosystem everything should happen within the GPU driver to keep input lag low...

for all these reasons it is best to manually switch settings based on the source material...


----------



## guttheslayer

Sichtwechsel86 said:


> it seems that you mistake some things with each another... (if that's grammatically correct and you know what i mean?)
> 
> first: HDR as you said correctly is high dynamic contrast and needs metadata...
> second: rec709 and rec2020 are colorspaces...
> 
> third: HDR itself could run with up to 16bit RGB if bandwidth would not be a problem ! for now HDMI2.0 is up to 4K 60hz 12bit 4:2:2 HDR and DP1.4 up to 4K 144hz 4:2:2 12bit HDR
> 
> the thing is:
> switching ON HDR in windows - does change colorspace at the same time!
> so yes - the windows slider effects general brightness for SDR content while HDR is switched ON ... no metadata no brightness-fluctuation
> BUT: because windows also changes colorspace with this very switch from sRGB to rec2020 colors do look mismatched - for reasons i explained above...
> 
> if windows wouldn't change colorspace (simply by adding an extra switch for changing colorspaces...)
> you could do, what you described... given that you adjust the windows brightness slider for SDR content in the right way!
> but as it is now... you really have to switch your settings every time to match your source material...
> 
> of course you can run 4K 98hz 10bit RGB all the time...
> and then just switch HDR on/off via windows -
> but you will miss out on the extra smoothness due to a possible higher framerate...
> and using 10bit for 98% 8bit sources is just unnecessary


So the best recommended setting for this display during gaming is to either run:

1) 98Hz with everything on.

OR

2) 120Hz with 8-bit HDR but everything else enabled.


BTW, I thought option 1 is better because there isn't any GPU that can push all games to a minimum of 98 FPS. Or rather, how many modern games can a Titan V push beyond 98 FPS at 4K with max settings (no AA)?


----------



## Sichtwechsel86

guttheslayer said:


> So the best recommended setting for this display during gaming is to either run:
> 
> 1) 98Hz with everything on.
> 
> OR
> 
> 2) 120Hz with 8 bits HDR but everything else enabled.
> 
> 
> BTW I thought option 1 is better because there isnt any GPU that can push all games to min. 98 FPS. Or rather, how many modern games can Titan V push that will exceed 98 FPS on 4K with max setting (no AA)?


I would recommend using 4K 120Hz 8-bit RGB SDR for desktop and all SDR games - 
and switching to 4K 98Hz 10-bit RGB for HDR games!

leave HDR OFF when you're not displaying HDR material!


----------



## guttheslayer

Sichtwechsel86 said:


> i would recommend to use 4K 120hz 8bit RGB SDR for desktop and all SDR games -
> and switch to 4K 98hz 10bit RGB for HDR games!
> 
> leave HDR OFF if you don't display HDR material!


I'm not even sure which games have HDR support, but constantly switching these modes will definitely annoy me. I would rather set one permanent mode for 24/7 use.


----------



## profundido

Glerox said:


> Anybody able to play Netflix in 4K HDR with this monitor?
> 
> I tried Netflix app and Edge browser, stuck to HD.
> 
> With DP cable, 6850K, Titan XP and latest drivers/updates.


besides the obvious that was already mentioned here (have a UHD subscription, etc...)

from the Windows 10 Store, buy (it's like 1€) and download+install "HEVC Video Extensions"

https://www.microsoft.com/en-us/p/hevc-video-extensions/9nmzlz57r3t7


See also my earlier posts where I mentioned it

edit: also for you @bee144


----------



## profundido

guttheslayer said:


> I am not even sure what games have HDR support, but for sure I will be very annoyed to constantly switch these mode. Would rather fixed a permanent mode for 24/7.


I said it in the X27 thread and I'll say it again here, in order to get a hassle-free 24/7 config with no need for further tweaking:

1. First, while HDR in Windows is disabled, use the monitor's nits and color profiles along with the Nvidia Control Panel to set your desired brightness level. This handles your default for all SDR content.
2. Now enable HDR in Windows and use its so-called "SDR brightness" slider to match the brightness you chose in step 1, since this greys out and overrides the brightness setting in the OSD. This sets your baseline for all HDR content and makes sure that your default brightness for HDR is the same as for SDR.
3. In this mode, optionally use the OSD color profiles or a custom user profile to further tune the brightness or colors. Now you're done with all defaults for HDR content.
4. If a certain HDR game has extra HDR options built in, use those settings to tailor the HDR brightness (for that game only), since they override the general default Windows settings. This step handles the exceptions to the general settings you defined earlier.

The order matters, because if you first tailor everything to one specific game using in-game HDR sliders, the next game without built-in HDR settings will use only the default Windows settings, look all wrong, and you will be readjusting forever like a Rubik's cube.


----------



## profundido

Baasha said:


> Interesting... I'll submit a report as well.
> 
> How'd you manage to get an invite to the BFV Closed Alpha? Was there something you had to do (fill out a form etc.)?


Thanks. Will you let us know the results, please? I'm really curious about their response.


----------



## Sichtwechsel86

guttheslayer said:


> I'm not even sure which games have HDR support, but constantly switching these modes will definitely annoy me. I would rather set one permanent mode for 24/7 use.


yeah... it's a bit annoying

but most of the time you will run 4K 120Hz 8-bit RGB SDR, for desktop use and most other content...
you only have to switch when you start an HDR game...
or when you watch an HDR movie

as for HDR games: you can find out whether a game supports HDR by just checking the in-game settings ---

if there is an option for HDR10/Dolby Vision, etc., it supports HDR - if not, it doesn't!

you can also check this list of HDR games for PC:

https://pcgamingwiki.com/wiki/Speci...Feature-2Fintro/outrotemplate=Feature-2Foutro

not many titles have HDR support on PC so far

I think all upcoming AAA titles will support HDR on every platform...
BFV, Anthem, Dying Light 2, Forza H 5, Gears 5, Metro Exodus, Shadow of the Tomb Raider, The Walking Dead - Final Season... etc etc...

if you plan on using a PS4 Pro or an Xbox One X with this monitor, there are many more games which support HDR right now!

https://www.xbox.com/de-DE/games/xbox-one/xbox-one-x-enhanced-list

in general: 
if it is a AAA title - it most likely supports HDR
if it's a AAA console-exclusive title - like Uncharted, Gears 4, Last of Us, Forza, etc... - it supports HDR


----------



## acmilangr

Newegg also removed the monitor.
Did I buy the last one?


----------



## Sichtwechsel86

acmilangr said:


> Newegg also removed the monitor.
> Did i buy the Last one?


Since Asus announced that it will be possible to update the firmware on your own later this year, alternate.de has started selling the PG27UQ again!


----------



## Glerox

Sichtwechsel86 said:


> You need Netflix UHD Abo,
> 
> for HDR you have to switch HDR ON via windows,
> then open the Netflix Windows App,
> and watch HDR content like 'Altered Carbon', etc...
> 
> for me it worked perfectly...
> 
> newest windows,
> newest GPU drivers,
> 
> i7 5820K
> 16 GB DDR4 RAM
> GTX 1080 Amp Extreme connected to X27 via DP1.4
> 
> youtube HDR videos work for you???
> 
> same procedure here: first witch HDR on via windows, then open chrome and watch a youtube hdr video...
> i recommend the 'TechnoZen Gaming' channel!
> Lots of HDR Let's Plays on his channel... even his intro-video is HDR-graded!
> 
> has anyone an idea of how to record HDR-gameplay-videos and in which format one has to upload these on yt?
> couldn't find it out ... the only thing i found was an external Atom Fire capture-card, which works via HDMI and costs at around 1200€ (which is way too much)


Yup, I did everything as you described and still no HDR for Netflix. YouTube works.

Will try to install the HEVC extensions like profundido said.


----------



## Kommando Kodiak

CSGO is way too dark on this monitor; there must be some setting I'm missing.


----------



## bee144

Kommando Kodiak said:


> CSGO is way too dark on this monitor; there must be some setting I'm missing.


"besides the obvious that was already mentioned here (have UDH subscription, etc...)

from the windows 10 store buy (it's like 1€) and download+install "hevc video extensions"

https://www.microsoft.com/en-us/p/he...s/9nmzlz57r3t7"

The above is a quote from profundido earlier in the thread, and I believe he's correct. I'm at work, but I'll buy and install the extension later today and report back, if you want to wait.


----------



## Kommando Kodiak

you might not have to pay anything, that's probably in the K-Lite codec pack


----------



## Glerox

profundido said:


> besides the obvious that was already mentioned here (have a UHD subscription, etc...)
> 
> from the windows 10 store buy (it's like 1€) and download+install "hevc video extensions"
> 
> https://www.microsoft.com/en-us/p/hevc-video-extensions/9nmzlz57r3t7
> 
> 
> See also my earlier posts where I mentioned it
> 
> edit: also for you @bee144


It works! Many thanks!

So ridiculous that you have to buy a 1 DOLLAR app in the WINDOWS STORE for 4K HDR to work LMAO


----------



## bee144

Glerox said:


> It works! Many thanks!
> 
> So ridiculous that you have to buy a 1 DOLLAR app in the WINDOWS STORE for 4K HDR to work LMAO


Same reason the Blu-ray app doesn't come by default on the Xbox: Microsoft only has to pay the royalties if you actually install the feature. In some cases they offer the service to the customer for free (the Blu-ray app) and eat the royalty fee; in other cases, such as this one, the royalty gets passed on down to the end user.


----------



## Glerox

HDR games seem to automatically output in 8-bit RGB at 120Hz.
I would like to compare it with 10-bit YCbCr 4:2:2, but it seems games don't let you choose the signal type.

Has anybody seen a visual difference between 8-bit RGB and 10-bit YCbCr 4:2:2 in HDR?

Thanks


----------



## Glerox

Guys, my e-peen is... EXCITED


----------



## badjz

Glerox said:


> Guys, my e-peen is... EXCITED


Nice! Do share the details, any good in HDR?


----------



## Glerox

badjz said:


> Nice! Do share the details, any good in HDR?


Yeah it's amazing. It looks real.


----------



## Baasha

Guys,

BF1 seems broken again for me.

Even with G-Sync disabled - my GPU usage has dropped to the 70% range again with the same settings as before.

There was a huge BF1 update a couple of days ago and I'm not sure if that messed things up?

In fact, it seems like many of the other games are affected as well. 

Is anyone else experiencing this?


----------



## kot0005

you guys have to see HDR games running in person... no point asking how it looks lol... but yes, it's more realistic than other crappy monitors in SDR.


----------



## kx11

> *Only available at Powered by Asus partners - You can't get this one at Amazon or Newegg!*



https://www.velocitymicro.com/wizard.php?iid=308


----------



## bee144

NVIDIA support is now claiming that BF1 never supported SLI because it's not listed at this link: https://www.geforce.com/games-appli...title ASC&sort_order=ASC&sort_by=title&page=2

LOL that link hasn't been updated since 2015.

I had to share their own release notes with them to prove that they claimed SLI support: http://us.download.nvidia.com/Windows/373.06/373.06-win10-win8-win7-desktop-release-notes.pdf

Also, installing the codec seemed to resolve the issue and Netflix plays in 4k HDR.


----------



## profundido

Glerox said:


> It works! Many thanks!
> 
> So ridiculous that you have to buy a 1 DOLLAR app in the WINDOWS STORE for 4K HDR to work LMAO


and @bee144

you're welcome guys, enjoy the neon lights in the Jessica Jones bar scene !  

and yes, I agree it's quite ridiculous that this isn't part of the standard (free) windows updates, or at least a free optional download. But for 1€ I will not complain - or at least not right after buying this monitor...


----------



## Sichtwechsel86

profundido said:


> and @bee144
> 
> you're welcome guys, enjoy the neon lights in the Jessica Jones bar scene !
> 
> and yes, I agree it's quite ridiculous that this isn't part of the standard (free) windows updates, or at least a free optional download. But for 1€ I will not complain - or at least not right after buying this monitor...


that is strange - 

never had to buy this extension - 

and everything worked for me out of the box...


----------



## profundido

Sichtwechsel86 said:


> that is strange -
> 
> never had to buy this extension -
> 
> and everything worked for me out of the box...


It's possible and makes sense. This codec has existed in the past as a free separate download. My guess is that somewhere along the way you installed it as part of other software or a codec package (maybe even K-Lite?), so you never noticed the requirement. On a clean install you would for sure.


----------



## Sichtwechsel86

profundido said:


> It's possible and makes sense. This codec has existed in the past as a free separate download. My guess is that somewhere along the way you installed it as part of other software or a codec package (maybe even K-Lite?), so you never noticed the requirement. On a clean install you would for sure.


hm... sounds right... but i never used k-lite - but maybe DVDfab5 came with it!
or VLC 3.x....?!

however - everything HDR related worked perfectly fine out of the box - 

but just via DP1.4 - via HDMI2.0 i sometimes get a purple screen when switching HDR on/off and then i have to switch back and forth until it vanishes

anyone else experienced this effect?


----------



## fun498

Sichtwechsel86 said:


> i would recommend to use 4K 120hz 8bit RGB SDR for desktop and all SDR games -
> and switch to 4K 98hz 10bit RGB for HDR games!
> 
> leave HDR OFF if you don't display HDR material!


For everyone worrying about / looking for a solution to all this switching: I found a little program that lets you set up profiles and switch with just a couple of clicks. Hope it helps: https://sourceforge.net/projects/monitorswitcher/

EDIT: For an even quicker/easier approach: the program supports command-line. So you can write a little script to toggle profiles and assign a hotkey (with your keyboard's software suite or hotkey software of choice) to run that file. I use a similar approach to switch audio devices between a USB headset and analog out to my speaker setup and it works great. 
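To make the hotkey approach concrete, here's a rough sketch of such a toggle script (my own illustration, not from the thread: the install path, profile filenames, and state-file location are all assumptions you'd adjust; the `-save:`/`-load:` flags are how Monitor Profile Switcher's docs describe its command line, but double-check on the project page):

```python
import subprocess
from pathlib import Path

# Assumed paths/filenames -- point these at your own install and at
# profiles you saved beforehand (e.g. "MonitorSwitcher.exe -save:sdr_120hz.xml").
SWITCHER = r"C:\Tools\MonitorSwitcher\MonitorSwitcher.exe"
STATE = Path.home() / ".monitor_profile_state"
PROFILES = ["sdr_120hz.xml", "hdr_98hz.xml"]

def next_profile(current: str) -> str:
    """Simple two-way toggle: return whichever profile isn't active."""
    return PROFILES[1] if current == PROFILES[0] else PROFILES[0]

def toggle() -> str:
    """Load the other saved profile via MonitorSwitcher and remember the choice."""
    current = STATE.read_text().strip() if STATE.exists() else PROFILES[1]
    target = next_profile(current)
    subprocess.run([SWITCHER, f"-load:{target}"], check=True)
    STATE.write_text(target)
    return target
```

Bind a call to `toggle()` to a hotkey with your keyboard's software suite (or AutoHotkey) and you get one-key switching between an SDR 120Hz profile and an HDR 98Hz profile.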

I don't have one of these bad boys yet because my local Microcenter is currently out (I was out of the country when they released, and wanted to wait for some reviews and decide between the Acer and Asus anyway), but they said they are receiving more stock soon. I can't wait to try this out with the monitor!


----------



## pez

guttheslayer said:


> I am not even sure what games have HDR support, but for sure I will be very annoyed to constantly switch these mode. Would rather fixed a permanent mode for 24/7.


No plans personally on getting this monitor at the current price point, but I've been lurking this thread and was thinking about this. I would imagine switching this manually is pretty annoying, considering TVs make the process pretty seamless (well, you're not constantly switching refresh rates - I'm more focused on changing HDR manually).

Outside of the windows settings, are any of these available to set in the NV control panel and link to specific games? That might alleviate some of the annoyance.


----------



## Sichtwechsel86

pez said:


> No plans personally on getting this monitor at the current price point, but I've been lurking this thread and was thinking about this. I would imagine switching this manually is pretty annoying, considering TVs make the process pretty seamless (well, you're not constantly switching refresh rates - I'm more focused on changing HDR manually).
> 
> Outside of the windows settings, are any of these available to set in the NV control panel and link to specific games? That might alleviate some of the annoyance.


i thought about this too - but no - 
at the moment you have to switch manually 
and maybe there are tools out there that work like a universal switch for toggling between specific settings...
but out of the box the experience leaves a bit to be desired...

but again - 
i don't think to make some mouseclicks for using specific content is a big issue - 

on consoles it works, because the TV-console ecosystem is a lot more synchronized through shared standards...

on PC though - every app/game/player would have to automatically override specific settings within windows and gpu-driver-settings
to get the experience to a point, where the whole system auto-adjusts itself to the needs of the content...

the only way i can imagine that is that every pc game would need an option to override gpu-settings with specific settings
resolution, 
max refreshrate, 
bitdepth and videonorm (RGB, YCbCr 444/422/420)
vsync/fastsync/freesync/gsync, 
framelimit, 
HDR (HDR10, HLG, HDR10+, DolbyVision)

and i have been waiting forever for this option to be included in games...

but it would only work when running in exclusive fullscreen mode, and the same goes for every other player (for movies, etc...)...

and at least, that is what happens on consoles -
you start one player in exclusive fullscreen and it overrides the base settings of console for its played content...
(base setting: 4K 60hz 8bit RGB FULL - BluRayPlayer-App on X1X: 4K 24hz 10bit 422 limited and HDR)
you start one game in exclusive fullscreen and it overrides the base settings for its played game, 
(base: 4K 60hz 8bit RGB FULL - game: 4K 60hz 10bit 422 limited HDR)

on pc that would mean, 
that every window-mode would interfere with base settings (windows and NVCP or AMD Catalyst), 
and that every fullscreen mode would make it impossible to run different apps with different specific settings at the same time and switch between them on-the-fly...

so i think for games it would make sense to implement override-features for exclusive fullscreen - 
and for movieplayers too...

maybe we will see that in the future!


----------



## acmilangr

Damn. The courier will deliver the package on Monday.


----------



## kot0005

bee144 said:


> NVIDIA support is now claiming that BF1 never supported SLI because it's not listed at this link: https://www.geforce.com/games-appli...title ASC&sort_order=ASC&sort_by=title&page=2
> 
> LOL that link hasn't been updated since 2015.
> 
> I had to share their own release notes with them to prove that they claimed SLI support: http://us.download.nvidia.com/Windows/373.06/373.06-win10-win8-win7-desktop-release-notes.pdf
> 
> Also, installing the codec seemed to resolve the issue and Netflix plays in 4k HDR.


Most people don't have a clue about their own company... I have seen a lot of EB Games staff who have no idea about some games or collector's edition stuff...


----------



## guttheslayer

Sichtwechsel86 said:


> i thought about this too - but no -
> at the moment you have to switch manually
> and maybe there are tools out there that work like a universal switch for toggling between specific settings...
> but out of the box the experience leaves a bit to be desired...
> 
> but again -
> i don't think to make some mouseclicks for using specific content is a big issue -
> 
> on consoles it works, because the TV-console ecosystem is a lot more synchronized through shared standards...
> 
> on PC though - every app/game/player would have to automatically override specific settings within windows and gpu-driver-settings
> to get the experience to a point, where the whole system auto-adjusts itself to the needs of the content...
> 
> the only way i can imagine that is that every pc game would need an option to override gpu-settings with specific settings
> resolution,
> max refreshrate,
> bitdepth and videonorm (RGB, YCbCr 444/422/420)
> vsync/fastsync/freesync/gsync,
> framelimit,
> HDR (HDR10, HLG, HDR10+, DolbyVision)
> 
> and i have been waiting forever for this option to be included in games...
> 
> but it would only work if running in exclusive fullscreen mode and for every other player (for movies, etc...) goes the same...
> 
> and at least, that is what happens on consoles -
> you start one player in exclusive fullscreen and it overrides the base settings of console for its played content...
> (base setting: 4K 60hz 8bit RGB FULL - BluRayPlayer-App on X1X: 4K 24hz 10bit 422 limited and HDR)
> you start one game in exclusive fullscreen and it overrides the base settings for its played game,
> (base: 4K 60hz 8bit RGB FULL - game: 4K 60hz 10bit 422 limited HDR)
> 
> on pc that would mean,
> that every window-mode would interfere with base settings (windows and NVCP or AMD Catalyst),
> and that every fullscreen mode would make it impossible to run different apps with different specific settings at the same time and switch between them on-the-fly...
> 
> so i think for games it would make sense to implement override-features for exclusive fullscreen -
> and for movieplayers too...
> 
> maybe we will see that in the future!



If I had the monitor, I think I'd just set it to 98Hz in HDR10 4:4:4 with G-Sync on and leave it there 24/7. Since no single GPU currently has the horsepower to saturate DP 1.4 (98Hz already does that), it would be fine to leave it as it is.


I've heard switching to HDR10 alone can also drop FPS performance.


----------



## Sichtwechsel86

guttheslayer said:


> If I had the monitor, I think I'd just set it to 98Hz in HDR10 4:4:4 with G-Sync on and leave it there 24/7. Since no single GPU currently has the horsepower to saturate DP 1.4 (98Hz already does that), it would be fine to leave it as it is.
> 
> 
> I've heard switching to HDR10 alone can also drop FPS performance.


if you leave it there you will be using Rec. 2020 all the time - 
leading to mismatched colors should you try to display sRGB content!

at least switch HDR on/off based on the content you use!


----------



## Sichtwechsel86

Oh by the way...

on the Acer X27 there is an option to show the actual refresh rate of the monitor as a yellow number in the upper right corner...
while using G-Sync it works like an fps counter...

is there an option for that on the PG27UQ too?


----------



## Bloodmosher

Bloodmosher said:


> After a week of testing, here are my thoughts on this monitor:
> 1. BF1 in HDR is so good, I have a hard time playing in SDR
> 2. In BF1, I cannot see any noticeable difference between 98hz HDR 10-bit and 120hz HDR 8-bit w/dithering, so I play at 120hz
> 3. With a pair of Titan XPs in SLI I can get 95-105 fps on complex multiplayer maps, HDR 120hz Ultra everything except anti-aliasing post, which I leave at FXAA High
> 4. On the desktop, the only way I can match white brightness levels of my other monitor (PG27AQ) is to enable HDR and crank up the reference white setting to 100 nits. In SDR it is just a little dimmer than I would like.
> 5. Out of the box the colors are way too yellow, but this seems to be true for every monitor I buy.
> 6. With HDR on, Chrome is gray, and the only way to fix it seems to be to disable hardware acceleration in Chrome. But this makes Chrome unusable in my opinion. Chrome needs to be fixed.
> 7. No panel is ever perfect in my experience, and this one seems to have some consistency issues, which for me is especially noticeable on white backgrounds (most of the web, Outlook, Excel, etc.). Basically, the lower half of the monitor looks more yellow than the top half when viewing whites. I haven't yet decided whether this is annoying enough to try to return/exchange. Like I said, no panel is ever perfect and all my other monitors have some similar consistency issue.
> 8. My settings are: HDR on, Nits 100, Contrast 55, (r,g,b): 53,53,100, Windows SDR brightness: 46
> 
> Anyone else notice the consistency issue I see in #7? Also, if you have settings you consider ideal, I would love to know what you are using.


This took a few days for a moderator to approve so may have been lost in the shuffle- is anyone else noticing color consistency issues, with white in particular as described above? If so does it bother you enough to return/exchange or not?


----------



## acmilangr

https://gzhls.at/i/13/46/1571346-n0.jpg
Is there any way to find this background wallpaper?


----------



## acmilangr

Bloodmosher said:


> This took a few days for a moderator to approve so may have been lost in the shuffle- is anyone else noticing color consistency issues, with white in particular as described above? If so does it bother you enough to return/exchange or not?


This is a good question. 
I will check it on Monday when I get mine.


----------



## bee144

NVIDIA has moved the BF1 G-Sync SLI issue to tier 2 support. Said it might be awhile till we hear back.


----------



## Glerox

bee144 said:


> NVIDIA has moved the BF1 G-Sync SLI issue to tier 2 support. Said it might be awhile till we hear back.


Thanks for contacting them


----------



## Glerox

I tried 4K HDR capture with shadowplay :






It captures oversaturated SDR instead. 

Does anybody know how to capture HDR gameplay, apart from a $1000 video capture card?


----------



## kx11

Glerox said:


> I tried 4K HDR capture with shadowplay :
> 
> https://www.youtube.com/watch?v=euaAhuYF2F4&lc=z224u5gbbvbzd5gxp04t1aokgznhl3accncappyvlaperk0h00410
> 
> It captures oversaturated SDR instead
> 
> Does anybody know how to capture HDR gaming apart from a 1000$ video capture card?



there's a $300 capture card:

https://www.newegg.com/Product/Prod...vermedia_live_gamer_4k-_-15-100-185-_-Product

it's $400 now, but when I bought it 3 days ago it was $300


----------



## HyperMatrix

BestBuy.ca cancelled the PG27UQ orders, sent out $100 e-certificates as compensation. It's no longer listed on their website either.

_"Thanks for placing an order for ASUS ROG 27" 4K UHD 144Hz 4ms GTG IPS LED G-Sync Gaming Monitor(PG27UQ)-Plasma Copper/Armor Titanium. However, despite our best efforts we weren't able to secure inventory for the product so we've been forced to cancel the order. 
We're very sorry for this, and we'll make sure your method of payment is not charged.
To apologize we'd like to offer you a $100 eCertificate, which you can use on almost any future purchase at BestBuy.ca. In order to use your eCertificate you must have an account on BestBuy.ca and a valid credit card*."_


----------



## Sichtwechsel86

Today i got my PG27UQ

and i have to say - 

i am impressed - 
compared to my X27 it has less backlight bleed, 
therefore better FALD and less haloing, 
and a more intuitive menu...

also my model already has the updated FW and no black crush at 144hz...

i would definitely keep this one - 
but unfortunately... 
it has many, many dead pixels...

i counted 18 (!!!) so far...

some of them are dead black dots, some seem to be ON all the time, and some have subpixel issues...

so i have to return it again - 
but from what i saw: i'm sticking with the PG27UQ, and hopefully my next panel won't have defective pixels and maybe even less BLB than this one!

Again: 
it could very well be that my Acer X27 was a very bad unit - 
but now that i've had the chance to inspect both models - i go with Asus...

sooo...

once more - return, exchange...
3rd G-Sync HDR monitor then - 2nd PG27UQ - wish me luck!


----------



## Glerox

Sichtwechsel86 said:


> Today i got my PG27UQ
> 
> and i have to say -
> 
> i am impressed -
> compared to my X27 it has less backlight bleed,
> therefore better FALD and less haloing,
> the more intuitive menu...
> 
> also my model already has an updated FW and no Blackcrush in 144hz...
> 
> i would definitely keep this one -
> but unfortunately...
> it has many many dead pixels...
> 
> i counted 18 (!!!) so far...
> 
> some of them are dead black dots, some of them seem to be ON all time and some have subpixel issues...
> 
> so i have to return it again -
> but from what i saw: i'm sticking with the PG27UQ, and hopefully my next panel won't have defective pixels and maybe even less BLB than this one!
> 
> Again:
> it could very well be - that my ACER X27 was a very bad model -
> but now that i had the chance to inspect both models - i go with Asus...
> 
> sooo...
> 
> once more - return, exchange...
> 3rd GSync HDR monitor then - 2nd PG27UQ - wish me luck!


Damn you're so unlucky.
No problem with mine.
I hope they'll release the firmware update tool soon for current owners.


----------



## Glerox

kx11 said:


> there's a 300$ capture card
> 
> 
> 
> 
> https://www.newegg.com/Product/Prod...vermedia_live_gamer_4k-_-15-100-185-_-Product
> 
> 
> 
> 
> it's 400$ now but when bought it 3 days ago it was 300$


interesting thanks


----------



## acmilangr

Glerox said:


> interesting thanks


I will get mine on Monday. 
Could you please tell me: what is this black crush you mention? How do I test it? I want to know whether mine has the latest firmware or not.


----------



## Glerox

acmilangr said:


> I will get mine on Monday.
> Could you please tell me: what is this black crush you mention? How do I test it? I want to know whether mine has the latest firmware or not.


Put your monitor in 144Hz SDR mode.
Go to this page: http://www.lagom.nl/lcd-test/black.php

You're supposed to see every square in a different shade of gray.
If some of them are completely black, you have black crush, which means the firmware is not updated.
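If you'd rather not rely on a website, a few lines of Python can generate a similar near-black chart locally (editor's sketch, not from the thread; the twelve squares step through low grey levels in the spirit of the Lagom page, not its exact values):

```python
# Write a near-black test chart as a plain-text PPM image (stdlib only):
# 12 squares stepping from RGB(2,2,2) up to RGB(24,24,24) on a pure black
# background. View it full-screen at 144Hz SDR; if the darkest squares
# vanish into the background, the panel is crushing blacks.
def make_black_level_chart(path="black_levels.ppm", cols=6, rows=2, size=120, pad=20):
    w = cols * (size + pad) + pad
    h = rows * (size + pad) + pad
    pixels = [[0] * w for _ in range(h)]  # one grey level per pixel
    for r in range(rows):
        for c in range(cols):
            level = (r * cols + c + 1) * 2  # 2, 4, ..., 24
            x0, y0 = pad + c * (size + pad), pad + r * (size + pad)
            for y in range(y0, y0 + size):
                for x in range(x0, x0 + size):
                    pixels[y][x] = level
    with open(path, "w") as f:
        f.write(f"P3\n{w} {h}\n255\n")  # plain-PPM header: magic, size, maxval
        for row in pixels:
            f.write(" ".join(f"{v} {v} {v}" for v in row) + "\n")
    return pixels

make_black_level_chart()
```

Open the resulting `black_levels.ppm` in any image viewer that supports PPM (IrfanView, GIMP, etc.) and display it full-screen.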


----------



## acmilangr

Glerox said:


> Put your monitor in 144Hz SDR mode.
> Go on this page : http://www.lagom.nl/lcd-test/black.php
> 
> You're supposed to see every square with different shades of gray.
> If you see that some are all black, you have black crush, hence the firmware is not updated.


Thanks a lot.


----------



## jesyjames

Glerox said:


> Put your monitor in 144Hz SDR mode.
> Go on this page : http://www.lagom.nl/lcd-test/black.php
> 
> You're supposed to see every square with different shades of gray.
> If you see that some are all black, you have black crush, hence the firmware is not updated.


It's worth noting that I was able to make an ICC profile and alleviate much of the crush while we wait for the firmware.


----------



## saltedham

Glerox said:


> Put your monitor in 144Hz SDR mode.
> Go on this page : http://www.lagom.nl/lcd-test/black.php
> 
> You're supposed to see every square with different shades of gray.
> If you see that some are all black, you have black crush, hence the firmware is not updated.


i got black crush at 144 SDR. not a biggie to me because i run it at 120hz so i get the full 4:4:4 colors



what do you guys have your gamma set to, is 2.2 the best?


----------



## Sichtwechsel86

saltedham said:


> i got black crush. 144 sdr. not a biggie to me cause i have it at 120hz so i get the 444colors
> 
> 
> 
> what do you guys have your gamma set to, is 2.2 the best?


i would stick with 2.2...

it's how the display was pre-calibrated for SDR sRGB...

tell me people, what color temp do you prefer and why? (cold, normal, warm)


----------



## saltedham

Sichtwechsel86 said:


> i would stick with 2.2...
> 
> it's how the display was pre-calibrated for SDR sRGB...
> 
> tell me people, what color temp do you prefer and why? (cold, normal, warm)


i'm using warm. it feels better on my eyes. i'm not so sure i like hdr. it feels like it's blinding me sometimes


----------



## Sichtwechsel86

saltedham said:


> im using warm. it feels better on my eyes. im not so sure i like hdr. it feels like its blinding me sometimes


yeah - i know what you mean...

there should be an option to limit the maximum brightness in HDR - 

i have watched a BFV HDR video - and i couldn't watch the whole 5 minutes - because my eyes were hurting...
(all the snow and the sun - yeah it's blinding me in real life too, but then i wear sunglasses...! do i really need to wear sunglasses in front of my display?)

and from a perspective of a medical student - high brightness straight to your retina is not good at all!
that's why most people lower the brightness and choose a warm color temp or even a blue light filter

even knowing it's better - i choose normal color temp - because i really don't like the yellowish whites - 
somehow it feels wrong - 
and after some time i have the feeling of looking through a colored lens...

i like white to be white... not yellow-white

also i have asked myself...
movies are produced with a 6500K white point in mind - right?
so what's more true to the source material: color temp warm or normal?
with color temp warm the white point should be around 6500K - with normal, a bit colder...
but how does this affect the source material?


----------



## kot0005

saltedham said:


> im using warm. it feels better on my eyes. im not so sure i like hdr. it feels like its blinding me sometimes





Sichtwechsel86 said:


> yeah - i know what you mean...
> 
> there should be an option to limit the maximum brightness in HDR -
> 
> i have watched a BFV HDR video - and i couldn't watch the whole 5 minutes - because my eyes were hurting...
> (all the snow and the sun - yeah it's blinding me in real life too, but then i wear sunglasses...! do i really need to wear sunglasses in front of my display?)
> 
> and from a perspective of a medical student - high brightness straight to your retina is not good at all!
> that's why most people lower the brightness and choose warm color temp or even blue-light-filter
> 
> even knowing it's better - i choose normal color temp - because i really don't like the yellowish whites -
> somehow it feels wrong -
> and after some time i have the feeling to watch through a colored lense...
> 
> i like white to be white... not yellow-white
> 
> also i have asked myself...
> movies are produced with 6500k white point in mind - right?
> so what's more true to the source-material? using color temp warm or normal?
> with color temp warm - the whitepoint should be around 6500K - with normal a bit colder...
> but how does this affect the source material?


I think it's just bad implementation... I like HDR but it needs tweaking. Like in SWBF2, when you open star cards they literally flash like a camera flash. Sometimes during loading screens you get a pure white screen at 1000 nits for a few milliseconds... same with Destiny: a white screen at 1k nits during some load screen transitions.


----------



## Bloodmosher

acmilangr said:


> This is good question.
> I will check it on monday that i will get it


My PG279Q is far better for desktop use - it is brighter in SDR; the only way I can get the PG27UQ to have whites as bright is to turn on HDR, but that breaks Chrome - it shows gray unless you disable hardware acceleration, which makes Chrome unusable. And the white consistency issue is starting to drive me crazy; I've reached out to ASUS to see if they'll swap this thing. If not, I may return it. It's a tough one - it is sooo good for BFV, but I was really hoping my center monitor would be "the" one for both desktop and games.


----------



## kot0005

Bloodmosher said:


> My PG279Q is far better for desktop use - it is brighter in SDR; the only way I can get the PG27UQ to have whites as bright is to turn on HDR, but that breaks Chrome - it shows gray unless you disable hardware acceleration, which makes Chrome unusable. And the white consistency issue is starting to drive me crazy; I've reached out to ASUS to see if they'll swap this thing. If not, I may return it. It's a tough one - it is sooo good for BFV, but I was really hoping my center monitor would be "the" one for both desktop and games.


The PG279Q sucks in every aspect. Its desktop is super bright in SDR; i had to reduce brightness to 60..


Does anyone else have a thicker black border on the top of the panel compared to the sides, or is it just mine? not talking about bezels here, but the black zones on all sides of the panel.


----------



## Sichtwechsel86

kot0005 said:


> PG279q sucks in every aspect. The desktop is super bright in SDR, i had to reduce brightness to 60..
> 
> 
> Does anyone else have a thicker black border on the top of the panel compared to the sides, or is it just mine? not talking about bezels here, but the black zones on all sides of the panel.


my PG27UQ has nearly perfect brightness-uniformity!

i think it's a defect on your unit...!

i am so sad about the dead pixels on my unit - in terms of all other factors this particular unit is as good as it can be...
just a hint of BLB on the right side, FALD working good and fast, colors, everything is fine...

if only these dead pixels weren't right in the center... :/


----------



## kot0005

Sichtwechsel86 said:


> my PG27UQ has nearly perfect brightness-uniformity!
> 
> i think it's a defect on your unit...!
> 
> i am so sad about the dead pixels on my unit - in terms of all other factors this particular unit is as good as it can be...
> just a hint of BLB on the right side, FALD working good and fast, colors, everything is fine...
> 
> if not these dead pixels would be right in the center... :/


lol, you should read my post again xD - I am talking about the black border at the top. I don't have uniformity issues.

The black border at the top measures around 3-4.5mm for me


----------



## Babryn25

Sichtwechsel86 said:


> my PG27UQ has nearly perfect brightness-uniformity!
> 
> i think it's a defect on your unit...!
> 
> i am so sad about the dead pixels on my unit - in terms of all other factors this particular unit is as good as it can be...
> just a hint of BLB on the right side, FALD working good and fast, colors, everything is fine...
> 
> if not these dead pixels would be right in the center... :/


I think you have my unit! I replaced X27 with PG27UQ. Everything is so much better than X27 in all regards. Slight BLB on right side and two spots with dust or bad pixels (not sure which one) in the middle. I am so depressed right now.


----------



## acmilangr

Hello.
I just got mine

Some questions please:

in the Nvidia Control Panel, what is best to choose for output color format: "RGB" or "444"?
if i choose "444", then "output dynamic range" only offers the "limited" option

What is the OSD option to activate/deactivate FALD?


----------



## pez

Sichtwechsel86 said:


> i thought about this too - but no -
> at the moment you have to switch manually
> and maybe there are tools out there that work like a universal switch for toggling between specific settings...
> but out of the box the experience leaves a bit to be desired...
> 
> but again -
> i don't think to make some mouseclicks for using specific content is a big issue -
> 
> on consoles it works, because the TV-console ecosystem is a lot more synchronized through shared standards...
> 
> on PC though - every app/game/player would have to automatically override specific settings within windows and gpu-driver-settings
> to get the experience to a point, where the whole system auto-adjusts itself to the needs of the content...
> 
> the only way i can imagine that is that every pc game would need an option to override gpu-settings with specific settings
> resolution,
> max refreshrate,
> bitdepth and videonorm (RGB, YCbCr 444/422/420)
> vsync/fastsync/freesync/gsync,
> framelimit,
> HDR (HDR10, HLG, HDR10+, DolbyVision)
> 
> and i have been waiting forever for this option to be included in games...
> 
> but it would only work if running in exclusive fullscreen mode and for every other player (for movies, etc...) goes the same...
> 
> and at least, that is what happens on consoles -
> you start one player in exclusive fullscreen and it overrides the base settings of console for its played content...
> (base setting: 4K 60hz 8bit RGB FULL - BluRayPlayer-App on X1X: 4K 24hz 10bit 422 limited and HDR)
> you start one game in exclusive fullscreen and it overrides the base settings for its played game,
> (base: 4K 60hz 8bit RGB FULL - game: 4K 60hz 10bit 422 limited HDR)
> 
> on pc that would mean,
> that every window-mode would interfere with base settings (windows and NVCP or AMD Catalyst),
> and that every fullscreen mode would make it impossible to run different apps with different specific settings at the same time and switch between them on-the-fly...
> 
> so i think for games it would make sense to implement override-features for exclusive fullscreen -
> and for movieplayers too...
> 
> maybe we will see that in the future!


Yeah, these are the growing pains I expected, as I knew about the monitor and followed the news up to its release. A tool to allow a simple double-click icon on the desktop for setting changes would be pretty fantastic. That, or make it easy to choose it as a 'preset' on the monitor itself.



guttheslayer said:


> I think I'd just set it to 98 Hz in HDR10 4:4:4 with G-Sync on and leave it at that 24/7 if I had the monitor. Since no single GPU currently has the horsepower to saturate DP 1.4 (the 98 Hz mode is what just fills it), it would be good to leave it as it is.
> 
> 
> I heard switching to HDR10 alone can also drop FPS performance.


The issue is that you'll get washed-out colors this way for non-HDR content. There's seemingly a con for each 'solution', but not something I think we'll have to compromise on forever.


----------



## Sichtwechsel86

pez said:


> A tool to allow a simple 'double-click' icon on the desktop for setting changes would be pretty fantastic. That or make it easy to choose it as a 'preset' on the monitor itself.


A preset in the GPU driver (Nvidia Control Panel) would be nice... like some user profiles in mouse- and sounddrivers...

another thing:

i was informed earlier that all new units have new firmware - but today the serviceman told me -
they just decided to sell the units they had in stock - because Asus announced that new FWs will be available to end users later this year...
AAAND: Asus is STILL working on this updated FW, which - according to his info - means: it's not done yet!


----------



## bee144

Sichtwechsel86 said:


> A preset in the GPU driver (Nvidia Control Panel) would be nice... like some user profiles in mouse- and sounddrivers...
> 
> another thing:
> 
> i was informed earlier, that all new units have new firmware - but today the serviceman told me -
> they just decided to sell the units they had in stock - because Asus announced, that new FWs will be available for endusers later this year...
> AAAND: Asus is STILL working on this updated FW, which - to his info - means: it's not done yet!


That's fine with me. The monitors feel 95% complete. I'd rather they spend some time and release a firmware in fall/winter of 2018 and really ensure they get the firmware as close to 100% as possible. It's a great monitor, just needs a firmware update for the bugs.


Also, not to get ahead of ourselves, but since it appears ASUS has user-flashable firmware, does that mean we could potentially flash the monitor to DP 1.5? *Starts dreaming*


----------



## pez

Sichtwechsel86 said:


> A preset in the GPU driver (Nvidia Control Panel) would be nice... like some user profiles in mouse- and sounddrivers...
> 
> another thing:
> 
> i was informed earlier, that all new units have new firmware - but today the serviceman told me -
> they just decided to sell the units they had in stock - because Asus announced, that new FWs will be available for endusers later this year...
> AAAND: Asus is STILL working on this updated FW, which - to his info - means: it's not done yet!


A preset you could set and link to specific games would be ideal, I think. If I saw that, I would actually start considering a monitor of this caliber.


----------



## Glerox

Guys, two problems I found and would like to have your thoughts about it.

1. As someone else posted in the X27 thread, the Lagom gamma calibration test is way off from 2.2.
http://www.lagom.nl/lcd-test/gamma_calibration.php
Can anyone else confirm this? Why is the gamma so far off?

2. A new problem I found: when enabling 144 Hz in HDR (instead of 120 Hz or 98 Hz), it lowers the framerate by a good 5-10 fps! I tested BF1 and Far Cry 5 and both games show the same results.
It's as if chroma subsampling requires GPU resources and lowers the framerate!
It's really counter-productive, because why bother increasing the refresh rate of the monitor if it lowers the framerate of the game...
It would be nice if someone could also confirm this.


----------



## CallsignVega

Well I'll be damned. Wouldn't have believed it unless I tested it myself. On average, between AC Origins and RE7, using YCbCr 4:2:2 with my Titan V caused a ~7% performance loss versus RGB or 4:4:4. I take it there is some processing the GPU has to do to convert for HDR, since basically everything on PC is full chroma.

I guess you could still use 144 Hz for SDR games if you have the updated firmware. With this new information 98 Hz RGB Full 10-bit should be the proper HDR mode.
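The 98/120/144 Hz mode split discussed above falls out of simple link-bandwidth arithmetic. As a rough sketch (the numbers below ignore blanking overhead, so real limits sit slightly lower): DP 1.4's HBR3 link carries 32.4 Gbps raw, or about 25.9 Gbps of pixel data after 8b/10b encoding.

```python
# Rough DP 1.4 bandwidth check for the PG27UQ's modes (ignores blanking
# overhead, so treat the results as approximate).
DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10  # HBR3 raw 32.4 Gbps, 8b/10b encoding

def needed_gbps(w, h, hz, bits_per_channel, channels=3):
    """Raw pixel-data rate in Gbps for an uncompressed RGB/4:4:4 mode."""
    return w * h * hz * bits_per_channel * channels / 1e9

for hz, bpc in [(98, 10), (120, 10), (120, 8), (144, 8)]:
    need = needed_gbps(3840, 2160, hz, bpc)
    fits = "fits" if need <= DP14_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K {hz} Hz {bpc}-bit RGB: {need:.1f} Gbps -> {fits}")
```

This matches the thread's observations: 10-bit RGB fits at 98 Hz but not 120 Hz, 8-bit RGB fits at 120 Hz but not 144 Hz, which is why 144 Hz HDR falls back to 4:2:2 subsampling.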


----------



## acmilangr

Is there any difference between YCbCr 4:4:4 and RGB? what is better?

Also, why does YCbCr 4:4:4 offer only the limited output dynamic range option?


----------



## Monstieur

*4:4:4 is not RGB*



acmilangr said:


> Is there any difference between YCbCr 4:4:4 and RGB? what is better?
> 
> Also why on YCbCr 4:4:4 it has only limited output dynamic range option?


YCbCr444 is inferior to RGB in both 8-bit and 10-bit mode. They are completely different colour formats. YCbCr is limited range by design. Limited range was only added as an option in RGB because of crappy TVs that didn't map RGB to full range. Subsampling is inherently impossible with RGB.

There is no need to output 10-bit to the monitor even for HDR10. It's sufficient to run 8-bit RGB with dithering. Only the game needs to render to a 10-bit DirectX 11 surface. However, if the game (incorrectly) switches to 8-bit mode when HDR is enabled because the monitor is in 8-bit mode, you will have to run the monitor in 10-bit YCbCr422 to force the game to render in 10-bit and prevent banding.
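The limited-range point can be made concrete with a toy calculation (my own illustration, using the BT.709 studio-swing numbers): mapping black-to-white into codes 16..235 throws away roughly an eighth of an 8-bit channel's steps.

```python
# BT.709 "limited" (video) range maps black..white to codes 16..235,
# so a limited-range 8-bit channel uses only 220 of its 256 steps.
def full_to_limited(code_full):
    """Map a full-range 8-bit code (0..255) to limited range (16..235)."""
    return round(16 + code_full * 219 / 255)

limited_codes = {full_to_limited(c) for c in range(256)}
print(len(limited_codes))  # 220 distinct codes instead of 256
```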


----------



## Glerox

CallsignVega said:


> Well I'll be damned. Wouldn't have believed it unless I tested it. On average, between AC Origins and RE7, using YCbCr 4:2:2 with my Titan V caused a ~7% performance loss versus RGB or 4:4:4. I take it there is some processing the GPU is having to do to convert (HDR) since basically everything PC is full chroma.
> 
> I guess you could still use 144 Hz for SDR games if you have the updated firmware. With this new information 98 Hz RGB Full 10-bit should be the proper HDR mode.


Yup, quite disappointing... or you could use HDR 8-bit RGB at 120 Hz, but then you might get fewer colors and banding (I still have to test whether I really see a difference between 8-bit and 10-bit HDR).


----------



## kot0005

pez said:


> Yeah, these are the growing pains I expected as I knew about hte monitor and ultimately followed the news until its' release. A tool to allow a simple 'double-click' icon on the desktop for setting changes would be pretty fantastic. That or make it easy to choose it as a 'preset' on the monitor itself.
> 
> 
> 
> The issue is that you'll get washed out colors this way for non-HDR content. There's seemingly a con for each 'solution', but not something I think that will have to be compromise on forever.


You have no clue what you are saying!! Please stop misinforming people. The monitor will auto-enable HDR if it detects an HDR signal; otherwise you are in SDR mode. It's all done automatically. You will not get washed-out colors because it will switch to SDR.


I just tested my friend's Series 8 Samsung 55-inch TV on an Xbox One X. This monitor is way superior in terms of lighting; the TV's lighting wasn't that bright at all and did not look as realistic as on my PG27UQ. Colours were more or less the same.


----------



## acmilangr

Is it only me who feels the HDR is TOO bright?


----------



## pez

kot0005 said:


> You have no clue what you are saying!! plz stop misinforming people. The monitor will auto enable HDR if it detects HDR signal, otherwise you are in SDR mode. Its all done automatically. You will not get washed out colors because it will switch to sdr..
> 
> 
> I just tested my friend's series 8 Samsung 55 inch Tv on Xbox one X, This monitor is way superior in terms of lighting, the Tv's lighting wasn't that bright at all and did not look as realistic as on my PG27UQ. Colours were more or less the same.


Ok, well to address two things:

We were talking about displays being explicitly set to HDR (i.e. not auto); HDR makes SDR content look washed out. I'm not talking about this monitor directly, but about why it's not smart to force HDR on versus auto.

Second, why are you bringing up a 2-year-old TV? This monitor is $2K and was released in 2018... it absolutely should have better HDR performance.


----------



## jesyjames

Has anyone had a chance to measure the input lag on this yet? I'm curious about its input lag with/without dimming enabled and, relatedly, SDR vs HDR.


----------



## kot0005

pez said:


> Ok, well to address two things:
> 
> We were talking about displays being explicitly set to HDR, (i.e. not auto) HDR makes SDR content look washed out. I'm not talking about this monitor directly, but addressing the issue of why it's not smart to force HDR on versus auto or other.
> 
> Second, why are you bringing up a 2 year old TV? This monitor is $2k and was released in 2018....it very well should have better HDR performance...


ahh right, yes - enabling HDR in Windows is not worth it.



I dunno; that TV is rated at 1000 nits too, but it wasn't as impressive... have TVs gotten better now? I saw some videos from SweClockers, and the blooming on one of the TVs they compared to the UQ was really bad.


----------



## acmilangr

Has anyone tried HDR/60fps video?
I get big stutters, like frame drops, as if my PC cannot handle it.

But my PC is fine: 8600K / 16 GB RAM / 1080 Ti.

This happens with VLC and MPC-HC.


----------



## CallsignVega

acmilangr said:


> is it only me that feels TOO bright the HDR ?


Make sure you are using 80 nits as the HDR white reference point. Anything else is wrong.


----------



## acmilangr

CallsignVega said:


> Make sure you are using 80 nits as the HDR white reference point. Anything else is wrong.


Yes. It is much better now. Thanks

How did you set the contrast?


----------



## pez

kot0005 said:


> ahh right, yes enabling HDR in windows is not worth it.
> 
> 
> 
> I dunno because that TV is rated at 1000nits too but not as impressive..have TV's gotten better now ? I saw some videos from Sweclockers and the blooming on one of the TV's they showed comparing to UQ was really bad.


I'm definitely not going to claim any expertise on HDR (or displays for that matter), but there are a lot of things that influence a TV's HDR performance: number of zones, peak brightness, etc. Monitors are usually held to a much higher standard than TVs, but TVs have been getting consistently better over the years.


----------



## Gary2015

Should I get this or the Acer?


----------



## Sichtwechsel86

Gary2015 said:


> Should I get this or the Acer?


i had the Acer X27 and had to send it back - due to serious backlight bleed, dead pixels, a very loud fan, etc...

now i have an Asus PG27UQ and it's better in every way - i would keep it - but i have a unit with dead pixels - so i have to return this one too...
but after i get my money back i will order it again - maybe the next unit will be without dead pixels...


----------



## bee144

Gary2015 said:


> Should I get this or the Acer?


I'm loving my PG27UQ, as long as ASUS follows through with their promise to fix crushing blacks in a firmware update later this year.


----------



## acmilangr

Sorry that I am asking again.
Has anyone tried playing 4K/HDR/60fps video? I haven't found a solution to my problem.


----------



## Sichtwechsel86

acmilangr said:


> Sorry that i am asking that again.
> Anyone tried play 4k/HDR/60fps video?i havent find solution to my problem


i only have one 4K HDR 60fps movie - and that is 'Billy Lynn'

no problem with DvDfab5 

newest windows and newest drivers installed

what 4K HDR 60fps videos do you have??


----------



## Glerox

To follow up on the 144 Hz HDR lower-fps issue, which prevents me from using that mode until a firmware/driver update:

I challenge somebody to find a visual difference between 8-bit RGB full @ 120 Hz and 10-bit RGB full @ 98 Hz.
I looked hard for one in BF1 and Far Cry 5 and could not find any.
There is no banding at all, even in 8-bit.
However, to test color volume I would need proper testing hardware.

I can't explain why, but if there is no visual difference, then I see no reason to be stuck at 98 Hz if you can do 120 Hz.

----------



## MistaSparkul

Glerox said:


> To follow up on the 144 Hz HDR lower-fps issue, which prevents me from using that mode until a firmware/driver update:
> 
> I challenge somebody to find a visual difference between 8-bit RGB full @ 120 Hz and 10-bit RGB full @ 98 Hz.
> I looked hard for one in BF1 and Far Cry 5 and could not find any.
> There is no banding at all, even in 8-bit.
> However, to test color volume I would need proper testing hardware.
> 
> I can't explain why, but if there is no visual difference, then I see no reason to be stuck at 98 Hz if you can do 120 Hz.


Perhaps the display is automatically switching from 8 bit to 10 bit when receiving an HDR signal?


----------



## Glerox

MistaSparkul said:


> Perhaps the display is automatically switching from 8 bit to 10 bit when receiving an HDR signal?


Nope - in the OSD you can see the display mode details, and at 120 Hz it's 8-bit RGB full because there is not enough bandwidth for 10-bit RGB full.


----------



## CallsignVega

The increased bit range would only be seen in HDR highlights. Did you check sunsets etc to see if you could see the difference?


----------



## Glerox

CallsignVega said:


> The increased bit range would only be seen in HDR highlights. Did you check sunsets etc to see if you could see the difference?


Yup, specifically the sun. I see really light banding, but it's the same between 8-bit and 10-bit RGB full.

I tried to post pictures, but for some reason the upload fails.


----------



## Glerox

You can test it yourself with this gradient pattern using the Windows Photos app: https://drive.google.com/file/d/0B68jIlCvW85gWFp0NVU***dTNFE/view
The top is 8-bit, the bottom is 10-bit.

In SDR, you can see banding in both parts (the bottom is worse), and choosing 8-bit vs 10-bit in NVCP doesn't change anything.
In HDR, the banding in the top part disappears. 8-bit vs 10-bit still doesn't change anything.

Each time I switch modes, I verify in the OSD that my monitor is receiving the intended signal; you also need to re-open the file each time.

So once again it seems that once HDR is activated, banding is reduced, and whether you're sending an 8-bit or 10-bit signal doesn't change anything...
I guess we'll have to wait for professional reviews.
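For anyone who can't grab the linked file, the step counts behind a pattern like that are easy to reproduce (a quick sketch of my own): the same gradient simply carries four times as many distinct levels at 10 bits as at 8, and that difference in step count is what these banding tests visualize.

```python
import numpy as np

# Quantize the same smooth 0..1 ramp at 8-bit and 10-bit precision and
# count how many distinct levels each version actually contains.
ramp = np.linspace(0.0, 1.0, 1 << 16)
levels_8 = np.unique(np.round(ramp * 255)).size
levels_10 = np.unique(np.round(ramp * 1023)).size
print(levels_8, levels_10)  # 256 1024
```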


----------



## kx11

there's an app in the Windows Store called HDRTest or something, you guys should check it out


----------



## Glerox

kx11 said:


> there's an app in windows store called HDRtest or something , you guy should check it out


Thanks! That was exactly the tool I was looking for! It's called DisplayHDR Test.
To my eye, there is still no difference in banding between HDR 8-bit and 10-bit.
There is light banding visible with both, but it stays the same.

Maybe the 8-bit+FRC is applied even if the signal is only 8-bit?

I'll have to try the test on my OLED to see whether light banding is still visible on a true 10-bit display.


----------



## kx11

Speaking of the FW update: Asus commented on an optional FW update coming later this year.



https://www.asus.com/us/support/FAQ/1036750


----------



## acmilangr

Sichtwechsel86 said:


> i only have one 4K HDR 60fps movie - and that is 'Billy Lynn'
> 
> no problem with DvDfab5
> 
> newest windows and newest drivers installed
> 
> what 4K HDR 60fps videos do you have??


Many from this site like chess 4k demo 
http://4kmedia.org


----------



## Sichtwechsel86

acmilangr said:


> Many from this site like chess 4k demo
> http://4kmedia.org


oh yeah... i know this site...

some of these videos use poor encoding, and some of them are meant to be used only on specific TVs -
VLC and MPC-HC, with their many filters, implemented codecs, etc., have a hard time playing back some of these files...

which video specifically do you have problems with?

i would try it in DvDfab5 with you -
which often plays back videos that VLC and MPC-HC struggle with...


----------



## acmilangr

Sichtwechsel86 said:


> oh yeah... i know this site...
> 
> some of these videos use poor coding and some of them are meant to be only used on specific TVs -
> VLC and MP-C with it's many filters, implemented codecs, etc... have a hard time to play back some of these files...
> 
> which video specificially do you have problems with?
> 
> i would try it on DvDfab5 with you -
> which often plays back videos VLC and MP-C are struggeling with...


Try chess 4k demo please


----------



## Sichtwechsel86

acmilangr said:


> Try chess 4k demo please


tried it - with and without HDR...

on VLC v3.0.3 i get some minor stutters - but not much and just at some scenes...

on DvDfab5 no stutters at all...

System:

Gaming notebook with

i7-7700HQ
16 GB DDR4
GTX 1060 connected to the PG27UQ via DP 1.4
256 GB SSD (SATA! not M.2 PCIe x4)

You seem to have a better system!
So i think the issues you are confronted with are based on software/drivers/codecs - not your hardware - the hardware is powerful enough!


----------



## acmilangr

Sichtwechsel86 said:


> tried it - with and without HDR...
> 
> on VLC v3.0.3 i get some minor stutters - but not much and just at some scenes...
> 
> on DvDfab5 no stutters at all...
> 
> System:
> 
> GamingNotebook with
> 
> i7 7700HQ
> 16Gb DDR4
> GTX 1060 connected to PG27UQ via DP1.4
> 256 Gb SSD (Sata! not m.2 pci-e 4x)
> 
> You seem to have a better system!
> So i think the issues you are confronted with are based on software/drivers/codecs - not your hardware - the hardware is powerful enough!


I'll give a try to DvDfab5

Edit: i tried it and it worked perfectly.


----------



## Glerox

So I tested DisplayHDR test (official VESA tool) on my LG OLED e6 (true 10bits) to compare it with the Asus PG27UQ (8bits + FRC) and I got interesting results.
All tests were done with RGB Full pipeline except the 10bits signal on the OLED (YCbCr 4:2:2)

In SDR mode sending a 8bits signal, the 8bits quantization pattern and 10bits quantization pattern are identical with a lot of banding.
In SDR mode sending a 10bits signal, the 10bits quantization pattern is way better (banding is really light).
In SDR mode it's pretty much the same banding on the OLED vs PG27UQ.

So everything good so far.
In HDR mode, it gets tricky,

In HDR mode sending a 8bits signal, the 10bits quantization pattern is way better (banding is really light).
In HDR mode sending a 10bits signal, the 10bits quantization pattern is way better (banding is really light).
In HDR mode it's pretty much the same banding on the OLED vs PG27UQ.

So basically, once HDR is activated, to my eye there is NO difference between 8-bit and 10-bit... neither on the OLED nor on the PG27UQ.
This would explain why I couldn't see any difference between 8-bit @ 120 Hz and 10-bit @ 98 Hz while gaming in HDR.

I checked many times that the signal was really only 8 bits in the OSD and in the app.

It's interesting I just can't explain why lol.


----------



## Sichtwechsel86

acmilangr said:


> Edit: i tried it and it worked perfectly.


good to know... 

i think PowerDVD xx ULTRA would have worked too...


----------



## acmilangr

Sichtwechsel86 said:


> good to know...
> 
> i think PowerDVD xx ULTRA would have worked too...


I tried PowerDVD 18 and it has a strange issue. It auto-recognizes HDR and the monitor goes into HDR mode, but the image is washed out. Even if I change it manually in the Windows 10 options, the result is the same.

Formatting now, installing a clean Windows 10.

Edit: after a fresh Windows installation, PowerDVD also works now.


----------



## CallsignVega

Glerox said:


> So I tested DisplayHDR test (official VESA tool) on my LG OLED e6 (true 10bits) to compare it with the Asus PG27UQ (8bits + FRC) and I got interesting results.
> All tests were done with RGB Full pipeline except the 10bits HDR signal on the OLED (YCbCr 4:2:2)
> 
> In SDR mode sending a 8bits signal, the 8bits quantization pattern and 10bits quantization pattern are identical with a lot of banding.
> In SDR mode sending a 10bits signal, the 10bits quantization pattern is way better (banding is really light).
> In SDR mode it's pretty much the same banding on the OLED vs PG27UQ.
> 
> So everything good so far.
> In HDR mode, it gets tricky,
> 
> In HDR mode sending a 8bits signal, the 10bits quantization pattern is way better (banding is really light).
> In HDR mode sending a 10bits signal, the 10bits quantization pattern is way better (banding is really light).
> In HDR mode it's pretty much the same banding on the OLED vs PG27UQ.
> 
> So basically, once HDR is activated, to my eye there is NO difference between 8-bit and 10-bit... neither on the OLED nor on the PG27UQ.
> This would explain why I couldn't see any difference between 8-bit @ 120 Hz and 10-bit @ 98 Hz while gaming in HDR.
> 
> I checked many times that the signal was really only 8 bits in the OSD and in the app.
> 
> It's interesting I just can't explain why lol.


That is interesting. I'm not sure how the panel being 8-bit+FRC plays into it. OLED panels are native 10-bit, but from what I've read from TFTCentral, you basically cannot tell the difference between 8-bit+FRC and real 10-bit. Maybe if TFTCentral gets their hands on one of these they would be able to explain it.

And if these monitors are sent 8-Bit color, is it automatically "up processing" it to 8-bit+FRC or does 8-bit+FRC only come into play if sent a 10-bit signal? That is one area my knowledge is lacking.


----------



## Glerox

CallsignVega said:


> And if these monitors are sent 8-Bit color, is it automatically "up processing" it to 8-bit+FRC or does 8-bit+FRC only come into play if sent a 10-bit signal? That is one area my knowledge is lacking.


I was wondering this but according to their page : 
http://www.tftcentral.co.uk/specs.htm#colour depth

"This FRC can be applied either on the panel side (8-bit + FRC panels) or on the monitor LUT/electronics side. Either way, the screen simulates a larger colour depth and does not offer a 'true' 10-bit support. You can also only make use of this 10-bit support if you have a full end-to-end 10-bit workflow, including a supporting software, graphics card and operating system."

Which seems to mean the signal sent from the GPU has to be 10bits to benefit from FRC.

I don't have twitter but it would be a good question to ask them.
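To CallsignVega's question: FRC is temporal dithering. The panel alternates a pixel between two neighbouring 8-bit levels across frames so that the average lands on an in-between 10-bit level. A minimal sketch of the idea (my own toy model; real panels use more elaborate spatio-temporal patterns):

```python
# Toy model of 8-bit + FRC: approximate one 10-bit level by cycling
# through `cycle` frames of adjacent 8-bit levels.
def frc_frames(code10, cycle=4):
    base, frac = divmod(code10, 4)  # 4 FRC sub-steps per 8-bit step
    return [base + 1 if i < frac else base for i in range(cycle)]

frames = frc_frames(514)  # 10-bit 514 sits between 8-bit 128 and 129
print(frames, sum(frames) / len(frames))  # [129, 129, 128, 128] 128.5
```

In this toy model an 8-bit-only input (a 10-bit code that is a multiple of 4) never flickers, which lines up with TFTCentral's point that FRC only buys anything when the incoming signal actually carries the extra bits.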


----------



## CallsignVega

Ya HDR changes the game though, since HDR is a true 10-bit workflow. So I wonder how the panel/monitor meshes with HDR if the driver is set to 8 or 10 bit.


----------



## kx11

Finally, VelocityMicro got the monitor in stock and they'll ship mine today; I should get it 3 days from now.


----------



## MiniZaid

guys i can't get BF1 HDR working. Everything seems washed out.
Going to try other games
here are my settings

http://tinypic.com/view.php?pic=34dq39d&s=9#.Wz7Hx9JKguU


----------



## Glerox

MiniZaid said:


> guys i can't get BF1 HDR working. Everything seems washed out.
> Going to try other games
> here are my settings
> 
> http://tinypic.com/view.php?pic=34dq39d&s=9#.Wz7Hx9JKguU


Is HDR ON in the monitor OSD?
Can you get HDR working on desktop?
What are your Nvidia control panel settings?
You don't need to turn on HDR in desktop settings, BF1 should detect and turn HDR on automatically.


----------



## MiniZaid

Glerox said:


> Is HDR ON in the monitor OSD?
> Can you get HDR working on desktop?
> What are your Nvidia control panel settings?
> You don't need to turn on HDR in desktop settings, BF1 should detect and turn HDR on automatically.


oh i don't? But it got automatically enabled once i did 10bit and 98hz. I guess i can disable it.
I accidentally scrolled up when snipping the image.
Nvidia settings is 98hz, 10bit, RGB, full range.

monitor settings in the infomation states RGB444, 10bit, full range, HDR-ST2084

EDIT: forgot to update drivers. I'll do that. Didn't bother since I didn't play any of the games that came in the last few months.
EDIT2: so it worked but doesn't look that impressive over non HDR. I'll try like Far Cry 5 which I heard has better HDR implementation.
EDIT3: wow far cry 5 looks amazing. Looking forward for Battlefield V, at least the graphics area...


So how should I test back light bleed? I used ledr.com and full screened. I think i got lucky, there's no backlight bleed? Like none. Much better than the Asus PG279Q (one corner had some)
Although I did get the second batch.


----------



## Glerox

MiniZaid said:


> EDIT2: so it worked but doesn't look that impressive over non HDR. I'll try like Far Cry 5 which I heard has better HDR implementation.
> 
> So how should I test back light bleed? I used ledr.com and full screened. I think i got lucky, there's no backlight bleed? Like none. Much better than the Asus PG279Q (one corner had some)
> Although I did get the second batch.


BF1's HDR is really realistic, so when it's cloudy it looks more like SDR - everything is less bright, like in real life.
Try a really sunny map and you'll see.

Yup, BLB is really minimal thanks to FALD; mine has a little bit at the top left corner but I don't mind.

You have the second batch? You're lucky - maybe it means you don't have black crush in 144 Hz SDR? Can't wait to play PUBG at 144 Hz.


----------



## MiniZaid

Glerox said:


> BF1's HDR is really realistic, so when it's cloudy it looks more like SDR - everything is less bright, like in real life.
> Try a really sunny map and you'll see.
> 
> Yup, BLB is really minimal thanks to FALD; mine has a little bit at the top left corner but I don't mind.
> 
> You have the second batch? You're lucky - maybe it means you don't have black crush in 144 Hz SDR? Can't wait to play PUBG at 144 Hz.


I tried the Rupture map. The sunsets look good, but the grass and flowers are not eye-popping.


I think I do; mine got delayed and I got it today.
So how do I make the black crush happen? I just want to see which batch I have.

But 144 Hz looks so weird on the Windows desktop. The colors look like garbage, like a monitor from 10 years ago.


----------



## Glerox

I just uploaded my two cents about the monitor. In the video I show how to test for black crush and also demonstrate the difference between 8bits and 10bits as discussed earlier.


----------



## glenster

Review of the ASUS ROG Swift PG27UQ featuring Ryan Shrout of pcper.com
TWiT Netcast Network


----------



## kot0005

MiniZaid said:


> oh i don't? But it got automatically enabled once i did 10bit and 98hz. I guess i can disable it.
> I accidentally scrolled up when snipping the image.
> Nvidia settings is 98hz, 10bit, RGB, full range.
> 
> monitor settings in the infomation states RGB444, 10bit, full range, HDR-ST2084
> 
> EDIT: forgot to update drivers. I'll do that. Didn't bother since I didn't play any of the games that came in the last few months.
> EDIT2: so it worked but doesn't look that impressive over non HDR. I'll try like Far Cry 5 which I heard has better HDR implementation.
> EDIT3: wow far cry 5 looks amazing. Looking forward for Battlefield V, at least the graphics area...
> 
> 
> So how should I test back light bleed? I used ledr.com and full screened. I think i got lucky, there's no backlight bleed? Like none. Much better than the Asus PG279Q (one corner had some)
> Although I did get the second batch.



SWBF2 is amazing, and it's cheap now.


----------



## MiniZaid

Glerox said:


> I just uploaded my two cents about the monitor. In the video I show how to test for black crush and also demonstrate the difference between 8bits and 10bits as discussed earlier.


Yeah... i have the black crush. oh well, not really going to bother with 144hz


----------



## Monstieur

Glerox said:


> So basically, once HDR is activated, to my eye there is NO difference between 8-bit and 10-bit... neither on the OLED nor on the PG27UQ.
> This would explain why I couldn't see any difference between 8-bit @ 120 Hz and 10-bit @ 98 Hz while gaming in HDR.
> 
> I checked many times that the signal was really only 8 bits in the OSD and in the app.
> 
> It's interesting I just can't explain why lol.


I've explained it before - as long as the application / game renders to a 10-bit surface, the NVIDIA driver will automatically perform 10-bit to 8-bit dithering. There will not be any visible banding. An 8-bit signal to the display is sufficient for HDR. When you're doing those banding tests, you need to open the sample images in a 10-bit application or the test is invalid. I recommend using MPC-HC with madVR. You must configure madVR and tell it the monitor is 10-bit so that madVR performs its own high quality dithering and outputs 8-bit which avoids the NVIDIA driver dithering.

Now if the application incorrectly renders in 8-bit mode because the monitor is also in 8-bit mode, you will see banding. The only reason to use 10-bit YCbCr422 is when you want to directly send a UHD Blu-ray video to the monitor without any processing, because UHD Blu-rays are natively 10-bit YCbCr422.


----------



## Monstieur

CallsignVega said:


> Ya HDR changes the game though, since HDR is a true 10-bit workflow. So I wonder how the panel/monitor meshes with HDR if the driver is set to 8 or 10 bit.


The game can run in 8-bit mode too if the renderer performs dithering - then you get HDR with a true 8-bit workflow and no banding. If the game outputs 10-bit, then dithering has to occur somewhere down the chain if the monitor is in 8-bit mode.


----------



## Glerox

Monstieur said:


> I've explained it before - as long as the application / game renders to a 10-bit surface, the NVIDIA driver will automatically perform 10-bit to 8-bit dithering. There will not be any visible banding. An 8-bit signal to the display is sufficient for HDR. When you're doing those banding tests, you need to open the sample images in a 10-bit application or the test is invalid. I recommend using MPC-HC with madVR. You must configure madVR and tell it the monitor is 10-bit so that madVR performs its own high quality dithering and outputs 8-bit which avoids the NVIDIA driver dithering.
> 
> Now if the application incorrectly renders in 8-bit mode because the monitor is also in 8-bit mode, you will see banding. The only reason to use 10-bit YCbCr422 is when you want to directly send a UHD Blu-ray video to the monitor without any processing, because UHD Blu-rays are natively 10-bit YCbCr422.


Thanks Monstieur! The DisplayHDR test is a 10-bit application; as you can see, it works when comparing 8-bit to 10-bit in SDR.

So if I understand correctly, when playing in HDR the game renders in 10-bit and the Nvidia driver does the dithering before sending it to the monitor? It's hard for me to understand how an 8-bit signal can simulate a 10-bit signal before being sent to the monitor.

So, as I observed, the 8-bit signal is as good as 10-bit for HDR gaming? Then there's no reason to use the 98 Hz mode over the 120 Hz mode.


----------



## Sichtwechsel86

Glerox said:


> Thanks Monstieur! The DisplayHDR test is a 10bits application as you can see it works when comparing 8bits to 10bits in SDR.
> 
> So if I understand well, when playing in HDR, the game renders in 10bits and the Nvidia driver do the dithering before sending it to the monitor? It's hard to me to understand how a 8bits signal can simulate a 10bits signal before being sent to the monitor.
> 
> So as I observed, the 8bits signal is as good as the 10bits for HDR gaming? So no reason to use the 98hz mode over the 120Hz mode.


i tried DvDfab5 with UHD movies...

In 'The Revenant' there is a visible difference between 8-bit HDR and 10-bit HDR -
in 'Valerian' too...

with the DisplayHDR test i get the same results as everybody else...

maybe someone can explain why this is...


----------



## Glerox

My guess is DVDFab 5 does not render to a 10-bit surface when it detects an 8-bit display, unlike DisplayHDR and games.


----------



## acmilangr

It seems strange to me that you like Far Cry 5 in HDR.
To my eyes it just has much more brightness everywhere, and that is not what HDR technology is supposed to do.


----------



## MiniZaid

acmilangr said:


> It seems strange on me that you like far cry 5 on HDR.
> On my eyes it has Just much more brightness on evereywhere. And this is not something supposd to do HDR technology.


It's just that Battlefield 1 doesn't look too different. The skies and sunsets look good.




Also, for COD WW2 I have to enable HDR in Windows for the option to be changeable in-game. For Far Cry 5 and Battlefield 1, I didn't need to do that.


----------



## acmilangr

MiniZaid said:


> and also for cod ww2, i have to enable hdr on windows in order for the option to be changeable in game. For far cry 5 and battlefield 1, i didn't need to do that.


Does WW2 support HDR? Didn't find any option.


----------



## CallsignVega

HDR color volume on Windows 10 is broken. That is why HDR looks washed out. madVR and VLC correct this for videos, but you must do it manually for games.

https://vibrancegui.com/

Search for your game.exe, set it to 70% and re-run the game in HDR. Tell me what you think. This trick only works in HDR mode - doing it in SDR will totally over-saturate everything.
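For context, a vibrance slider of this kind is, roughly speaking, a chroma scale around each pixel's luma. The sketch below is a generic model of that operation, not vibranceGUI's or the driver's actual algorithm; the Rec.709 luma weights and the hard clipping are assumptions for illustration:

```python
def boost_saturation(rgb, factor):
    """Scale chroma around Rec.709 luma; factor > 1 saturates, < 1 desaturates.

    rgb: (r, g, b) floats in 0..1. Channels are clipped back into range.
    """
    r, g, b = rgb
    # Rec.709 luma weights (assumed; a driver may use a different space)
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(min(max(luma + factor * (c - luma), 0.0), 1.0)
                 for c in (r, g, b))
```

A factor above 1 pushes each channel away from the luma (greys are left untouched, saturated colors eventually clip), which matches the observed behaviour: a washed-out HDR image regains saturation, while the same boost applied in SDR over-saturates.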


----------



## Glerox

CallsignVega said:


> HDR color volume on Windows 10 is broken. That is why HDR looks washed out. MadVR and VLC corrects this for videos, but you must manually do it for games.
> 
> https://vibrancegui.com/
> 
> Search for your game.exe, set it to 70% and re-run the game in HDR. Tell me what you think. This trick only works in HDR mode, doing it in SDR will totally over-saturate everything.


Isn't that a way to just artificially boost colors?

How do you know that the color volume is wrong? maybe it just depends how the game implemented HDR.


----------



## acmilangr

Has anyone tried Hitman? Mine crashes when I enable HDR.

Only Mass Effect Andromeda and Far Cry 5 play successfully in HDR.

Andromeda was superior, unlike Far Cry 5, which just had too much brightness and blinded my eyes.

I also tried Assassin's Creed Origins and it doesn't work - it's very dark with bad colors. Anyone tried it?


----------



## CallsignVega

Glerox said:


> Isn't that a way to just artificially boost colors?
> 
> How do you know that the color volume is wrong? maybe it just depends how the game implemented HDR.


In every single HDR game I tested, the colors were washed out and lost saturation versus SDR. That is not supposed to happen. The slider works differently in HDR - it doesn't clip or blow out colors when you adjust it in HDR versus SDR.

Do an A-B-C comparison of the same game in the same spot in SDR, in HDR with the slider at 50%, and then in HDR with the slider at 70%.


----------



## Sichtwechsel86

CallsignVega said:


> Every single HDR game I tested the colors were washed out and lost saturation versus SDR. That is not suppose to happen. The slider works differently in HDR, it doesn't clip or blowout colors when you do it in HDR versus SDR.
> 
> Do a A-B-C comparison of the same game in the same spot in SDR, HDR with the slider at 50% and then HDR with the slider at 70%.


I have the slider at 1 (minimum) and everything looks fine in HDR!

But of course I only activate HDR via Windows when needed (for playing HDR movies, YouTube HDR and Netflix HDR - and when a certain game refuses to do HDR unless it's switched on in Windows 10).


----------



## MiniZaid

CallsignVega said:


> Every single HDR game I tested the colors were washed out and lost saturation versus SDR. That is not suppose to happen. The slider works differently in HDR, it doesn't clip or blowout colors when you do it in HDR versus SDR.
> 
> Do a A-B-C comparison of the same game in the same spot in SDR, HDR with the slider at 50% and then HDR with the slider at 70%.


Even Far Cry 5? I thought that was pretty saturated.







acmilangr said:


> Does WW2 support HDR? didnt find any option


Yes - go to video settings, scroll to the bottom and click on advanced settings.


----------



## kot0005

CallsignVega said:


> Every single HDR game I tested the colors were washed out and lost saturation versus SDR. That is not suppose to happen. The slider works differently in HDR, it doesn't clip or blowout colors when you do it in HDR versus SDR.
> 
> Do a A-B-C comparison of the same game in the same spot in SDR, HDR with the slider at 50% and then HDR with the slider at 70%.


Do I set the application to 70% with HDR on in Windows? I tried it a while back with HDR off in Windows and it broke my colors - I had to reinstall the NVIDIA drivers.


----------



## Monstieur

CallsignVega said:


> HDR color volume on Windows 10 is broken. That is why HDR looks washed out. MadVR and VLC corrects this for videos, but you must manually do it for games.
> 
> https://vibrancegui.com/
> 
> Search for your game.exe, set it to 70% and re-run the game in HDR. Tell me what you think. This trick only works in HDR mode, doing it in SDR will totally over-saturate everything.


Windows has nothing to do with the appearance of HDR in games. HDR10 is an absolute standard - it has an absolute brightness for every scene that should look the same on every display (up to the display's maximum brightness, and if dynamic contrast is disabled), and a fixed PQ gamma curve. With a paper white target of 100 nits as per the standard, most scenes will look dim because everyone incorrectly sets their monitor to 250+ nits for SDR content when it should only be 100 nits.

A game that supports HDR10 will always target the PQ curve and BT.2020 colour space. Desktop applications will look incorrect because they still render with Gamma 2.2 and the wrong colour space.
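Monstieur's point that HDR10 is an absolute standard can be made concrete with the SMPTE ST 2084 (PQ) transfer function itself. A minimal sketch, using the exact rational constants from the spec:

```python
# SMPTE ST 2084 (PQ) constants, exact rational forms from the spec.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in cd/m^2 (0..10000) -> PQ signal (0..1)."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def pq_decode(signal):
    """PQ signal (0..1) -> absolute luminance in cd/m^2."""
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

Note that `pq_encode(100)` comes out around 0.51: the 100-nit paper-white level already sits near the middle of the signal range, with the upper half reserved for highlights up to 10,000 nits. That is why correctly mastered HDR scenes can look dim next to a 250-nit SDR desktop.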


----------



## bee144

Update regarding the BF1 scaling issues with G-Sync and SLI: issue was escalated to tier two. They are going to try and reproduce the issue. Might take some time for their HDR monitor to arrive. Sitting tight in the meantime.


----------



## l88bastar

bee144 said:


> Update regarding the BF1 scaling issues with G-Sync and SLI: issue was escalated to tier two. They are going to try and reproduce the issue. Might take some time for their HDR monitor to arrive. Sitting tight in the meantime.


----------



## Glerox

Monstieur said:


> Windows has nothing to do with the appearance of HDR in games. HDR10 is an absolute standard - it has an absolute brightness for every scene that should look the same on every display (up to the display's maximum brightness, and if dynamic contrast is disabled), and a fixed PQ gamma curve. With a paper white target of 100 nits as per the standard, most scenes will look dim because everyone incorrectly sets their monitor to 250+ nits for SDR content when it should only be 100 nits.
> 
> A game that support HDR10 will always target the PQ curve and BT.2020 colour space. Desktop applications will look incorrect because they still render with Gamma 2.2 and the wrong colour space.


Amen!


----------



## CallsignVega

Monstieur said:


> Windows has nothing to do with the appearance of HDR in games. HDR10 is an absolute standard - it has an absolute brightness for every scene that should look the same on every display (up to the display's maximum brightness, and if dynamic contrast is disabled), and a fixed PQ gamma curve. With a paper white target of 100 nits as per the standard, most scenes will look dim because everyone incorrectly sets their monitor to 250+ nits for SDR content when it should only be 100 nits.
> 
> A game that support HDR10 will always target the PQ curve and BT.2020 colour space. Desktop applications will look incorrect because they still render with Gamma 2.2 and the wrong colour space.


I am not talking about desktop applications. I am talking about a game running in exclusive full-screen mode with HDR triggered by the driver. Almost all settings are disabled on the monitor in HDR mode.

SDR picture:




HDR picture:




The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes. 

Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red. 

I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).


----------



## Glerox

I'll try it!


----------



## kx11

CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode HDR trigger by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> 
> 
> HDR picture:
> 
> 
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).





If you're using a phone camera, make sure HDR is enabled when you take the photo.


----------



## MiniZaid


CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode HDR trigger by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> 
> HDR picture:
> 
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).


That looks like HDR is not working. Have you tried enabling HDR in Windows 10?
That might solve the issue, although I don't need to enable HDR in Windows 10 to get HDR working in BF1 or Far Cry 5.


----------



## Sichtwechsel86

CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode HDR trigger by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> 
> 
> 
> HDR picture:
> 
> 
> 
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).


That is very strange...
I don't experience this problem with my gear...

If I switch on HDR, it looks more colorful, brighter and not washed out -
the whole opposite...
After playing BF1 with HDR enabled, the same game in SDR looks dull and lifeless... (but in any case normal, given what I expect from SDR!)

When I had the Acer X27, I did not experience this problem either...

---

Can someone please tell me what the option REFERENCE WHITE does?

It seems to raise or lower the exposure level...

It comes at a standard level of 80 nits - but I thought the standard exposure for movies is 100 nits...

How did you set your REFERENCE WHITE?


----------



## kot0005

CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode HDR trigger by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> 
> 
> 
> HDR picture:
> 
> 
> 
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).



OK, I can see why you say washed out. What map is this so I can test it? I'm pretty sure my HDR colors look like your SDR photo.


----------



## kot0005

http://imgur.com/a/r36UN0i


----------



## acmilangr

Sichtwechsel86 said:


> that is very strange...
> i don't experience this problem with my gear...
> 
> if switching on HDR - it looks more colorful, brighter and not washed out -
> the whole opposit...
> after playing BF1 with HDR enabled - the same game in SDR looks dull and lifeless... (but in any case normal, to what i expect from SDR!)
> 
> when i had the Acer X27 - i did not experience this problem either...
> 
> ---
> 
> Can someone please tell me what the option REFERENCE WHITE does?
> 
> It seems to raise or lower the exposure level...
> 
> it comes at a standardlevel of 80 nits - but i thought the standard exposure for movies is 100 nits...
> 
> How did you set your REFERENCE WHITE?


Just leave it at 80.


----------



## kot0005

http://imgur.com/a/etBg5Il

So yeah, my leaf color in HDR and SDR is pretty much the same; the only thing that changes is the lighting. In SDR the reflections are dull, in HDR they look realistic. Took these with S8 HDR on.


----------



## badjz

kot0005 said:


> http://imgur.com/a/etBg5Il
> 
> so yeah my leaf color in hdr and sdr is pretty much the same, only thing that changes is the lighting. In sdr reflections r dull, in hdr they look realistic. Took these with s8 hdr on.


Yes, my results are consistent with yours:

https://imgur.com/a/PTtTvKt

Note the dramatic difference in Destiny 2. We need some more HDR content; I'm getting bored with the current catalogue.


----------



## kot0005

PS4 Pro games - here you can tell the HDR colors are a bit more vibrant, though obviously missing out on the lighting effects again here..

https://imgur.com/a/jxexahY

Using an 8700K at 5.0 GHz and a 1080 Ti which boosts to 2025 MHz for my PC.



badjz said:


> Yes, my results are consistent with yours;
> 
> https://imgur.com/a/PTtTvKt
> 
> Note the dramatic difference in destiny 2. We need some more HDR content, I’m getting bored with the current catalogue.


Yes, D2's colors are way more saturated than BF1/BF2's in HDR.


----------



## deadchip12

kot0005 said:


> PS4 pro games, here you can tell, HDR colors r a bit vibrant obviously missing out on the lighting effects again here..
> 
> https://imgur.com/a/jxexahY
> 
> using a 8700k at 5.0GHZ and 1080Ti which boosts to 2025Mhz for my PC
> 
> 
> 
> badjz said:
> 
> 
> 
> Yes, my results are consistent with yours;
> 
> https://imgur.com/a/PTtTvKt
> 
> Note the dramatic difference in destiny 2. We need some more HDR content, I’m getting bored with the current catalogue.
> 
> 
> 
> yes D2 colors are way more saturated than BF1/BF2 in HDR.

Damn PS4 Pro games seem to look fantastic on this monitor.

But why the hell does it say 12-bit in HDR lol? Also, is there a reason why you chose Racing mode in the OSD?


----------



## kx11

The PS4 Pro has two types of HDR output, RGB and YUV422 - maybe enabling RGB makes it think it's running 12-bit.




The Xbox One X has had 12-bit depth for a while now, and you can enable it manually.


----------



## acmilangr

What option do you all use? RGB or 4:4:4?


----------



## Sichtwechsel86

deadchip12 said:


> Also, is there a reason why you choose racing mode in the osd?


I thought Racing Mode was the default on this monitor...

what would you recommend?

And what settings are you using within your preferred mode?


----------



## kx11

acmilangr said:


> What option do you all use? RGB or 4:4:4?





Usually I go for RGB.


----------



## acmilangr

Go and download the "LG Cymatic Jazz HDR 4K Demo". It is one of the most impressive HDR videos.

http://4kmedia.org/lg-cymatic-jazz-hdr-hlg-uhd-4k-demo/


----------



## Bloodmosher

Bloodmosher said:


> My PG279Q is far better for desktop use - it is brighter in SDR; the only way I can get the PG27UQ to have whites as bright is to turn on HDR, but that breaks Chrome - it shows gray unless you disable hardware acceleration, which makes Chrome unusable. And the white consistency issue is starting to drive me crazy; I've reached out to ASUS to see if they'll swap this thing. If not, I may return it. It's a tough one - it is sooo good for BFV, but I was really hoping to have my center monitor be "the" one for both desktop and games.


Well, mine went back today. I'm going to try another unit. The white consistency problem on the desktop was driving me mad: the top half of email/web pages was white, the lower half noticeably yellow. For $2K I should be able to read email as well as play HDR games.


----------



## acmilangr

kx11 said:


> usually i go for RGB


I prefer that too, because only with RGB is the output dynamic range full rather than limited.

Or do I have some issue? What is your option when you choose 4:4:4? Mine is limited only.

Another question for everyone: in SDR mode I don't like the white color - it is not really white. My previous monitor was better.

Have you found any way in the monitor settings to improve it?


----------



## Bloodmosher

acmilangr said:


> I prefer that also becouse only with RGB is the output dynamic range full and not limited.
> 
> Or do i have some issue? What is your option when you choose 4:4:4?mine is limited only.
> 
> Another question to everyone. In SDR Mode i dont like the White color.it is not really White. My previous monitor had better.
> 
> Have you find any way on monitor settings to improve it?


Under color you can find the color temp setting, use User Mode instead of normal, warm, or cool. On mine I ended up using R:46,G:46,B:100 to match the "white" of my other monitors. I also used Brightness of 100 and Contrast of 50.


----------



## kot0005

deadchip12 said:


> Damn PS4 Pro games seem to look fantastic on this monitor.
> 
> But why the hell it says 12 bits in HDR lol? Also, is there a reason why you choose racing mode in the osd ?


I have no idea - it's probably what PS4 games use, so the monitor is getting a 12-bit signal. Racing mode is the one that looks normal to me. I can't use sRGB mode because the brightness is way too high and you can't adjust brightness in that mode.



kx11 said:


> PS4 pro got 2 types of HDR , RGB and yuv422 , maybe enabling RGB makes it think it's running 12 bit
> 
> Xbox one X had 12 bit depth for a while now and you can enable it manually


I can't select YUV422 mode with the monitor; it's greyed out.


----------



## deadchip12

kot0005 said:


> I cant use sRGB mode because the brightness is way too much and you cant adjust brightness in this mode..


I thought brightness was always at max in HDR?


----------



## acmilangr

Bloodmosher said:


> Under color you can find the color temp setting, use User Mode instead of normal, warm, or cool. On mine I ended up using R:46,G:46,B:100 to match the "white" of my other monitors. I also used Brightness of 100 and Contrast of 50.


Thanks, I'll give it a try.
But with blue at 100, are the colors correct?


----------



## kot0005

deadchip12 said:


> I thought brightness is always max in HDR?


It's for SDR usage.


----------



## deadchip12

Glerox said:


> I just uploaded my two cents about the monitor. In the video I show how to test for black crush and also demonstrate the difference between 8bits and 10bits as discussed earlier.
> 
> https://www.youtube.com/watch?v=HRepgcXxeaw&feature=youtu.be


Hey Glerox, want to ask you a question.

If I pause the video at 13:56, I see no blooming around the white text against the black background (the top picture). But a second later at 13:57, after the camera adjusts its exposure I assume, the blooming is very apparent (the bottom picture).

Which is closer to what you see in real life? I'd guess something in the middle. It would be great if you could manually adjust the exposure and take a picture that reflects the correct amount of blooming.


----------



## acmilangr

I tried my best to get close to a true white color. I tried R:46, G:46, B:100 as Bloodmosher suggested, but it was too blue.

So my settings are these:
R:66, G:76, B:100

If you want, try them and tell me if you like them.

-------------------------------------------------

What about gamma? I think 2.6 is the best.


----------



## acmilangr

What is going on here? Was I blind, or does my monitor have much more blooming now?
Even in SDR I see too much blooming.
Could you please set the Windows background to black and drag the mouse to the right side of the monitor? Do you have blooming (in SDR)? Please check, everyone.


----------



## Glerox

deadchip12 said:


> Hey Glerox, want to ask you a question.
> 
> If I pause the video at 13:56, I see blooming around the white text against black background (the top picture). But a second later, after the camera adjusts its exposure I assume, at 13:57, the blooming is very apparent (the bottom picture)
> 
> What is closer to what you see in real? I guess sth in the middle. It would be great if you can manually adjust the exposure and take a picture that reflects the correct amount of blooming.


It's closer to the one showing no blooming. I tried to take some pictures with adjusted exposure, but I don't know why I just can't upload pictures to the forum... it's not working when I drag and drop files... any idea why?



CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode HDR trigger by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).


Vega, I tested the SDR/HDR difference on multiple maps in BF1 and the colors don't look washed out in HDR. In fact, they look exactly the same in most cases, and sometimes they even look more realistic.
I don't use any vibrance or NVCP option (it's all default color settings) in SDR or in HDR.

Don't know why yours are like that :S


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> 
> Hey Glerox, want to ask you a question.
> 
> If I pause the video at 13:56, I see blooming around the white text against black background (the top picture). But a second later, after the camera adjusts its exposure I assume, at 13:57, the blooming is very apparent (the bottom picture)
> 
> What is closer to what you see in real? I guess sth in the middle. It would be great if you can manually adjust the exposure and take a picture that reflects the correct amount of blooming.
> 
> 
> 
> It's closer to the one no showing no blooming. I tried to take some pictures with adjusted exposure but I don't know why I just can't upload picture to the forum... it's not working when I drag and drop files... any idea why?

Try posting to a site like imgur and paste the link here?


----------



## acmilangr

I don't know what the hell happened, but now I see too much blooming in SDR and HDR. I don't remember it being so bad before.

Look at the image of the desktop (SDR): in the bottom-right corner there's blooming because of the icon.

Has anyone tried a black wallpaper?

https://imgur.com/a/sAsfVr8


----------



## deadchip12

acmilangr said:


> I dont know What the hell happened but now i See too blooming on SDR and HDR. I dont remember to was So bad Before.
> 
> Look on image on desktop (SDR) on right bottom corner blooming becouse of the icon.
> 
> Anyone tried on Black wallpaper?
> 
> https://imgur.com/a/sAsfVr8


Does it look exactly like that in real life? They look like extremely overexposed pics. But still, really bad.


----------



## acmilangr

deadchip12 said:


> It looks exactly like that in real? Looks like extremely overexposured pics. But still, really bad


It is actually about 30% less than in the photo.


----------



## acmilangr

https://www.youtube.com/watch?time_continue=2&v=u8iIZCDizkc

Skip to 18:49 -
this is actually how it is on mine, exactly like that.

I think the monitor was always like this, but my eyes stopped noticing it and got stuck on the amazing brightness and colors that HDR produces.


----------



## deadchip12

acmilangr said:


> https://www.youtube.com/watch?time_continue=2&v=u8iIZCDizkc
> 
> spot on 18:49
> actually this is how it is on mine. exactly like that.
> 
> i think it was allready the monitor like this. but my eyes forgot to see that and stucked on amazing brightness and colors that HDR produce


Stuff like this really makes me consider going for a Sony XE93 instead of these monitors. The handling of blooming looks to be much better on that TV.


----------



## acmilangr

deadchip12 said:


> Stuffs like this really make me consider going for a Sony XE93 instead of these monitors. The handling if blooming looks to be much better on that tv.


If you don't need 4K/120 Hz and G-Sync, then why not.


----------



## deadchip12

acmilangr said:


> deadchip12 said:
> 
> 
> 
> Stuffs like this really make me consider going for a Sony XE93 instead of these monitors. The handling if blooming looks to be much better on that tv.
> 
> 
> 
> if you dont need 4k/120hz,G-sync then why not

Well, I thought the ASUS's 384 local dimming zones would be better than the Sony's 60 edge-lit zones.


----------



## acmilangr

Well, now I understand why I thought it was worse yesterday: because I tested in a very dark room.

If there is light in the room, the blooming is much less noticeable.


----------



## acmilangr

deadchip12 said:


> Well i thought asus 384 local dimming zones is better than sony edge lit 60 zones


I think it has to do with the algorithm. In these situations (like stars at night) it would be better to lower the brightness of those local LEDs, or to light more of them over a bigger distance.


----------



## kot0005

Glerox said:


> It's closer to the one no showing no blooming. I tried to take some pictures with adjusted exposure but I don't know why I just can't upload picture to the forum... it's not working when I drag and drop files... any idea why?
> 
> 
> 
> Vega I tested the SDR/HDR difference in multiples maps in BF1 and the colors don't look washed out in HDR. In fact, they look exactly the same in most cases and sometimes they even look more realistic.
> I don't use any vibrance or NVCP option (it's all default colors settings) in SDR neither in HDR.
> 
> Don't know why yours are like that :S


The OCN admins screwed up when they upgraded the site..


----------



## acmilangr

My next step is to make my monitor glossy!


----------



## Sichtwechsel86

deadchip12 said:


> Stuffs like this really make me consider going for a Sony XE93 instead of these monitors. The handling if blooming looks to be much better on that tv.


Just order it,
check it out for yourself, and keep it or send it back...

You will not have any option other than to inspect it for yourself...

BLB-wise my PG27UQ was nearly perfect - haloing was visible (especially in Star Wars Episode 8 and the chess demo)... but it was okay-ish -
I just sent it back because of the dead pixels my panel had...


----------



## deadchip12

Sichtwechsel86 said:


> deadchip12 said:
> 
> 
> 
> Stuffs like this really make me consider going for a Sony XE93 instead of these monitors. The handling if blooming looks to be much better on that tv.
> 
> 
> 
> just order it,
> check it out for yourself and keep it or send it back...
> 
> you will not have any other option than to inspect it for yourself...
> 
> BLB wise my PG27UQ was nearly perfect - haloing was visible (especially in Star Wars Episode 8 and Chess Demo)... but it was okay-ish -
> i just sent it back because of the dead pixels my panel had...

Yes, I will order the PG27UQ and return it if it sucks. The thing is I cannot do the same for the XE93, so I cannot compare them. If the XE93 gives higher image quality, why should I keep the PG27UQ? I don't need a high refresh rate; a single GPU cannot push high fps anyway. G-Sync? Is it worth paying an extra $1000 for G-Sync and a few ms lower input lag but inferior image quality? Most likely not.


----------



## Glerox

deadchip12 said:


> Try posting to a site like imgur and paste the link here?


There you go. I tried to make it look like it does in real life.

https://imgur.com/a/M2tQDTB


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> 
> Try posting to a site like imgur and paste the link here?
> 
> 
> 
> There you go. I tried to make it look how it is in real life.
> 
> https://imgur.com/a/M2tQDTB

That low brightness blooming looks good. Max brightness blooming is terrible though. Thanks for the pics.


----------



## kot0005

Time to start adding people to Ignore list..


----------



## badjz

Well my monitor might be busted. Not a hiccup over the last 3 weeks, working beautifully, & all of sudden, HDR has stopped working. I’m getting tearing when HDR is on in games, have never experienced this before. 
Despite the notification coming on, it is definitely not kicking in, no change v SDR. Have tried reinstalling drivers, but no cigar. Anyone have the same experience?


----------



## kot0005

badjz said:


> Well my monitor might be busted. Not a hiccup over the last 3 weeks, working beautifully, & all of sudden, HDR has stopped working. I’m getting tearing when HDR is on in games, have never experienced this before.
> Despite the notification coming on, it is definitely not kicking in, no change v SDR. Have tried reinstalling drivers, but no cigar. Anyone have the same experience?



Is your refresh rate set to 120Hz? If your frame rate goes above 120fps you will get tearing; that's how G-Sync works. You need to set a cap so your GPU doesn't push more than 116/117 fps. This isn't an issue at 1440p because the GPU is always maxing out the frames to the monitor. Read up on G-Sync.

You need to turn on V-sync in NVCP but off in-game. That way V-sync is only engaged when your framerate goes over the max refresh, and is disabled below 121.
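In code form (helper names are mine, and the 3fps margin is the common rule of thumb, not an official Nvidia number), the logic boils down to this:

```python
def gsync_frame_cap(max_refresh_hz, margin_fps=3):
    """Recommended in-game frame cap: a few fps below max refresh,
    so the GPU never outruns the G-Sync range."""
    return max_refresh_hz - margin_fps

def tearing_possible(fps, max_refresh_hz, vsync_in_nvcp):
    """With G-Sync active, tearing can only appear above the refresh
    ceiling - and NVCP V-sync removes even that case."""
    return fps > max_refresh_hz and not vsync_in_nvcp
```

So at 120Hz you'd cap at 117fps, and with NVCP V-sync on, even an uncapped spike past 120fps can't tear.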


----------



## badjz

kot0005 said:


> badjz said:
> 
> 
> 
> Well my monitor might be busted. Not a hiccup over the last 3 weeks, working beautifully, & all of sudden, HDR has stopped working. I’m getting tearing when HDR is on in games, have never experienced this before.
> Despite the notification coming on, it is definitely not kicking in, no change v SDR. Have tried reinstalling drivers, but no cigar. Anyone have the same experience?
> 
> 
> 
> 
> is your refresh rate set to 120hz ? If your frame rate goes above 120fps u will get tearing. Thats how G-sync works. You need to set a cap so your gpu doesnt push more than 116/117 fps. This isnt an issue at 1440p because the GPU is alwways maxing out the frames to the monitor. Read up on G-sync.
> 
> You need to turn on V-sync in NVCP but off in game. This way V-sync is only enabled wen ur framerate goes over max refresh and is disabled below 121.

Yep always set frame cap & refresh rate is 120. Have had no issues over the last 3 weeks, it has just started tearing now & HDR fails to kick in.


----------



## badjz

CallsignVega said:


> Monstieur said:
> 
> 
> 
> Windows has nothing to do with the appearance of HDR in games. HDR10 is an absolute standard - it has an absolute brightness for every scene that should look the same on every display (up to the display's maximum brightness, and if dynamic contrast is disabled), and a fixed PQ gamma curve. With a paper white target of 100 nits as per the standard, most scenes will look dim because everyone incorrectly sets their monitor to 250+ nits for SDR content when it should only be 100 nits.
> 
> A game that support HDR10 will always target the PQ curve and BT.2020 colour space. Desktop applications will look incorrect because they still render with Gamma 2.2 and the wrong colour space.
> 
> 
> 
> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode with HDR triggered by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> 
> 
> 
> HDR picture:
> 
> 
> 
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).

Confirming I have the exact same issue, mate. It seems this issue has only just manifested itself, as I had no problems with HDR over the last few weeks. The only way I can get HDR to kick in is by adjusting the vibrance in NVCP. This is only in games; YouTube & MC with madVR are fine.
I have SLI 1080 Tis, running the latest Nvidia drivers. Let me know if you find a solution or want me to test anything on my end.


----------



## Monstieur

CallsignVega said:


> I am not talking about desktop applications. I am talking about a game, running in exclusive full-screen mode with HDR triggered by the driver. Almost all settings are disabled on the monitor in HDR mode.
> 
> SDR picture:
> 
> HDR picture:
> 
> The ONLY difference between these two screenshots is turning HDR10 on or off in the game menu, verified by the monitor as to being in their respective modes.
> 
> Look how washed out the HDR picture is. Check the color differences, especially on the team ticker bars center top (blue ticker bar is almost gray in HDR). The HDR luminosity is correct but color is NOT getting processed right for either the color volume or saturation. This problem is identical on my C8 OLED too. And this problem exists in ALL of my HDR games. I also really noticed it in SWBF2 when I loaded up a Sith and the light saber was literally light pink and not red.
> 
> I'm not sure what is causing it, if it's my Titan V or driver problem. That's why I want other people to test. Load up a game, go to a static colored spot in the game, take pictures in SDR and then in HDR and see if the HDR looks washed out looking back and forth at the pictures (don't rely on your memory).


The monitor must be in sRGB mode for SDR (with a brightness of 100 nits). Only then is the comparison to HDR valid.

If the monitor is in SDR Racing mode it will have an expanded colour gamut and look more vibrant. The same thing applies to the LG C8 - the SDR Game mode is locked to wide gamut, and the other settings are also grossly incorrect. You must set the HDMI input to the PC icon to get normal gamut (sRGB) in SDR Game mode, and copy the other settings from the Technicolor Expert mode.
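The reason the 100-nit point matters: HDR10's PQ curve encodes absolute luminance, so the signal prescribes real nits rather than a fraction of whatever brightness your panel is set to. A sketch of the ST 2084 EOTF (constants are from the spec; the function name is mine):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 0..1 code value to absolute nits.
# Note the scale: code value 1.0 is 10,000 nits, while 0.5 is only ~92 nits -
# which is why "correct" HDR10 often looks dim next to a 250+ nit SDR desktop.

def pq_eotf(signal):
    """Normalized PQ code value (0..1) -> absolute luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y
```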


----------



## kot0005

badjz said:


> Confirming I have the exact same issue mate. It would seem this issue has manifested itself as I have no issues with HDR over the last few weeks. Only way I can get HDR to kick in is by adjusting the vibrance in NCP. This is only in games, YouTube & MC with madvr is fine.
> I have sli 1080tis, running latest nvidia drivers. Let me know if u find a solution or want me to test anyhthing on my end.


Try SLI disabled.


----------



## badjz

kot0005 said:


> badjz said:
> 
> 
> 
> Confirming I have the exact same issue mate. It would seem this issue has manifested itself as I have no issues with HDR over the last few weeks. Only way I can get HDR to kick in is by adjusting the vibrance in NCP. This is only in games, YouTube & MC with madvr is fine.
> I have sli 1080tis, running latest nvidia drivers. Let me know if u find a solution or want me to test anyhthing on my end.
> 
> 
> 
> Try SLI disabled.

Have tried that, no change.


----------



## kot0005

badjz said:


> Have tried that, no change.



RIP.

You could try a different PC; if that doesn't help, claim warranty.


----------



## kot0005

Someone trolling on rog forums again... 


Quoting him .."But when 384 zones IPS is worse that edge lit 60 zones VA then it comes the question of value per dollar. Should I pay $2500 for 27 inch Gsync 144 Hz monitor with inferior image quality or $1500 for 55 inch TV 60Hz with better image quality?"


----------



## acmilangr

kot0005 said:


> Someone trolling on rog forums again...
> 
> 
> Quoting him .."But when 384 zones IPS is worse that edge lit 60 zones VA then it comes the question of value per dollar. Should I pay $2500 for 27 inch Gsync 144 Hz monitor with inferior image quality or $1500 for 55 inch TV 60Hz with better image quality?"


just ignore them


----------



## deadchip12

kot0005 said:


> Someone trolling on rog forums again...
> 
> 
> Quoting him .."But when 384 zones IPS is worse that edge lit 60 zones VA then it comes the question of value per dollar. Should I pay $2500 for 27 inch Gsync 144 Hz monitor with inferior image quality or $1500 for 55 inch TV 60Hz with better image quality?"


So you're saying Lim's Cave is trolling in his review of the Acer X27? He literally says the XE93 has a better HDR image and less blooming.


----------



## CallsignVega

Once again, completely irrelevant. 4K 60Hz TVs are NOT in the same category as 144Hz 4K gaming displays. If high refresh rate isn't a consideration and image quality is, you go straight to OLED TVs. But be prepared to suffer at 60Hz.


----------



## lumbeechief

Any news on an Amazon release? I've been patiently waiting for a while now! Newegg says Release Date: 7/13/2018 even though there are 18 reviews for it. I live in the USA, by the way.


----------



## Glerox

Anyone playing Far Cry 5 with HDR on this monitor? Even if I put the "paper white" in-game setting to its lowest, some areas are really too bright.
I can't find how to set a photorealistic picture like BF1.
Maybe the HDR is just broken in this game.


----------



## kx11

Glerox said:


> Anyone playing Far Cry 5 with HDR on this monitor? Even if I put the "paper white" in-game setting to its lowest, some areas are really too bright.
> I can't find how to set a photorealistic picture like BF1.
> Maybe the HDR is just broken in this game.



Yeah it is; they hide the ugliness of this game with the blinding, super-powerful sunlight.


----------



## Glerox

kx11 said:


> yeah it is , they hide the ugliness of this game with the blinding super powerful sunlight


Actually I think the graphics look good! I like the really detailed 4K textures, and the character animations are crazy. And it has good SLI scaling.

However, it seems I'll have to turn off HDR to avoid burning my retinas.


----------



## acmilangr

Glerox said:


> Anyone playing Far Cry 5 with HDR on this monitor? Even if I put the "paper white" in-game setting to its lowest, some areas are really too bright.
> I can't find how to set a photorealistic picture like BF1.
> Maybe the HDR is just broken in this game.


Yes, I agree. The HDR is just broken in this game.


----------



## profundido

Glerox said:


> Actually I think the graphics look good! I like the really detailed 4k textures and character animations are crazy. And it has good SLI scaling.
> 
> However, it seems that I'll have to turn off HDR to avoid burning my retina


Being super-sensitive to brightness, I feel the same a lot more often than you. When it's accenting darker colors (blue-green-red), like the lady in the opening scene of the chess demo, I love HDR. But as soon as colors appear that are white or close to it, it's horrible and devastating to my eyes.

I can't help but feel we're missing some sort of post-processing algorithm in the Nvidia driver and/or monitor that scales the brightness down sharply the closer a color gets to white. The current flashlight situation is really not good, nor healthy, imho.


----------



## deadchip12

Glerox said:


> Actually I think the graphics look good! I like the really detailed 4k textures and character animations are crazy. And it has good SLI scaling.
> 
> However, it seems that I'll have to turn off HDR to avoid burning my retina


Imagine if future monitors have 4000 nits


----------



## Tallblacksoul

badjz said:


> Well my monitor might be busted. Not a hiccup over the last 3 weeks, working beautifully, & all of sudden, HDR has stopped working. I’m getting tearing when HDR is on in games, have never experienced this before.
> Despite the notification coming on, it is definitely not kicking in, no change v SDR. Have tried reinstalling drivers, but no cigar. Anyone have the same experience?


I didn't have this exact problem, but my FALD stopped working properly suddenly during an input switch. I thought the monitor was toast because nothing seemed to help. Finally I unplugged the ac adapter and waited a bit. Fixed the issue. Maybe try that....


----------



## MiniZaid

Does my monitor have issues?

http://i68.tinypic.com/2d17ntv.jpg

If you look carefully, on a grey background, the middle part shows something like a 3-inch bar that's slightly darker, and so does the top 2 inches. It looks slightly worse than in the uploaded image, and I sort of have to sit back, like 5m away, to see it.

I have seen worse grey backgrounds on monitors, like multiple bars moving around. Just wondering what you guys think.


----------



## kot0005

Linus has this monitor back at his crib? Or was the video shot before they sent it back? Who knows...


----------



## Glerox

kot0005 said:


> linus has this monitor back at his crib ? video was shot before they had it sent back? who knows .. https://www.youtube.com/watch?v=thc9iLZf0HQ


He should just ship it to tftcentral for a proper review lol.


----------



## Glerox

Does anybody think we'll soon be able to capture 4K HDR footage with Nvidia ShadowPlay, OBS, or Windows Game DVR?
I don't see any obstacle from a hardware perspective.
The Xbox One X is able to capture 4K HDR... I don't see why a full-blown PC couldn't.

There are capture cards, but you're stuck with HDMI 2.0 pass-through, which sucks with the PG27UQ.


----------



## lumbeechief

What the fuc* is going on around here??? The Newegg release date just changed from "Release Date: 7/13/2018" to "OUT OF STOCK. ETA: 8/9/2018". I just want the actual USA release date of this damn monitor already, not preorder bullshi*! Hell, the monitor isn't even listed on Amazon, let alone fulfilled by them with Prime shipping.


----------



## Glerox

The updated firmware is probably not ready...


----------



## Morkai

While I was really suspicious of the haloing, and fully expected to return it, I bought one of these to try. There's no chance it's going back.
The HDR implementation is excellent (though not perfect): it's better than any non-OLED HDR TV I have seen in showrooms, and it easily beats my VT60 plasma, since that doesn't support HDR. OLED is still better, but not by a lot. The colors, uniformity, and SDR dimmed contrast are great.
It is perfectly acceptable to watch HDR content on it in a dark room.
It has been said before, but you only really see the haloing with the mouse pointer on black, or the occasional scene where a small light source is surrounded by black (like a small burning torch in darkness in a movie). 99.9% of the time it's a non-issue, and this monitor is simply so much better than anything that has previously existed.

Games that natively support HDR through in-game options seem to run without issue at 144Hz; apart from that, I just run 98Hz on the desktop.

Some observations:
*Firefox does not support HDR playback - but looks fine in general.
*Chrome does play HDR content, but seems to adjust to the HDR colorspace (it ignores the Windows SDR slider) and just looks too dim. Maybe some option can fix this, but I haven't bothered looking.
*Edge plays HDR perfectly, and looks correct in general too; it seems like the best HDR browser currently.
*Netflix has a lot of 4K HDR content, but you need to upgrade to Premium.
*Altered Carbon is the best showcase for HDR I have found. Netflix 4K HDR.
*Amazon Prime has some HDR content, but much less content in general.
*HBO seems to only support 1080p, non-HDR.
*Warframe supports HDR, is free, and is very well optimized, so it's easy to hit 144fps (and it's also a really good game).
*Someone suggested 120% scaling in Windows - looks nice and crisp.
*With the Windows SDR brightness slider, it seems acceptable to just leave HDR permanently on.
*Peak brightness is sometimes too bright. I think they simply did not consider how close you sit to a monitor compared to a TV, where 1000 nits is OK.

I took a few pictures of HDR content with HDR enabled on the camera. It's not a 1:1 representation of reality, but it's pretty close. In one of them I put the mouse pointer in a worst-case position. Did you even notice?
If you have a newer phone with an OLED screen, it should support HDR even if your monitor doesn't. (It seems imgur compressed the images... and OCN failed to accept them even though they were within the allowed size, so they may be slightly degraded - I'll see if I can find a proper way to host them later.)

https://lensdump.com/i/8bLuge
https://lensdump.com/i/8bLNDx
https://lensdump.com/i/8bLD3k
edited to an image host with no compression.


----------



## deadchip12

Morkai said:


> The HDR implementation is excellent (but not perfect - it's better than any non-oled HDR tv I have seen in showrooms


That's a pretty bold claim, since Lim's Cave reviewed the Acer X27 and said these monitors have more blooming and a worse HDR image than the Sony XE93 - and that TV is a mid-range edge-lit model with 60 zones, supposedly inferior to the X940E & Z9D. I hope you're right though. I really don't want to go for a TV unless the image quality is much better than a monitor's.



Morkai said:


> It has been said before, but you only really see the haloing with the mousepointer on black, or the occational scene where a small lightsource is surrounded by black. (like a small burning torch in darkness in a movie). 99.9% of the time, its a non-issue, and this monitor is simply so much better than anything that has previously existed.


Please test the chess demo and see if the blooming is as bad as shown in the pic below.




Morkai said:


> OLED is still better, but it's not by a lot


Do you see any color banding in games or movies? This monitor is 8-bit+FRC, so shouldn't it have more banding compared to OLED, which is true 10-bit?
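For context on the 8-bit+FRC question: FRC approximates the two extra bits temporally rather than spatially. A sketch (the function name is mine) of how a 10-bit level can be dithered across four consecutive 8-bit frames:

```python
def frc_sequence(value10, frames=4):
    """Temporal dithering: approximate a 10-bit value (0..1023) with 8-bit frames.
    One 10-bit step is 1/4 of an 8-bit step, so alternating two adjacent 8-bit
    codes over 4 frames averages out to the intermediate 10-bit level."""
    base, frac = divmod(value10, 4)   # 8-bit base code + 2-bit remainder
    return [min(base + (1 if i < frac else 0), 255) for i in range(frames)]
```

So 10-bit level 514 becomes the 8-bit sequence 129, 129, 128, 128, whose time-average is exactly 128.5 - the eye integrates it, but a panel that dithers poorly can show it as flicker or residual banding, which true 10-bit avoids.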



Morkai said:


> I took a few pictures of HDR content with HDR enabled on the camera. Its not a 1:1 representation vs reality, but its pretty close. In one of them I put the mouse pointer at a worst case scenario. Did you even notice?
> If you have a newer phone with an OLED screen, it should support HDR even if your monitor doesn't.
> 
> https://imgur.com/a/y90XoZL
> https://imgur.com/a/wOdWDfd
> https://imgur.com/a/ltYcFZW


I don't see any blooming in these pictures. What am I supposed to look at? The colors & luminance will not represent real life at all since the device you use to capture this picture most likely does not support HDR




Morkai said:


> *Chrome does play HDR content, but seems to adjust to hdr colorspace (ignores the windows sdr slider) and just looks too dim


This is a known issue. I think the new Windows update messed Chrome up. Try Chrome Beta.


----------



## Glerox

https://hothardware.com/reviews/asu...z-g-sync-monitor-review-true-hdr-gaming-at-4k

Edit: still waiting for a professional review...


----------



## acmilangr

Has anyone tried Forza Motorsport 7 in HDR?
For some reason the colors are washed out.


----------



## HyperMatrix

I just received my PG27UQ. While not as extreme as Vega, I've been a monitor aficionado for a long time. For those looking for feedback on this monitor from someone with a gaming background who's used the Korean IPS 1440p monitors, followed by the Acer 144Hz and 165Hz 1440p monitors, my opinions may be of some use. Before I start: any criticism I have does not detract from my desire to own this monitor. I have zero buyer's remorse. I would buy it again 100x over. It's not perfect, but it is the best that technology can offer today.


- Text does indeed look terrible at 4:2:2. It's acceptable for short term use between gaming, but if you're going to be coding for a few hours, or posting/commenting online, you definitely need to drop the refresh rate and go back to 4:4:4 or RGB.

- The FALD zones have a big problem with tracking the mouse. Not in terms of speed, but in terms of intensity and zone transitioning. When the cursor is in the middle of a zone, you see a lot of backlight bloom. As you start moving away from the center, it starts to fade. When you're between 2 zones, it's much less bright and therefore shows much less bloom, but as you get closer to the next zone, it suddenly goes full blast with bloom and all. So if you move your mouse horizontally across the screen, you see a "wave"-like effect with the backlight brightening and dimming repeatedly. Part of this is due to the grid resolution of the FALD, but there's another underlying problem: moving the mouse around on an edge-lit Samsung TV ends up being smoother and less distracting than this FALD implementation. It's really quite poorly done, imo. Imagine a very poorly written application that lights up every time the mouse cursor gets close to the center of one of its 384 zones; the overall effect is very distracting, and all the more upsetting because, as I mentioned, when the cursor sits between 2 zones the bloom is almost non-existent - it only appears as you approach the center of a zone. Please note: this is a "worst case scenario" using a white mouse cursor on a full-black screen. In games, I haven't noticed this effect at all. This is just commentary on the FALD tech used.
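That "wave" can be modeled with nothing more than the zone geometry. The 24x16 layout below is an assumption (384 zones are commonly reported as 24x16 on this panel, but ASUS hasn't published the grid), and the proximity function is purely illustrative:

```python
# Model of cursor-driven FALD bloom on an assumed 24x16 grid (384 zones)
# over a 3840x2160 panel. Bloom peaks near a zone's centre and dips at
# the boundary between two zones - hence the wave as the cursor sweeps.

WIDTH, HEIGHT = 3840, 2160
ZONES_X, ZONES_Y = 24, 16                                # 24 * 16 = 384 (assumed)
ZONE_W, ZONE_H = WIDTH // ZONES_X, HEIGHT // ZONES_Y     # 160 x 135 px per zone

def zone_of(px, py):
    """Which zone a pixel falls into, as (column, row)."""
    return (min(px // ZONE_W, ZONES_X - 1), min(py // ZONE_H, ZONES_Y - 1))

def centre_proximity(px):
    """0.0 at a horizontal zone boundary, 1.0 at a zone centre -
    a stand-in for how hard the backlight blooms under the cursor."""
    offset = (px % ZONE_W) / ZONE_W   # 0..1 position across the zone
    return 1.0 - 2.0 * abs(offset - 0.5)
```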


- HDR in Windows, in general, is convoluted. I hate it. Because it's so poorly organized. Some games have in-game switches for HDR. Others require you to enable HDR in Windows. Same goes for Netflix. And HDR settings per game are so varied. You have to tweak each one individually, both with the software settings, as well as the monitor OSD. I still haven't figured out how to run HDR properly on Injustice 2, for example. The entire thing is oversaturated with massive black crush regardless of whether I enabled RGB or YCbCr. I would have to mess with the NVidia Desktop color settings and adjust gamma/contrast/brightness. Other games like Final Fantasy or Far Cry 5 would work beautifully right out of the box (of course FF XV still requires you to set HDR on the desktop before loading the game). HDR on Windows is A MESS. Not a fault of this monitor. Although..there is one.


- Switching between HDR and SDR takes a long time. So if you're running HDR on the desktop and are playing an SDR game, tabbing out causes several seconds of wait as the monitor figures out how to transition between the 2. This will be a very noticeable and annoying feature. I haven't tested it but I'm assuming UWP style borderless fullscreen games won't trigger this. But then again in that situation I'm not sure how great an SDR game would look if running in a box on an HDR desktop. Assuming it'd be washed out.


- Fan noise. People are weird. It's a fan. Your computer has over 10 of them. If you can hear your fan, there's probably some issue. But between my speakers while I'm gaming, and my computer fans while I'm browsing/etc...I've never ever heard the fan. Only time I heard it was when I shut off the computer and my head was 2 feet away from it. This is a non-issue.


- The OSD joystick. Feels like it's the same type of joystick you'd expect to get out of a $1 fidget toy from (insert random Chinese etailer). It feels kind of loose. It's very weak and fragile. And mine, at least, has a tendency to stick at times. By far the worst designed aspect of the monitor. I'm 100% betting that this piece will break/stop working prior to the 3 year warranty being up, especially since you need to constantly use it because of lack of HDR implementation unity in windows/other games. 


- Color temperature. I'm disappointed here because it should be such an easy thing to implement correctly. The "warm" color temperature on this thing looks like crap. They should have used the opportunity to add more than 1 warm tone, similar to what Samsung does with their TVs. Fortunately you can adjust it manually yourself. But it's these little things that really stand out on a $2000 monitor.

- AG filter does a great disservice to this monitor. I took mine off and now have a lovely glossy display. It's a shame that there's no option for some sort of bonded glass glossy panel. I highly recommend you remove the AG filter. Talk to BlackVette94. I do NOT recommend this to beginners. You can very easily destroy the monitor by damaging the polarizer. Well I mean theoretically you could replace the polarizer. But i'm sure the cost of doing that would be far greater than the cost of having the AG matte filter removed professionally. 

- Retina blinding brightness. It's really a beautiful thing. Played a bunch of games with it including Far Cry 5, Forza 7, FF XV, Injustice 2. There's just something special about that level of localized radiance. Being able to adjust the brightness to set the lamp posts in Vampyr to have a certain level of brightness is just so fun. Honestly playing games through this monitor feels like the effect of going from Ultra graphics quality settings to Ultra xxx3. It's a very worthy upgrade. I was actually surprised at how many games could be played maxed out at 4k on a single video card. And mine's not even the latest and greatest like Vega's Titan V. I just have an old Titan X (Pascal) clocked at 2.1GHz. And it could handle everything at 60fps+. With the exception of FF XV. Playing that game maxed out dropped me down to the 35-55fps range. But since I play that game with a gamepad, and because the game looks so freaking incredible maxed out with HDR and 4K, I'm more than happy to play it like that. I mean I played it at 30fps when it came out on the PS4 Pro and that didn't even have proper frame pacing. So with GSYNC, it feels many times better. 

- 27" screen. You know, I had my reservations. A lot of people said 27" is too small for 4K. But once I played games that I had previously played on a 27" 1440p monitor, I came to one conclusion. Well. 2 conclusions. 4k at 27" is brilliant. It's amazing. But even it is not enough. At 32", you'd lose a lot of the pixel density that this screen offers, and pixel density plays a huge part in making what you're looking at seem "real." 27" is too small, however. That's true. But after trying this monitor out, if I had to pick between a 27" 4K and 32" 4k monitor, I don't know if I'd immediately go for the 32" option. I think the ultimate monitor will have to wait until 32" 8K 144Hz panels come out. But this is definitely a step in the right direction. 

- Anti Aliasing. Anyone who says you don't need AA on a 4k monitor is either blind, or high. Noticeable jaggies in games when I turned off temporal AA. Couldn't deal. Turned it back on, and it was golden again. If you've never cared about visual fidelity, then you may not care about AA. But proper AA is immersive as hell. And if you enjoyed AA on your 1440p 27" monitor, you'll still want it on your 4k 27" monitor. 

- Comparison to OLED. I admit I was wrong. This monitor can't even come close to touching OLED. The FALD is the primary factor holding this monitor back from being able to directly compete against OLED displays. The other factor being the lower end Quantum Dot implementation. If you don't have a need for 144Hz, GO WITH OLED. However...you'll still be faced with a problem with OLED. You can't find a 4k display around this size. So there are always compromises. The image quality on this monitor can't even compete against the new Samsung QLED TVs (which are amazing TVs, I should mention). Well it can compete, but it'll lose 10/10 times. And with 120Hz VRR available on the Samsung TVs, the only thing making this monitor and the eventual BFGD tvs a good choice, is the lack of VRR support by Nvidia. I love GSYNC. Don't get me wrong. But I find it's limited competition in the gaming display market. 

- Comparison to TVs. One thing I'm really... uhmm... not "upset" over, as I never expected it to be included, but wished were there, is the advanced interpolation used by TVs. Motion interpolation is an amazing thing, and can roughly simulate a 2x GPU power increase by taking a constant 60fps signal and inserting blended frames up to 120fps. With an advanced processing unit, more powerful than those used in TVs, this could be accomplished without all the artifacting that occurs on TVs. In fact this likely would have been easy (well... relatively) to implement given the $2000 FPGA unit used to power G-SYNC HDR. G-SYNC already works by implementing a frame buffer, so how hard would it be to add 16ms of input lag, which is more than acceptable, to offer that functionality? I think going forward, if Nvidia wants to be able to justify the price of its G-SYNC units and differentiate itself from VRR, it needs to implement this. You'd get frame syncing along with interpolation, allowing you to get nearly 2x more "power" from your existing system. Anyway, a girl can hope. Even if that girl is actually a guy.
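The crudest form of what I'm describing is plain frame blending (real TVs do motion-compensated interpolation, which is far more involved); a toy sketch, with all names my own. The ~16ms figure is just one buffered 60Hz frame (1/60 s):

```python
def blend(frame_a, frame_b, t=0.5):
    """Naive interpolated frame: per-pixel linear mix of two real frames.
    Motion-compensated interpolation would track objects instead."""
    return [[(1 - t) * a + t * b for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

def interpolate_stream(frames):
    """Turn a 60fps list of frames into ~120fps by inserting a blend
    between each consecutive pair (costs one frame of buffering latency)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend(a, b))
    out.append(frames[-1])
    return out
```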

I do highly recommend this monitor. As long as you have the hardware to truly utilize it. The price is honestly a bargain considering the tech inside it. There are so many things I wish it had. But this is the absolute highest end of display technology. And it's a real treat to play games on. 

Hope this will be helpful to some.


----------



## deadchip12

HyperMatrix said:


> I just received my PG27UQ. While not as extreme as vega, I've been a monitor aficionado for a long time. For those who are looking to get feedback on this monitor from someone with a gaming background who's used the Korean ips 1440p monitors, followed by the acer 144hz and 165hz 1440p monitors, my opinions may be of some use. Before I start...any criticism I have does not detract from my desire to own this monitor. I have 0 buyer's remorse. I would buy it again 100x over. It's not perfect, but it is the best that technology can offer today.
> 
> 
> - Text does indeed look terrible at 4:2:2. It's acceptable for short term use between gaming, but if you're going to be coding for a few hours, or posting/commenting online, you definitely need to drop the refresh rate and go back to 4:4:4 or RGB.
> 
> - The FALD zones have a big problem with tracking the mouse. Not in terms of speed, but in terms of intensity and zone transitioning. So when you're in the middle of a zone, you see a lot of backlight bloom. As you start moving away from the center, it starts to fade. When you're in the middle of 2 zones, it's much less bright and therefore much less bloom, but then as you go closer to the next zone, it suddenly goes full blast with bloom and all. So basically if you were to move your mouse horizontally across the screen, you'd see a "wave" like effect with the backlight, getting bright/dimming repeatedly. Part of this is due to the grid resolution of the FALD, but there's another underlying problem. Moving the mouse around on an edge lit Samsung tv ends up being smoother and less distracting than this FALD implementation. It's really quite poorly done, imo. I'm trying to explain it as best I can. Imagine a very poorly written application that lights up every time the mouse cursor gets close to one of its 384 zones. The overall effect is very distracting and is a greater upset because when, as I mentioned earlier, the mouse cursor is between 2 different zones, the bloom is almost non-existent. It only occurs as you get close to the center of a zone. Please note: This is a "worst case scenario" using a white mouse cursor on a full black screen. In games, I haven't noticed this effect at all. This is just commentary on the FALD tech used.
> 
> 
> - HDR in Windows, in general, is convoluted. I hate it, because it's so poorly organized. Some games have in-game switches for HDR. Others require you to enable HDR in Windows. The same goes for Netflix. And HDR settings vary so much per game. You have to tweak each one individually, both with the software settings and the monitor OSD. I still haven't figured out how to run HDR properly in Injustice 2, for example. The entire thing is oversaturated with massive black crush regardless of whether I enabled RGB or YCbCr. I would have to mess with the Nvidia desktop color settings and adjust gamma/contrast/brightness. Other games like Final Fantasy or Far Cry 5 work beautifully right out of the box (of course FF XV still requires you to set HDR on the desktop before loading the game). HDR on Windows is A MESS. Not a fault of this monitor. Although... there is one.
> 
> 
> - Switching between HDR and SDR takes a long time. So if you're running HDR on the desktop and are playing an SDR game, tabbing out causes several seconds of wait as the monitor figures out how to transition between the 2. This will be a very noticeable and annoying feature. I haven't tested it but I'm assuming UWP style borderless fullscreen games won't trigger this. But then again in that situation I'm not sure how great an SDR game would look if running in a box on an HDR desktop. Assuming it'd be washed out.
> 
> 
> - Fan noise. People are weird. It's a fan. Your computer has over 10 of them. If you can hear your fan, there's probably some issue. But between my speakers while I'm gaming, and my computer fans while I'm browsing/etc...I've never ever heard the fan. Only time I heard it was when I shut off the computer and my head was 2 feet away from it. This is a non-issue.
> 
> 
> - The OSD joystick. Feels like it's the same type of joystick you'd expect to get out of a $1 fidget toy from (insert random Chinese etailer). It feels kind of loose. It's very weak and fragile. And mine, at least, has a tendency to stick at times. By far the worst designed aspect of the monitor. I'm 100% betting that this piece will break/stop working prior to the 3 year warranty being up, especially since you need to constantly use it because of lack of HDR implementation unity in windows/other games.
> 
> 
> - Color temperature. I'm disappointed here because it should be such an easy thing to implement correctly. The "warm" color temperature on this thing looks like crap. They should have used the opportunity to add more than 1 warm tone, similar to what Samsung does with their TVs. Fortunately you can adjust it manually yourself. But it's these little things that really stand out on a $2000 monitor.
> 
> - AG filter does a great disservice to this monitor. I took mine off and now have a lovely glossy display. It's a shame that there's no option for some sort of bonded glass glossy panel. I highly recommend you remove the AG filter. Talk to BlackVette94. I do NOT recommend this to beginners. You can very easily destroy the monitor by damaging the polarizer. Well I mean theoretically you could replace the polarizer. But i'm sure the cost of doing that would be far greater than the cost of having the AG matte filter removed professionally.
> 
> - Retina blinding brightness. It's really a beautiful thing. Played a bunch of games with it including Far Cry 5, Forza 7, FF XV, Injustice 2. There's just something special about that level of localized radiance. Being able to adjust the brightness to set the lamp posts in Vampyr to have a certain level of brightness is just so fun. Honestly playing games through this monitor feels like the effect of going from Ultra graphics quality settings to Ultra xxx3. It's a very worthy upgrade. I was actually surprised at how many games could be played maxed out at 4k on a single video card. And mine's not even the latest and greatest like Vega's Titan V. I just have an old Titan X (Pascal) clocked at 2.1GHz. And it could handle everything at 60fps+. With the exception of FF XV. Playing that game maxed out dropped me down to the 35-55fps range. But since I play that game with a gamepad, and because the game looks so freaking incredible maxed out with HDR and 4K, I'm more than happy to play it like that. I mean I played it at 30fps when it came out on the PS4 Pro and that didn't even have proper frame pacing. So with GSYNC, it feels many times better.
> 
> - 27" screen. You know, I had my reservations. A lot of people said 27" is too small for 4K. But once I played games that I had previously played on a 27" 1440p monitor, I came to one conclusion. Well, two conclusions. 4K at 27" is brilliant. It's amazing. But even that isn't enough. At 32" you'd lose a lot of the pixel density that this screen offers, and pixel density plays a huge part in making what you're looking at seem "real." At the same time, it's true that 27" is on the small side. Still, after trying this monitor out, if I had to pick between a 27" 4K and a 32" 4K monitor, I don't know if I'd immediately go for the 32" option. I think the ultimate monitor will have to wait until 32" 8K 144Hz panels come out. But this is definitely a step in the right direction.
> 
> - Anti Aliasing. Anyone who says you don't need AA on a 4k monitor is either blind, or high. Noticeable jaggies in games when I turned off temporal AA. Couldn't deal. Turned it back on, and it was golden again. If you've never cared about visual fidelity, then you may not care about AA. But proper AA is immersive as hell. And if you enjoyed AA on your 1440p 27" monitor, you'll still want it on your 4k 27" monitor.
> 
> - Comparison to OLED. I admit I was wrong. This monitor can't even come close to touching OLED. The FALD is the primary factor holding this monitor back from being able to directly compete against OLED displays; the other factor is the lower-end quantum dot implementation. If you don't have a need for 144Hz, GO WITH OLED. However... you'll still be faced with a problem with OLED: you can't find a 4K display around this size. So there are always compromises. The image quality on this monitor can't compete against the new Samsung QLED TVs (which are amazing TVs, I should mention). Well, it can compete, but it'll lose 10/10 times. And with 120Hz VRR available on the Samsung TVs, the only thing making this monitor and the eventual BFGD TVs a good choice is Nvidia's lack of VRR support. I love G-Sync. Don't get me wrong. But it faces limited competition in the gaming display market.
> 
> - Comparison to TVs. One thing I'm really... uhm... not "upset" over, as I never expected it to be included, but wish it had been, is the advanced interpolation used by TVs. Motion interpolation is an amazing thing, and can roughly simulate a 2x GPU power increase by taking a constant 60fps signal and inserting blended frames up to 120fps. With an advanced processing unit, more powerful than those used in TVs, this could be accomplished without all the artifacting that occurs on TVs. In fact this likely would have been easy (well... relatively) to implement given the $2000 FPGA unit used to power G-SYNC/HDR. G-SYNC already works by implementing a frame buffer. So how hard would it be to add 16ms of input lag, which is more than acceptable, to offer that functionality? I think going forward, if Nvidia wants to be able to justify the price of its G-SYNC units and differentiate itself from VRR, it needs to implement this. So you're getting frame syncing along with interpolation, allowing you to get nearly 2x more "power" from your existing system. Anyway, a girl can hope. Even if that girl is actually a guy.
> 
> I do highly recommend this monitor. As long as you have the hardware to truly utilize it. The price is honestly a bargain considering the tech inside it. There are so many things I wish it had. But this is the absolute highest end of display technology. And it's a real treat to play games on.
> 
> Hope this will be helpful to some.
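
The motion-interpolation wish in the review above can be illustrated with a deliberately naive sketch: insert a 50/50 blend between consecutive frames to double the cadence. Real TV interpolation is motion-compensated and far more sophisticated; this toy version (all names and the flat-list frame representation are illustrative, not any real API) only shows the buffering/latency tradeoff the post describes:

```python
# Naive frame-interpolation sketch: double 60 fps to 120 fps by inserting a
# 50/50 blend between consecutive frames. Real TVs use motion-compensated
# interpolation; this only illustrates the buffering/latency tradeoff.
# Frames are flat lists of 0-255 pixel values (a made-up toy representation).

def blend(frame_a, frame_b):
    """Average two frames pixel by pixel."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def interpolate_stream(frames):
    """Insert a blended frame between every pair of source frames."""
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.append(prev)
        out.append(blend(prev, nxt))  # only possible once `nxt` has arrived
    out.append(frames[-1])
    return out

src = [[0, 0], [100, 200], [200, 100]]  # three tiny 2-pixel "frames"
doubled = interpolate_stream(src)
print(len(doubled))   # 5: n source frames -> 2n - 1 output frames
print(doubled[1])     # [50, 100], the blend of frames 0 and 1
```

Because each blended frame needs the following source frame before it can be emitted, the stream is delayed by one source interval, about 16 ms at 60 fps, which matches the latency figure in the post.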

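The 4:2:2 text complaint at the top of the review comes down to link bandwidth. A back-of-envelope check, using the standard DisplayPort 1.4/HBR3 figures and deliberately ignoring blanking overhead (so real requirements are somewhat higher than these visible-pixel estimates):

```python
# Back-of-envelope DisplayPort 1.4 bandwidth check for 4K at 144 Hz.
# HBR3: 4 lanes x 8.1 Gbps raw; 8b/10b encoding leaves 80% as payload.
# Blanking intervals are ignored, so these are optimistic lower bounds.

payload_gbps = 4 * 8.1 * 0.8           # ~25.92 Gbps usable

pixels_per_second = 3840 * 2160 * 144  # visible pixels only

def required_gbps(bits_per_pixel):
    return pixels_per_second * bits_per_pixel / 1e9

rgb_10bit   = required_gbps(30)  # 4:4:4 RGB, 10 bpc      -> ~35.8 Gbps
rgb_8bit    = required_gbps(24)  # 4:4:4 RGB, 8 bpc       -> ~28.7 Gbps
ycbcr422_10 = required_gbps(20)  # 4:2:2, 10 bpc average  -> ~23.9 Gbps

# Even before blanking overhead, full 4:4:4 at 144 Hz exceeds the link;
# only 4:2:2 fits, which is why text gets soft at 144 Hz and you drop the
# refresh rate to get 4:4:4 back.
print(f"{payload_gbps:.2f} / {rgb_10bit:.1f} / {rgb_8bit:.1f} / {ycbcr422_10:.1f}")
```

This lines up with the reviewer's experience: 144 Hz forces chroma subsampling, while lower refresh rates leave enough headroom for full 4:4:4/RGB.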

It's funny hearing some say this monitor is almost OLED level with great FALD, while others say it can't even touch other LED TVs at lower prices. Don't know whom to believe. Guess I need to wait and test it myself.


----------



## Desolutional

HyperMatrix said:


> Anti Aliasing. Anyone who says you don't need AA on a 4k monitor is either blind, or high. Noticeable jaggies in games when I turned off temporal AA. Couldn't deal. Turned it back on, and it was golden again. If you've never cared about visual fidelity, then you may not care about AA. But proper AA is immersive as hell. And if you enjoyed AA on your 1440p 27" monitor, you'll still want it on your 4k 27" monitor.


How close do you sit to the monitor? Should be using 150% scaling at a healthy distance. The little difference that AA makes on mine doesn't seem worth it when playing action heavy games, I'd rather have more frames. And regarding the immersion comment, VR is proper immersion, at least once they fix the screendoor effect.
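
The scaling/viewing-distance point can be put in numbers. A small sketch comparing 27" and 32" 4K panels at the ~2.5 ft (30 in) distance mentioned in the thread; the pixels-per-degree figure is a standard visual-angle calculation, not anything from the monitor itself:

```python
import math

# Pixel-density sketch for the 27" vs 32" 4K debate, evaluated at a 30 in
# viewing distance (the ~2.5 ft figure from the thread).

def ppi(diagonal_in, w_px=3840, h_px=2160):
    """Pixels per inch for a 16:9 panel of the given diagonal."""
    return math.hypot(w_px, h_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_in):
    """Pixels subtended by one degree of visual angle at this distance."""
    return ppi_val * distance_in * math.tan(math.radians(1))

for size_in in (27, 32):
    density = ppi(size_in)
    ppd = pixels_per_degree(density, 30)
    print(f'{size_in}": {density:.0f} ppi, {ppd:.0f} px/deg at 30 in')
```

The 27" panel comes out around 163 ppi versus roughly 138 ppi at 32", so it is meaningfully denser; but high-contrast edges (chain-link fences, specular highlights) can still crawl at either density, which is why both positions on AA are defensible.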


----------



## HyperMatrix

deadchip12 said:


> It's funny hearing some say this monitor is almost oled level with great fald, while others say it cannot even touch other led tvs at lower price. Don't know whom to believe. Guess I need to wait and test it myself


It's like a poor man's OLED. It's quite good. Don't get me wrong. But OLED is something else. I was on the bandwagon of those who thought that, at least on paper, this monitor should be close to the performance of an OLED display. But I made my assumptions based on "Best In Tech" TVs. I guarantee that if this monitor were made by Samsung, it would be far closer to OLED levels. The FALD is what holds it back. The FALD bloom, which is very noticeable compared to the system implemented by Samsung in the Q9FN, destroys the immersion. In the LG Chess Demo, where the scene is generally black but you have the flame at the bottom and the little sparkly embers going up, you can see how much FALD bloom there is; then compare that to any OLED panel, even on a phone like the Galaxy S9 or iPhone X, and you'll immediately see the difference. In scenes where there are uniform black areas without any interruption, you'll get a much closer-to-OLED visual on this monitor. But with anything that has mixed dark/light content, you'll realize where its limitations are compared to OLED.


----------



## HyperMatrix

Desolutional said:


> How close do you sit to the monitor? Should be using 150% scaling at a healthy distance. The little difference that AA makes on mine doesn't seem worth it when playing action heavy games, I'd rather have more frames. And regarding the immersion comment, VR is proper immersion, at least once they fix the screendoor effect.


About 2 to 2.5 feet from the monitor. Scaling doesn't matter in games; FOV would be the closest thing to scaling. Look at chain-link fences in games at any distance and you'll see the jaggies and artifacting that occur.


----------



## deadchip12

HyperMatrix said:


> deadchip12 said:
> 
> 
> 
> It's funny hearing some say this monitor is almost oled level with great fald, while others say it cannot even touch other led tvs at lower price. Don't know whom to believe. Guess I need to wait and test it myself
> 
> 
> 
> It's like a poor man's OLED. It's quite good. Don't get me wrong. But OLED is something else. I was on the bandwagon of those who thought that, at least on paper, this monitor should be close to the performance of an OLED display. But I made my assumptions based on "Best In Tech" TVs. I guarantee that if this monitor were made by Samsung, it would be far closer to OLED levels. The FALD is what holds it back. The FALD bloom, which is very noticeable compared to the systems implemented by Samsung in the QF9N, destroys the immersion. In the LG Chess Demo where the scene is generally black but you have the flame at the bottom and the little sparkly embers going up, you can see how much FALD bloom there is, and then compare that to any OLED panel. Even on a phone like the Galaxy s9 or iPhone X. You'll Immediately see the difference. In scenes where there are uniform black areas without any interruption, you'll get a much closer to OLED visual on this monitor. But anything with mixed dark/light content, and you'll realize where it's limitations are compared to OLED.

Hmm yeah, I read the Q9FN FALD is very aggressive, which minimizes blooming.

Is the blooming on the monitor much less visible if the room has some lighting?

I'm actually considering between this monitor and the Sony XE93. I just bought an OLED C7 but I don't want to use it for gaming because of burn-in. The XE93 costs half of this monitor in my country, and some say it has less blooming and better HDR than this monitor even though it's only edge-lit with 60 dimming zones. Which one do you think I should buy? I don't care much about high refresh rates; G-Sync is nice but I think I can live without it in exchange for better image quality.


----------



## HyperMatrix

deadchip12 said:


> Hmm yeah I read Q9FN fald is very aggressive which minimizes blooming.
> 
> Is the blooming on the monitor much less visible if the room has some lighting?
> 
> I'm actually considering between this monitor and Sony XE93. I just bought Oled c7 but I don't want to use it for gaming because of burn in. XE93 costs half of this monitor at my country, and some say it has less blooming and better hdr than this monitor even though it's only edge lit 60 dimming zones. Which one do you think I should buy? I don't care much about high refresh rate, gsync is nice but I think I can live without it in exchange for better image quality


The blooming is visible only when you have a bright object in a dark area. So when I'm playing Vampyr, the lamp post will have a bluish bloom to it. It would be more forgivable if the bloom were a warmer color. The visibility of the bloom depends on how bright you set the monitor itself. I personally love retina-blinding brightness, which makes blooming far more visible. If it's done at much more reasonable brightness levels, in a room with ambient light, then it isn't very visible at all. No matter what the brightness level, black will always just be black. Increasing the brights, however, causes excess bloom everywhere. I'm in the process of setting up a 600-LED lightpack as an immersive bias lighting mechanism to enhance the monitor even further.

I've watched some HDR Netflix movies on here and it's been more than acceptable. If you're coming from LCD panels to this monitor, you'll love it 100%. If you're coming from an OLED, you're going to be aware of the tradeoff. You can minimize it, but that absolutely magical self-emitting pixel technology of OLED is really something else. Think of the monitor as being a step or two behind the Q9FN, but with the added benefit of G-Sync/144Hz. If you're happy with that tradeoff, then you'll be happy with the monitor. But if you go into it expecting it to look like OLED, you'll be disappointed. And this is coming from someone who's taken the matte anti-glare film off the monitor as well. So I'm looking at a pure glossy display as I'm mentioning all of this.


----------



## moonbogg

If the monitor blooms, doesn't that wash out the colors surrounding the bright spot? People say you only see the bloom with something like a mouse cursor over a black background, but on colored content you don't see it. My question is that since the backlight is still bleeding through, wouldn't that wash out the colors instead of blooming? It's not like the backlight bleed is gone just because you don't have a white cursor over a black background. The backlight would still bloom, washing out the colors, right? Also, I thought "mixed dark/light content" was what HDR was all about, so this is discouraging.
Is it reasonable to expect fine HDR detail when the backlights have an entire inch of space between them? I'd think something like a max of 1/16 inch space between backlights would be more ideal. I have a feeling that "more FALD zones" will be a big marketing checkbox for gaming monitors for years to come. "Now with 1000 zones!" "Now with 2000 zones!!!" etc.
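
moonbogg's "inch of space" estimate is roughly right. A quick geometry sketch; the 24 x 16 grid is the commonly reported layout for this 384-zone panel, and it is an assumption here rather than something confirmed in the thread:

```python
import math

# Geometry sketch for the 384-zone FALD grid on a 27" 16:9 panel.
# The 24 x 16 zone arrangement is assumed (commonly reported, not
# confirmed in this thread).

diag_in = 27.0
aspect = math.hypot(16, 9)
width_in  = diag_in * 16 / aspect   # ~23.5 in
height_in = diag_in * 9 / aspect    # ~13.2 in

zones_x, zones_y = 24, 16
zone_w_in = width_in / zones_x                # ~0.98 in per zone horizontally
zone_h_in = height_in / zones_y               # ~0.83 in per zone vertically
zone_px = (3840 // zones_x, 2160 // zones_y)  # 160 x 135 pixels per zone

print(f"{zone_w_in:.2f} in x {zone_h_in:.2f} in per zone, {zone_px} px")
```

So a single zone is about an inch wide and covers 160 x 135 = 21,600 pixels, which is why a pinpoint light source drags a visible halo with it and why "more zones" is an easy marketing lever.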


----------



## deadchip12

HyperMatrix said:


> deadchip12 said:
> 
> 
> 
> Hmm yeah I read Q9FN fald is very aggressive which minimizes blooming.
> 
> Is the blooming on the monitor much less visible if the room has some lighting?
> 
> I'm actually considering between this monitor and Sony XE93. I just bought Oled c7 but I don't want to use it for gaming because of burn in. XE93 costs half of this monitor at my country, and some say it has less blooming and better hdr than this monitor even though it's only edge lit 60 dimming zones. Which one do you think I should buy? I don't care much about high refresh rate, gsync is nice but I think I can live without it in exchange for better image quality
> 
> 
> 
> The blooming is visible only when you have a bright object in a dark area. So when I'm playing Vampyr, the lamp post will have a bluish bloom to it. It would be more forgivable if the bloom was a warmer color. The visibility of the bloom depends on how bright you set the monitor itself. I personally love retina blinding brightness. Which makes blooming far more visible. If it done at much more reasonable brightness levels, in a room with ambient light, then it wouldn't be very visible at all. No matter what the brightness level, black will always just be black. Increasing the brights however, causes excess bloom everywhere. I'm in the process of setting up a 600-LED lightpack as an immersive bias lighting mechanism to enhance the monitor even further.
> 
> I've watched some HDR Netflix movies on here and it's been more than acceptable. If you're coming from LCD panels to this monitor, you'll love it 100% . If you're coming from an OLED, you're going to be aware of the tradeoff. You can minimize it, but that absolutely magical self emitting pixel technology of OLED is really something else. Think of the monitor as being a step or two behind the QF9N, but with the added benefit of GSYNC/144Hz. If you're happy with that tradeoff, then you'll be happy with the monitor. But if you go into it expecting it to look like OLED, you'll be disappointed. And this is coming from someone who's taken the matte anti-glare film off the monitor as well. So I'm looking at a pure glossy display as I'm mentioning all of this.

I certainly do not expect OLED-level zonal control, but hearing people say the FALD implementation and blooming are worse than other, cheaper LCD TVs is disheartening, making the purchase of this monitor a very hard decision.

As for the Q9FN, from the reviews, it seems you see much less blooming because the FALD algorithm is way too aggressive and may lose shadow detail. Also, in the zone count video, though it exaggerates blooming, the blooming still looks pretty bad, and I can't fathom how the PG27UQ's FALD can be worse than this.


----------



## NewType88

lumbeechief said:


> What the fuc* is going on around here??? The Newegg release date changed from "Release Date: 7/13/2018" to "OUT OF STOCK. ETA: 8/9/2018" just now. I just want the actual USA release date of this damn monitor already, not preorder bullshi*! Hell, the monitor isn't even listed on Amazon nor fulfilled by them with prime shipping.


There are over 10 units at my Micro Center for $1,799.00. See if you can order one online from them.


----------



## HyperMatrix

deadchip12 said:


> I certainly do not expect oled level of zonal control, but hearing people say the fald implementation and blooming is worse than other cheaper lcd tvs is disheartening, making the purchase of this monitor a very hard decision.
> 
> As for the q9fn, from the reviews, it seems you see much less blooming because the fald algoritm is way too aggressive and may lead to losing shadow details. Also, in the zone count video, though exaggerating blooming, the blooming still looks pretty bad and I can't fathom how pg27uq fald can be worse than this


The FALD implementation is arguably worse than even some good Edge lit TV implementations to be honest with you. As for why the Samsung has an edge, it's partly because samsung uses an additional ultra black layer over top of the LCD. This reduces overall brightness a little, but increases contrast, and helps hide some low level backlight bloom. As for the picture you attached of the TV being completely lit up, that's a dishonest picture that was taken with increased exposure to basically highlight how the backlight is working, as opposed to how it looks to your eyes. Give me a nearly pitch black room and a tripod and I'll make the picture come out looking like it was taken in a light factory.


----------



## deadchip12

Glerox said:


> https://hothardware.com/reviews/asu...z-g-sync-monitor-review-true-hdr-gaming-at-4k
> 
> Edit: still waiting for a professional review...


I think this one is pretty thorough and technical?


----------



## NewType88

@HyperMatrix Hypothetically, would it be possible for a firmware update to improve the FALD's response? Or is it strictly a hardware issue?


----------



## acmilangr

NewType88 said:


> @HyperMatrix Hypothetically, would it be possible for a firmware update to improve falds response ? Or is it strictly a hardware issue ?


That is a nice question.


----------



## HyperMatrix

NewType88 said:


> @HyperMatrix Hypothetically, would it be possible for a firmware update to improve falds response ? Or is it strictly a hardware issue ?


Well, they can't patch in another darkening layer on top of the monitor similar to the Samsung Ultra Black panels. But as deadchip said, Samsung manages better black levels by being overly aggressive with the FALD. You may miss out on some details, but overall I think it's a better implementation. As for whether it can be improved through firmware, the answer would be maybe. If the question is whether the issue is an unresolvable hardware limitation, the answer would be no. It's definitely fixable. But whether we'll actually get the fix is another question, and I'd think that would be a tad more problematic. The reason I say that is that the firmware update they mentioned would be usable by everybody without any special hardware, presumably through the USB port. But the presence of a main programming port on the back of the monitor, which likely controls the majority of the FPGA software, casts doubt on the limits of what can be done through the USB port. Because if everything could be programmed through the USB port, why would that other port need to exist? Unless it's just a backup in case the software that controls the USB firmware update fails/bricks the monitor.


But to answer your question in a shorter manner, I'd have to say yes, in that it's hypothetically possible for them to update the FALD response algorithm, and no in terms of whether it's likely that they'd do it, as they're already moving on to development of the 32" 4K with Mini LED FALD. All the development of these monitors was done by AU Optronics and Nvidia, not ASUS/Acer. That's why they're pretty much identical in every single way minus some superficial things. So it would be up to one of the former to fix it. I don't think AU Optronics cares about developing the 384-zone backlight system any further. And I don't think Nvidia has the capability to deliver meaningful changes in any reasonable amount of time, considering how these monitors were delayed 2 years as it is (although the delay was likely due to a lack of appealing FPGA price/performance at the time).

Update: I've attached 2 pictures taken with my phone to see if I can show what I mean about the FALD problem. They're both taken at about the exact same place (less than an inch away). Same exposure and ISO setting. Only thing different is whether the cursor is directly on a FALD zone, or in between 2. When it's in between 2, it outputs a far lesser amount of light, allowing the cursor to still be lit properly, without blooming all over the place.


----------



## deadchip12

HyperMatrix said:


> NewType88 said:
> 
> 
> 
> @HyperMatrix Hypothetically, would it be possible for a firmware update to improve falds response ? Or is it strictly a hardware issue ?
> 
> 
> 
> Well they can't patch in another darkening layer on top of the monitor similar to the Samsung Ultra Black panels. But as deadchip said, Samsung manages better black levels by being overly aggressive with the FALD. You may miss out on some details, but overall I think it's a better implementation. As for whether it can be improved through firmware, the answer would be a maybe. If the question is whether the issue is an unresolvable hardware limitation, the answer would be no. It's definitely fixable. But whether we'll be able to fix it or not is another question. And I'd think that would be a tad more problematic. The reason I say that is that the firmware update that they mentioned would be usable by everybody without any special hardware. Presumably through the USB port. But the main programming port on the back of the monitor, which likely controls the majority of the FPGA software, would cast doubts as to the limits of what can be done through the USB port. Because if everything could be programmed through the USB port, why would that port need to exist? Unless it's just as a backup in case the software that controls the USB firmware update fails/bricks the monitor.
> 
> 
> But to answer your question in a shorter manner, I'd have to say yes in that it's hypothetically possible for them to update the FALD response algorithm. And no in terms of whether it's likely that they'd do it as they're already moving on to development of the 32" 4K with MiniLED FALD. All the development of these monitors was done by AU Optronics and Nvidia. Not ASUS/Acer. That's why they're pretty much identical in every single way minus some superficial things. So it would be up to one of the former to fix it. I don't think AU Optronics cares about developing the 384 zone backlight system any further. And I don't think Nvidia has the capability to deliver meaningful changes in any reasonable amount of time, considering how these monitors were delayed 2 years as is (although, the delay was likely due to lack of appealing FPGA price/performance metrics at the time)
> 
> Update: I've attached 2 pictures taken with my phone to see if I can show what I mean about the FALD problem. They're both taken at about the exact same place (less than an inch away). Same exposure and ISO setting. Only thing different is whether the cursor is directly on a FALD zone, or in between 2. When it's in between 2, it outputs a far lesser amount of light, allowing the cursor to still be lit properly, without blooming all over the place.

Lol, that glossy screen, man. Very hard to see the blooming. Please take the picture in a pitch-black room.


----------



## Sichtwechsel86

deadchip12 said:


> Lol that glossy screen man. Very hard to see the blooming. Pls take the picture in pitch black room


Are you still looking for pictures that demonstrate blooming/haloing due to FALD?

Even the best picture will not give you the impression you would have with your own eyes, because due to physiological differences everyone will see it a bit differently...

Go check it out for yourself.
Looking at 1,000 pictures of different haloing situations won't get you any further!


----------



## NewType88

@HyperMatrix Nice response, thanks. I noticed the zones light up when putting the mouse at the edge of the display, even in a well-lit room. I was hoping it could be smoothed out with a firmware update. Also, if it wasn't for the announcement of the Mini LED that 'should' be coming out this time next year, I would have bought one (X27). I just can't stomach spending 2k for something that will be significantly cheaper when the newer models come out, potentially just a year away - not to mention it's already being discounted at Micro Center. The agony of first world problems! lol.


----------



## HyperMatrix

deadchip12 said:


> Lol that glossy screen man. Very hard to see the blooming. Pls take the picture in pitch black room


I'll have to wait until midnight. I've set up my rig in my bonus room. So there's plenty of light. And the monitor is literally like a mirror.


----------



## deadchip12

Sichtwechsel86 said:


> deadchip12 said:
> 
> 
> 
> Lol that glossy screen man. Very hard to see the blooming. Pls take the picture in pitch black room
> 
> 
> 
> are you still looking out for pictures that demonstrate blooming/haloing due to FALD?
> 
> even the best picture will not give you the impression you would have with your own eyes - because due to physiological differences everyone will see it a bit different...
> 
> go - check it out for yourself -
> it won't get you any further to look at 1000 pictures of different haloing-situations!
> 

I'll check it out myself... when my country imports the damn monitor. It's nowhere to be found here lol.

Meanwhile I'm just impatient and want to collect as many of people's impressions as possible. It may be like you say, though, that everyone will see it a bit differently... some say blooming is rare, some say it's all over the place... some say it matches OLED, some say it's trashier than an edge-lit TV... so many contradicting opinions.


----------



## acmilangr

https://www.asus.com/us/Monitors/ROG-SWIFT-PG27UQ/

They have added the "ULTRA HD PREMIUM" logo.


----------



## Vegtro

Morkai said:


> *Warframe supports HDR; is free, and very well optimized so it's easy to hit 144fps (and also a really good game).


Warframe doesn't support true HDR. The game uses fake HDR as a graphical enhancement.


----------



## acmilangr

So:
Forza Motorsport 7 doesn't work in HDR. It has washed-out colors when HDR is activated.

Injustice 2: exactly the same.

Assassin's Creed Origins doesn't work either. Very bad colors and very dark.

Far Cry 5: too bright everywhere overall. Not good.

Hitman crashes when I activate HDR.

Only Mass Effect Andromeda works perfectly!

What about your tests?


----------



## Morkai

deadchip12 said:


> I don't see any blooming in these pictures. What am I supposed to look at? The colors & luminance will not represent real life at all since the device you use to capture this picture most likely does not support HDR


Found a proper image host. Comments:

https://lensdump.com/i/8bLuge
This scene looks subjectively pretty amazing, imo. The pictures, as I mentioned, are taken with HDR enabled and are of course a bit degraded, but not too far off from how it really looks.

While it subjectively looks amazing, technically it is a total failure due to blooming. As you say yourself, "I don't see any blooming in these pictures." Check the bars and window separators that make up most of the center view. Block out the bright light with your fingers and you'll see that blooming from the bright light turns every bar/separator bright gray (aka IPS black...). Your brain will tell you that they are black due to the contrasting light source, until you separate it.
It is not noticeable in normal use, however, and the same goes for 99.9% of all content. It has a weakness for pinpoint light sources on dark backgrounds (starry skies, a black car with headlights, sparks flying all over in the LG OLED demo, a mouse cursor on black) - in all other situations the illusion of bright-on-dark contrast makes you not notice.
The HDR implementation on this monitor really is great overall, but it just physically can't display those things well due to zone size.

https://lensdump.com/i/8bLNDx
Here, it handles everything perfectly, and it just looks great.

https://lensdump.com/i/8bLD3k
The mouse pointer is in the bottom middle, near the border between light and dark. Does not create any visible haloing, again due to the strong contrast of the nearby lightsource.


"That's a pretty bold claim, since Lim's Cave reviewed the Acer X27 and he said these monitors have more blooming and worse hdr image than Sony XE93, and that tv is mid-range edge lit with 60 zones and supposed to be inferior to X940E & Z9D. I hope you're right though. I really don't want to go for a TV unless the image quality is much better than monitor. "
It's probably just a matter of what he is used to. There is no chance a 60-zone VA is better. A VA with 384 zones would probably be better at hiding haloing, but you'd lose the IPS perks, and this monitor does look really bright, crisp, and sharp because of them. I think at an equal number of zones it would come down to VA vs. IPS preference. No current TV, including OLED, comes close in quality for fast-moving gaming content due to slow refresh and the lack of G-Sync/VRR.


"Do you see any color banding in games or movies? This monitor is 8bit+FRC so it should have more banding compared to OLED which is true 10 bit?"
I downloaded a 10bit banding test from avsforums (spears_munsil_quantization), and no visible banding. Haven't noticed any in real world scenarios either.

"Please test the chess demo: 



 and see if the blooming is as bad as shown in the pic below"
There is noticeable blooming when the sparks start flying at 1:10 or so, but the rest is pretty much perfect. It is a worst-case scenario though. Note that he chose an LG OLED demo movie designed to bring out OLED strengths; it rarely happens in real content. Almost all the things he put it through, including the haloing background images, were worst-case scenarios.
This video, https://www.youtube.com/watch?v=LXb3EKWsInQ , is more of a best-case scenario and looks stunning from start to finish; it really brings out the IPS quantum-dot colors and high-brightness strengths. There is a sunset about 1:50 in that looks amazing; I had to rewatch it several times. The logo in the corner halos in that scene though, no way around it. I think in this video it could score equal to an OLED in a blind test, or even win, depending on lighting conditions.


----------



## deadchip12

Morkai said:


> Found a proper host. comments:
> 
> https://lensdump.com/i/8bLuge
> This scene looks subjectively pretty amazing, imo. The pictures, as I mentioned, are taken with HDR enabled and are of course a bit degraded, but not too far off how it really looks like.
> 
> While it subjectively looks amazing, technically it's a total failure due to blooming. As you say yourself, "I don't see any blooming in these pictures." Check the bars and window separators that make up most of the center view. Block out the bright light with your fingers and you'll see that blooming from the bright light turns every bar/separator bright gray (aka IPS-black..  ) Your brain will tell you they are black because of the contrasting light source, until you separate them.
> It is not noticeable in normal use however, and the same goes for 99.9% of all content. It has a weakness for pinpoint light sources on dark backgrounds (starry skies, a black car with headlights, sparks flying all over in the LG OLED demo, the mouse cursor on black) - in all other situations the illusion of bright-on-dark contrast keeps you from noticing.
> The HDR implementation on this monitor really is great overall, but it just physically can't display those things well due to zone size.
> 
> https://lensdump.com/i/8bLNDx
> Here, it handles everything perfectly, and it just looks great.
> 
> https://lensdump.com/i/8bLD3k
> The mouse pointer is in the bottom middle, near the border between light and dark. Does not create any visible haloing, again due to the strong contrast of the nearby lightsource.
> 
> 
> "That's a pretty bold claim, since Lim's Cave reviewed the Acer X27 and he said these monitors have more blooming and worse hdr image than Sony XE93, and that tv is mid-range edge lit with 60 zones and supposed to be inferior to X940E & Z9D. I hope you're right though. I really don't want to go for a TV unless the image quality is much better than monitor. "
> It's probably just a matter of what he is used to. There is no chance a 60-zone VA is better. A VA with 384 zones would probably be better at hiding haloing, but you'd lose the IPS perks, and this monitor does look really bright, crisp, and sharp because of them. I think at an equal number of zones it would come down to VA vs. IPS preference. No current TV, including OLED, comes close in quality for fast-moving gaming content due to slow refresh and the lack of G-Sync/VRR.
> 
> 
> "Do you see any color banding in games or movies? This monitor is 8bit+FRC so it should have more banding compared to OLED which is true 10 bit?"
> I downloaded a 10bit banding test from avsforums (spears_munsil_quantization), and no visible banding. Haven't noticed any in real world scenarios either.
> 
> "Please test the chess demo: https://www.youtube.com/watch?v=2RIDhA9c8qw and see if the blooming is as bad as shown in the pic below"
> There is noticeable blooming when the sparks start flying at 1:10 or so, but the rest is pretty much perfect. It is a worst-case scenario though. Note that he chose an LG OLED demo movie designed to bring out OLED strengths; it rarely happens in real content. Almost all the things he put it through, including the haloing background images, were worst-case scenarios.
> This video, https://www.youtube.com/watch?v=LXb3EKWsInQ , is more of a best-case scenario and looks stunning from start to finish; it really brings out the IPS quantum-dot colors and high-brightness strengths. There is a sunset about 1:50 in that looks amazing; I had to rewatch it several times. The logo in the corner halos in that scene though, no way around it. I think in this video it could score equal to an OLED in a blind test, or even win, depending on lighting conditions.


Thanks for taking the time to reply in such detail.

As for the window scene, are you sure the window bars and separators are supposed to remain pitch black? Maybe they're supposed to be a bit grey in real life (due to strong light bleeding through? Idk, that's how I imagine it would look in real life). Did you test the same scene on an OLED?


----------



## Babryn25

Third time's the charm! Finally a screen with no problems. I quickly tried Destiny 2 in HDR at 120Hz, RGB 4:4:4, 8bpc, and got bad black crush. I thought that was a problem with 4:2:2 only... What's the deal here?
PS: 144Hz is not enabled.


----------



## HyperMatrix

acmilangr said:


> So
> Forza Motorsport 7 doesn't work in HDR; it has washed-out colors when HDR is activated.
> 
> Injustice 2 is exactly the same.
> 
> Assassin's Creed Origins doesn't work either. Very bad colors and very dark.
> 
> Far Cry 5 is too bright everywhere overall. Not good.
> 
> Hitman crashes when I activate HDR.
> 
> Only Mass Effect Andromeda works perfectly!
> 
> What about your tests?


You must be doing something wrong. Injustice 2 is dark, yeah, due to a lack of proper in-game HDR adjustment settings. Most have a paper white calibration setting, but Injustice 2 doesn't. So you have to set up a secondary gaming profile on the monitor itself and up the paper white brightness setting in there. But everything else runs perfectly without any adjustments outside of the HDR settings in the game. 

See pics here:

https://imgur.com/a/jlQLrLS


Taken with an iPhone, so the pics can look a bit more contrasty than the monitor actually does. But as you can see, nothing is washed out.
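As a side note on why paper white needs its own setting: HDR10 encodes absolute luminance via the SMPTE ST 2084 (PQ) curve, so ~100-nit "SDR white" sits near the middle of the signal range rather than at the top. A quick sketch of the inverse EOTF, using the standard constants from the spec (my own illustration, nothing monitor-specific):

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF: maps absolute
# luminance in nits to a normalized HDR10 signal value in [0, 1].
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def nits_to_pq(nits):
    """Map luminance (cd/m^2) to a normalized PQ signal value."""
    y = nits / 10000.0            # PQ is defined up to 10,000 nits
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

print(round(nits_to_pq(100), 3))   # 0.508 -> typical paper/SDR white
print(round(nits_to_pq(1000), 3))  # 0.753 -> this panel's peak
```

So raising a "paper white" setting just shifts where the desktop/UI white lands on that curve, without touching the 1000-nit highlights.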


----------



## BoredErica

So...
Is there a general vibe as to whether the monitor is good or not? Good QC or not? How does it compare to the infamous 27-inch IPS 144Hz G-Sync monitors in that department?

I mostly play Bethesda games, so most of the time I don't go much above 100fps. G-Sync would be nice though, for the parts of Oblivion where the FPS dips. OLED TVs are just too problematic. They're way too large, I don't know how I'd fit one on my computer desk and in my room, and they lack G-Sync. And there are the recent rumblings about burn-in. Yeah, burn-in again, when it seemed like it was a non-issue. (For example, the Rtings burn-in test.) Most of my time at my PC is spent on the desktop with the taskbar showing. It might sit there like that for many hours a day.

I might just buy one of those infamous 1440p 27-inch monitors. Maybe bad QC, but then again, maybe my super old Catleap has poor QC and I just can't see it for some reason. Otherwise I'll be stuck on DVI-D for like half a decade.


----------



## Morkai

deadchip12 said:


> Thanks for taking the time to reply in such detail.
> 
> As for the window scene, are you sure the window bars and separators are supposed to remain pitch black? Maybe they're supposed to be a bit grey in real life (due to strong light bleeding through? Idk, that's how I imagine it would look in real life). Did you test the same scene on an OLED?


They are dark grey on an OLED, but if you zoom in on the photo you'll see that the blooming makes them inconsistent shades of grey. Around the brightest light they're extremely bright grey due to haloing; in areas with dimmer light they're dark grey.


----------



## profundido

acmilangr said:


> So
> Forza Motorsport 7 doesn't work in HDR; it has washed-out colors when HDR is activated.
> 
> Injustice 2 is exactly the same.
> 
> Assassin's Creed Origins doesn't work either. Very bad colors and very dark.
> 
> Far Cry 5 is too bright everywhere overall. Not good.
> 
> Hitman crashes when I activate HDR.
> 
> Only Mass Effect Andromeda works perfectly!
> 
> What about your tests?


Hey,

when I read this, I really feel your monitor & software are not properly configured. After tuning, I found Assassin's Creed Origins looks amazing. HDR in Hitman wasn't a problem for me either and looked equally amazing. Your other games I don't have, so I can't test them.


----------



## acmilangr

HyperMatrix said:


> You must be doing something wrong. Injustice 2 is dark, yeah, due to a lack of proper in-game HDR adjustment settings. Most have a paper white calibration setting, but Injustice 2 doesn't. So you have to set up a secondary gaming profile on the monitor itself and up the paper white brightness setting in there. But everything else runs perfectly without any adjustments outside of the HDR settings in the game.
> 
> See pics here:
> 
> https://imgur.com/a/jlQLrLS
> 
> 
> Taken with an iPhone so pics can look a bit more contrasty than what the monitor actually looks like. But as you can see nothing is washed out.


Hello, and thanks for the response.

What do you mean, I have to set up a secondary gaming profile on the monitor itself?

Just for the record, I did a clean Windows installation, but my problem still exists.

HDR movies work perfectly, but in most games the colors are washed out. I can see that HDR is enabled in the OSD and the brightness is good (I can easily tell that it is high), but the colors are bad.


----------



## deadchip12

Another review of PG27UQ: https://www.trustedreviews.com/reviews/asus-pg27uq

Contrast in SDR mode with local dimming on rivals a good VA panel's, which I guess is acceptable for an IPS panel. No contrast figure for HDR, shame.

It seems there is a bit of misinformation: *"it should be noted that this display can’t deliver true 10-bit colour HDR at above 98Hz, but instead drops to an 8-bit with dithering version of HDR for higher frame rates."* The display is always 8-bit with dithering; what they actually mean here is chroma subsampling.

Still waiting for TFTCentral & PCMonitors reviews.
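For anyone curious, the tradeoff the review garbled is plain DisplayPort 1.4 bandwidth math. A back-of-envelope sketch (my own numbers, ignoring blanking overhead, so treat them as rough; the ~25.92 Gbit/s figure is the standard HBR3 payload after 8b/10b encoding):

```python
# Rough check of which 4K modes fit in a DP 1.4 (HBR3) link.
DP14_PAYLOAD_GBPS = 25.92  # ~usable payload after 8b/10b encoding

def required_gbps(width, height, hz, bits_per_channel, subsampling="444"):
    """Approximate uncompressed video bandwidth, ignoring blanking."""
    # 4:4:4/RGB = 3 full channels; 4:2:2 averages 2 channels per pixel
    channels = {"444": 3.0, "422": 2.0, "420": 1.5}[subsampling]
    return width * height * hz * bits_per_channel * channels / 1e9

# 4K 144Hz 10-bit RGB clearly overshoots the link:
print(required_gbps(3840, 2160, 144, 10))        # ~35.8 Gbit/s -> doesn't fit
# 4K 120Hz 8-bit RGB squeezes in:
print(required_gbps(3840, 2160, 120, 8))         # ~23.9 Gbit/s -> fits
# 4K 144Hz 10-bit 4:2:2 also fits, hence the subsampling at 144Hz:
print(required_gbps(3840, 2160, 144, 10, "422")) # ~23.9 Gbit/s -> fits
```

Which is exactly why the monitor offers 98Hz at full 4:4:4/10-bit, then either drops bit depth or subsamples above that.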


----------



## acmilangr

profundido said:


> Hey,
> 
> when I read this, I really feel your monitor & software are not properly configured. After tuning, I found Assassin's Creed Origins looks amazing. HDR in Hitman wasn't a problem for me either and looked equally amazing. Your other games I don't have, so I can't test them.


So what am I doing wrong?


----------



## HyperMatrix

acmilangr said:


> Hello, and thanks for the response.
> 
> What do you mean, I have to set up a secondary gaming profile on the monitor itself?
> 
> Just for the record, I did a clean Windows installation, but my problem still exists.
> 
> HDR movies work perfectly, but in most games the colors are washed out. I can see that HDR is enabled in the OSD and the brightness is good (I can easily tell that it is high), but the colors are bad.


I have the monitor set to YCbCr 444 8-bit at 120Hz through Nvidia control panel. HDR/WCG set to On in windows settings. And everything works great. Not using digital vibrance or any saturation adjustments. Just the in-game settings. On the monitor I have the contrast set to 53, and the white point brightness at 70 for all games (except injustice). Nothing fancy.


----------



## HyperMatrix

deadchip12 said:


> Another review of PG27UQ: https://www.trustedreviews.com/reviews/asus-pg27uq
> 
> Contrast in SDR mode with local dimming on rivals good VA panel, which I guess is acceptable for an IPS panel. No contrast figure for HDR, shame.
> 
> Seems there are a bit of misinformation: *"it should be noted that this display can’t deliver true 10-bit colour HDR at above 98Hz, but instead drops to an 8-bit with dithering version of HDR for higher frame rates."* The display is always 8-bit with dithering, what they want to mention here is chroma subsampling
> 
> Still waiting for TFTCentral & PCMonitors reviews.


What do you expect from an idiot who puts this under the "Cons" list for a monitor:

"4K resolution too demanding"


----------



## kot0005

What have HDR and 10-bit got to do with higher frame rates, lol? ….


----------



## kot0005

acmilangr said:


> So what am I doing wrong?


Do you have a Titan V?


----------



## acmilangr

kot0005 said:


> DO you have a Titan V ?


No. 1080ti


----------



## acmilangr

HyperMatrix said:


> I have the monitor set to YCbCr 444 8-bit at 120Hz through Nvidia control panel. HDR/WCG set to On in windows settings. And everything works great. Not using digital vibrance or any saturation adjustments. Just the in-game settings. On the monitor I have the contrast set to 53, and the white point brightness at 70 for all games (except injustice). Nothing fancy.


I will try these settings, even though I think I have tried all combinations.

In the Nvidia panel, if I choose 4:4:4 or 4:2:2 it only offers the "limited" option; the "full" option is only available when I choose "RGB". Is that normal?


----------



## deadchip12

lol, does anyone remember that besides Asus and Acer, AOC is also making a similar monitor, i.e. the AGON AG273UG? Where is it?


----------



## profundido

acmilangr said:


> So What wrong am i doing?


It's impossible to pinpoint the exact problem without all the data on your settings first, so I'm just going to tell you my settings and the things that pop into my head, for you to check off your mental list:

- Windows 10 latest major build with all the latest updates, and the latest Nvidia driver

- I use 120Hz RGB 8-bit, HDR ON in Windows

- In the monitor OSD, make sure you use the correct FALD mode for the type of content ('gaming' for dark and fast-changing environments on an X27, for example)

- In the monitor OSD, set max nits to 80-100, not more

- In the monitor OSD, set user color to custom. The existing profiles are off (and too bright for me), at least on the X27 with the original firmware, so I had to adjust to make white actually white and not yellow.

- All Nvidia settings at default as a reference point to start your testing

- In Assassin's Creed Origins' advanced video settings, make sure you configure the maximum brightness reference point to exactly 1000 nits, not less, not more, because the game calculates everything else off of it. If this is wrong, the whole game looks like garbage. Then bring out a torch and swap between FALD modes in the monitor OSD. The reason: there is a bug where, if you are in the wrong FALD mode, it gets 'stuck' and makes the torch flame look like garbage. Switching FALD modes makes it 'unstuck'.

Let us know how it goes, please.


----------



## kot0005

acmilangr said:


> No. 1080ti


Hmm, no idea then. Can you take comparison photos?

Did you uninstall the drivers with DDU before installing the new ones? What about the firmware update?

I haven't tried Far Cry 5 or Hitman, but SWBF1 > Destiny 2 > BF1 > AC Origins in terms of HDR.

Origins doesn't look washed out to me. On PS4 Pro games the difference is huge, and it only uses 4:2:2 or 4:2:0, not sure which.


----------



## kot0005

acmilangr said:


> I will try these settings, even though I think I have tried all combinations.
> 
> In the Nvidia panel, if I choose 4:4:4 or 4:2:2 it only offers the "limited" option; the "full" option is only available when I choose "RGB". Is that normal?



Why are you even using the Nvidia color settings? That could be your issue. Turn off HDR in Windows and use the default color settings in the Nvidia Control Panel. Only turn HDR on for media.


----------



## deadchip12

Local dimming test of the PG27UQ. There's a lot of noise in the video, so the ISO was probably cranked up high, which exaggerates the blooming for the zone count. The transition from zone to zone is surprisingly obvious; normally when I watch zone-count videos for FALD TVs, the blooming around a bright object moves smoothly, but here a zone is lit, then off, then lit. Not sure which implementation is better:

https://youtu.be/p8GjbIZ0Tvk


----------



## Malinkadink

deadchip12 said:


> Local dimming test of the PG27UQ. There's a lot of noise in the video, so the ISO was probably cranked up high, which exaggerates the blooming for the zone count. The transition from zone to zone is surprisingly obvious; normally when I watch zone-count videos for FALD TVs, the blooming around a bright object moves smoothly, but here a zone is lit, then off, then lit. Not sure which implementation is better:
> 
> https://youtu.be/p8GjbIZ0Tvk


TVs are far better at local dimming than monitors, and for good reason: FALD has been around on TVs for much longer, so they have much better algorithms in place. I reckon Acer and Asus have far less experience with FALD displays, so they couldn't tune it to the same degree. Maybe if they doubled the zones their implementation would be as good as or better than TVs with far fewer zones, but 384 zones is already a lot for a 27" display. Current LCD LED tech just needs to die already; miniLED is going to be the next big thing, I'm guessing, until microLED is affordable and offered in smaller sizes. OLED just doesn't seem like a good panel tech for monitors even though it offers superior picture quality, but microLED should be able to match it, or get close enough not to matter, with no burn-in issues.
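To put rough numbers on the zone-size point (my own back-of-envelope estimate; Asus hasn't published the exact layout, and the 24x16 grid here is just the commonly assumed one):

```python
# Rough zone geometry for a 384-zone 4K FALD panel.
width_px, height_px = 3840, 2160
cols, rows = 24, 16                 # assumed grid: 24 * 16 = 384 zones
assert cols * rows == 384

zone_w = width_px / cols            # zone width in pixels
zone_h = height_px / rows           # zone height in pixels
pixels_per_zone = zone_w * zone_h   # pixels sharing one backlight level

print(zone_w, zone_h, pixels_per_zone)  # 160.0 135.0 21600.0
```

So a pinpoint highlight like a mouse cursor (maybe 20x30 px) drags roughly 21,600 surrounding pixels up with it, which is exactly the haloing described earlier in the thread.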


----------



## Fraizer

PG27UQ maximum refresh is 98Hz, not 120 or 144... and no HDR...

Hello

I spent so much money on this monitor and now so much time without results...

I just bought this monitor, but I don't know why I can't get 120Hz or 144Hz, even after overclocking in the monitor OSD...

- the maximum I can choose in 4K is 98Hz
- I don't see the HDR toggle in Windows where it can be switched on or off


I have an Nvidia Titan X connected with the DisplayPort cable provided with the monitor (I tried another cable, same result). I am using the default Windows settings in the Nvidia Control Panel; everything is at its default...
I uninstalled all Nvidia drivers and software in safe mode with the latest version of Display Driver Uninstaller. I am using Windows 10 x64 with the latest updates, and I have 8GB DDR3 on an Asus Maximus VII Gene with a 4790K CPU.

Is the monitor defective?

Thank you for your support.

My setup, if you need it:

Windows 10 Pro x64 (latest updates, clean system, no viruses)

- Asus Maximus VII Gene, Z97 chipset (latest BIOS and drivers from Asus)
- Intel 4790K
- 8GB DDR3 Kingston
- 2x Samsung 840 Pro 256GB SSDs in RAID 0
- EVGA Titan X (latest firmware 1.0 for DisplayPort 1.4; latest Nvidia drivers as of today; I uninstalled the old driver in Windows safe mode with Display Driver Uninstaller 17.0.9.0)
- Asus Phoebus sound card

I tried the monitor with another DisplayPort 1.4 cable (an expensive one) and got the same result... 98Hz max.

I'm attaching some screenshots.


----------



## kx11

Try the NVIDIA_DisplayPort_Firmware_Updater_1.0-x64 on your setup.



Also unplug and then re-plug your DP cable.


----------



## profundido

Fraizer said:


> PG27UQ maximum refresh is 98Hz, not 120 or 144... and no HDR...
> 
> Hello
> 
> I spent so much money on this monitor and now so much time without results...
> 
> I just bought this monitor, but I don't know why I can't get 120Hz or 144Hz, even after overclocking in the monitor OSD...
> 
> - the maximum I can choose in 4K is 98Hz
> - I don't see the HDR toggle in Windows where it can be switched on or off
> 
> 
> I have an Nvidia Titan X connected with the DisplayPort cable provided with the monitor (I tried another cable, same result). I am using the default Windows settings in the Nvidia Control Panel; everything is at its default...
> I uninstalled all Nvidia drivers and software in safe mode with the latest version of Display Driver Uninstaller. I am using Windows 10 x64 with the latest updates, and I have 8GB DDR3 on an Asus Maximus VII Gene with a 4790K CPU.
> 
> Is the monitor defective?
> 
> Thank you for your support.
> 
> My setup, if you need it:
> 
> Windows 10 Pro x64 (latest updates, clean system, no viruses)
> 
> - Asus Maximus VII Gene, Z97 chipset (latest BIOS and drivers from Asus)
> - Intel 4790K
> - 8GB DDR3 Kingston
> - 2x Samsung 840 Pro 256GB SSDs in RAID 0
> - EVGA Titan X (latest firmware 1.0 for DisplayPort 1.4; latest Nvidia drivers as of today; I uninstalled the old driver in Windows safe mode with Display Driver Uninstaller 17.0.9.0)
> - Asus Phoebus sound card
> 
> I tried the monitor with another DisplayPort 1.4 cable (an expensive one) and got the same result... 98Hz max.
> 
> I'm attaching some screenshots.



please try these things in this order:

- (Re)install the latest Nvidia drivers with the 'custom' (not express) installation option and select "remove all existing user settings" in order to clear any lingering conflicting settings.
- Select "utilisez les parametres de couleurs nvidia", then select RGB and 8-bit manually. Press 'Appliquer'. Now try to select 120Hz again (and then 'Appliquer' again).
- Verify that your card is running in x16 mode in the Nvidia Control Panel.
- Verify that you are not only running the latest Windows updates but also the latest major build (Start > Run > type "winver"). It should be build 1803.
- Run the Nvidia firmware update tool for your video card again to double-check that the firmware really cannot be updated any further (as suggested by the user above).
- Reset your BIOS to factory default settings and go through any options that might affect the PCIe slot the video card sits in, in case maximum bandwidth or functionality is being limited.
- Remove the Asus sound card from your computer and test again while that PCIe slot is not in use.

Let us know how it goes, please.


----------



## deadchip12

Guys, check out this video (4:18 and 4:23 marks): https://www.youtube.com/watch?v=Chc38IvnEjQ

How come the monitor shows true 10-bit at 98Hz but 8-bit+dithering at 120Hz? What is going on here?


----------



## Glerox

deadchip12 said:


> Guys, check out this video (4:18 and 4:23 mark): https://www.youtube.com/watch?v=Chc38IvnEjQ
> 
> How come the monitor shows true 10-bit at 98Hz while 8bit+dithering at 120hz? What is going on here?


What's wrong? That's how it's supposed to work.


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> > Guys, check out this video (4:18 and 4:23 marks): https://www.youtube.com/watch?v=Chc38IvnEjQ
> > 
> > How come the monitor shows true 10-bit at 98Hz but 8-bit+dithering at 120Hz? What is going on here?
> 
> 
> What's wrong? That's how it's supposed to work.

Huh? This monitor can only do 8-bit+FRC regardless of refresh rate.


----------



## Glerox

deadchip12 said:


> Huh? This monitor can only do 8bit+FRC regardless of refresh rate


It doesn't mean the monitor is displaying full 10-bit; it only means the signal sent by Windows is 10-bit and the dithering is done by the monitor.

When the signal is 8-bit, the dithering is done at the driver/OS level instead, so you can't see a difference between an 8-bit and a 10-bit signal in HDR on this monitor, and that's why there is no reason to run it at 98Hz, IMO.
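For anyone unclear on what 8-bit + FRC actually does, here's a toy sketch of temporal dithering (not the monitor's or driver's actual algorithm, just the general idea): a 10-bit level is approximated by flickering between the two nearest 8-bit levels so the time-average lands on the target.

```python
# Toy illustration of 8-bit + FRC (frame rate control) dithering.
def frc_frames(level10, n_frames=4):
    """Emit n 8-bit frame values whose average approximates a 10-bit level."""
    base, frac = divmod(level10, 4)  # 10-bit level = 8-bit level * 4 + remainder
    # 'frac' out of every 4 frames show the next 8-bit step up
    return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]

frames = frc_frames(514)            # a 10-bit level between 8-bit 128 and 129
print(frames)                       # [129, 129, 128, 128]
avg = sum(frames) / len(frames)
print(avg * 4)                      # 514.0 -> time-average hits the 10-bit target
```

Done fast enough, your eye integrates the flicker into the in-between shade, which is why 8-bit+FRC and a true 10-bit signal end up looking the same here.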


----------



## Fraizer

kx11 said:


> try the NVIDIA_DisplayPort_Firmware_Updater_1.0-x64 on your screen
> 
> 
> 
> also unplug then re-plug you DP cable


Hello,

I already applied that firmware, and I've unplugged and replugged both of my DP cables.


----------



## Fraizer

profundido said:


> please try these things in this order:
> 
> - (Re)install the latest Nvidia drivers with the 'custom' (not express) installation option and select "remove all existing user settings" in order to clear any lingering conflicting settings.
> - Select "utilisez les parametres de couleurs nvidia", then select RGB and 8-bit manually. Press 'Appliquer'. Now try to select 120Hz again (and then 'Appliquer' again).
> - Verify that your card is running in x16 mode in the Nvidia Control Panel.
> - Verify that you are not only running the latest Windows updates but also the latest major build (Start > Run > type "winver"). It should be build 1803.
> - Run the Nvidia firmware update tool for your video card again to double-check that the firmware really cannot be updated any further (as suggested by the user above).
> - Reset your BIOS to factory default settings and go through any options that might affect the PCIe slot the video card sits in, in case maximum bandwidth or functionality is being limited.
> - Remove the Asus sound card from your computer and test again while that PCIe slot is not in use.
> 
> Let us know how it goes, please.


Hello

- I did the first step. For the second, I don't see RGB in the list, only "RVB" (the French label), so I selected RVB and 8-bit, but I still don't see 120Hz or even 100Hz.
- To verify x16: in the motherboard BIOS I selected Gen 3, which I think means x16. I don't really see where to find this in the control panel, I guess under information, and as in the screenshot I attached to this post it shows Gen3 x16, if I'm not wrong.
- Yes, Windows 10 is on 1803 with the latest updates.
- Yes, I already double-checked that the Nvidia firmware for DP 1.4 is applied.

I did all the steps you mentioned; unfortunately, the result is the same :/


----------



## profundido

Fraizer said:


> Hello
> 
> - I did the first step. For the second, I don't see RGB in the list, only "RVB" (the French label), so I selected RVB and 8-bit, but I still don't see 120Hz or even 100Hz.
> - To verify x16: in the motherboard BIOS I selected Gen 3, which I think means x16. I don't really see where to find this in the control panel, I guess under information, and as in the screenshot I attached to this post it shows Gen3 x16, if I'm not wrong.
> - Yes, Windows 10 is on 1803 with the latest updates.
> - Yes, I already double-checked that the Nvidia firmware for DP 1.4 is applied.
> 
> I did all the steps you mentioned; unfortunately, the result is the same :/



Hmm, yes, it says x16; I see it now in your screenshot.

At this point I would use another (spare) system disk and do a clean install of the latest Windows build (US English version, not French!) and test that. I can't believe it's the monitor, since the monitor's OSD simply detects and shows whatever signal is coming in from the PC. You could also hook it up to a friend's PC to double-check.


----------



## Fraizer

You're very kind to help me, profundido.

Unfortunately I don't have another disk to do a new installation :/

Just to know: why not the French version? I think it's just a language pack now; with old Windows versions it wasn't a language pack, and in that case the system was slower and buggier than the English version. (Just to know ^^)

profundido, is the white color on this monitor normally like yellow? Or let's say a warm white? And text, for example under each icon, is not so clear; it looks blurry.

Apart from a new install of Windows, can I try something else (maybe something I have enabled in the motherboard BIOS?), because right now I just can't do a new install on this computer :/


EDIT: isn't a firmware update available? And how can I check my firmware version? I'm posting a screenshot where Asus shows a link to update the monitor firmware, but the link doesn't work.


----------



## kx11

Got it today. Kinda weird using it after the Samsung CGH27 curved monitor.



HDR10 looks good in FC5, but I had to set the nits to 120 in-game.


----------



## deadchip12

kx11 said:


> got it today , kinda weird using it after using Samsung CGH27 curved monitor
> 
> 
> 
> 
> HDR 10 looks good in FC5 but i had to put the nits to 120 in-game


Can you give some impressions? How's the blooming in dark games?


----------



## kx11

deadchip12 said:


> Can give some impressions? How's the blooming in dark games?





There's blooming for sure in SDR games like The Evil Within 2, and I tried FFXV in HDR in dark caves; there's blooming here and there too.



It doesn't bother me much (not yet, at least  )


----------



## bmgjet

I'm in love with my PG27UQ.
It has the same feel as when I went from 1080p 75Hz to 1440p 120Hz. I was told I wouldn't notice much difference, but side by side with my 1440p there's a good amount of difference. I'm just running the PG27UQ at 120Hz since it feels a bit nicer on the eyes when working with both screens on.
For me it was totally worth the $3999 it cost locally.


----------



## deadchip12

bmgjet said:


> I'm in love with my PG27UQ.
> It has the same feel as when I went from 1080p 75Hz to 1440p 120Hz. I was told I wouldn't notice much difference, but side by side with my 1440p there's a good amount of difference. I'm just running the PG27UQ at 120Hz since it feels a bit nicer on the eyes when working with both screens on.
> For me it was totally worth the $3999 it cost locally.


Holy ***** man, where do you live? How come it costs twice the US price?

How's HDR? Blooming? Some impressions please.


----------



## bmgjet

deadchip12 said:


> Holy ***** man where do you live? How come it costs twice of the US price?
> 
> How's HDR? Blooming? Some impressions pls



New Zealand.
You have to turn HDR off if the game doesn't support it or if you're using VLC.


----------



## profundido

Fraizer said:


> You're very kind to help me, profundido.
> 
> Unfortunately I don't have another disk to do a new installation :/
> 
> Just to know: why not the French version? I think it's just a language pack now; with old Windows versions it wasn't a language pack, and in that case the system was slower and buggier than the English version. (Just to know ^^)
> 
> profundido, is the white color on this monitor normally like yellow? Or let's say a warm white? And text, for example under each icon, is not so clear; it looks blurry.
> 
> Apart from a new install of Windows, can I try something else (maybe something I have enabled in the motherboard BIOS?), because right now I just can't do a new install on this computer :/
> 
> 
> EDIT: isn't a firmware update available? And how can I check my firmware version? I'm posting a screenshot where Asus shows a link to update the monitor firmware, but the link doesn't work.


it happens that there are differences between windows builds and language-specific builds. The whole HDR functionality in windows is brandnew and still kind of in beta with settings both for windows and the nvidia drivers being changed and added. Therefore you need to exclude the possibility that it's software related all together. The best way is make a 'reference' installation of which it is known to work so you can exclude area's where the problem is for sure not. Reference windows build is US-English latest build downloaded from Microsoft's ISO maker download tool or equivalent. Then update and install latest US-english nvidia drivers and test. Don't install a language pack until after your tests as it may break the working reference installation. It might even be the cause of your problem

alternatively take your screen up to your friend who has a 1080ti or equivalent and test it there.

the 'normal' setting is indeed kind yellow (to protect your eyes) but you can easily override by chooseing "user defined" and adjust the RGB colors to your personal liking. I did that too

A firmware update for these monitors does not exist yet, but they're working on it (confirmed) and it will become available later this year. Not soon, though.

The best way for now to rule out the software and your computer hardware is to take your screen to a friend or a computer shop and have it tested on another machine, to determine that it's not the monitor. Once you know that for sure, you can continue troubleshooting software or hardware until you get closer to the cause. (Consider buying a cheap extra system disk to use temporarily for testing new/different OS versions, or for troubleshooting in general.)


----------



## fleggy

Hello, I have a strange problem. If I set the desktop to 98Hz, 10-bit color, SDR, then all my old FHD and QHD resolution games (fullscreen mode) appear in a centered box regardless of the GPU/monitor scaling setting. If I set 98Hz, 8-bit color, SDR in the desktop settings, then FHD and QHD resolution games are scaled properly. I have no other 10-bit-capable monitor/TV to check my suspicion that the color depth is the culprit. Or maybe it is "a feature" of the PG27UQ.
In short: 10-bit color desktop + 8-bit color game at a lower resolution (fullscreen mode) -> image is not scaled to fullscreen. Could someone else reproduce it? Thanks

Windows 10 Pro 64bit 1803 build 17134.165
1080ti (DP firmware updated)
driver 398.36
games tested: ARMA 2, Witcher 3, Kingdom Come: Deliverance, etc...


----------



## profundido

fleggy said:


> Hello, I have a strange problem. If I set the desktop to 98Hz, 10-bit color, SDR, then all my old FHD and QHD resolution games (fullscreen mode) appear in a centered box regardless of the GPU/monitor scaling setting. If I set 98Hz, 8-bit color, SDR in the desktop settings, then FHD and QHD resolution games are scaled properly. I have no other 10-bit-capable monitor/TV to check my suspicion that the color depth is the culprit. Or maybe it is "a feature" of the PG27UQ.
> In short: 10-bit color desktop + 8-bit color game at a lower resolution (fullscreen mode) -> image is not scaled to fullscreen. Could someone else reproduce it? Thanks
> 
> Windows 10 Pro 64bit 1803 build 17134.165
> 1080ti (DP firmware updated)
> driver 398.36
> games tested: ARMA 2, Witcher 3, Kingdom Come: Deliverance, etc...



Hey, I noticed that too when Starcraft (a very old game) was still set to a lower-than-4K resolution. Since it didn't scale automatically (using factory default settings), I assumed it wasn't worth investigating at the time. I solved it by setting the correct 4K resolution in the game, which is the best option of course, but I get that some of your older games may simply not support 4K, so you are forced to fall back on scaling. I haven't investigated this (yet), but I can tell you where to look in order to solve it. You need to try different settings in two different places and see how the combination of them behaves:

1. in the monitor OSD: "aspect ratio" vs "no scaling", or maybe even more options

2. In the nvidia control panel 

http://i.imgur.com/Z74V3qc.png

Logically, a combination of options named something like "fullscreen" or "stretching/scaling" should be what you are looking for.

Finally, we must consider the possibility that no combination will make the monitor scale, and that you might have to wait for the new firmware later this year or for future Nvidia driver updates. In that case this feature would indeed seem 'broken' for now.


----------



## fleggy

I think I tried all combinations. There is no scaling problem when I set an 8-bit desktop; a 10-bit desktop somehow disables image scaling.
Anyway, I am glad that you had the same issue.


----------



## deadchip12

HDR reduces framerate? https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/


----------



## kx11

Just tested Planet Earth 4K HDR using MPC-HC with madVR. Once I open the file the monitor turns HDR on, which is awesome. Amazing PQ.

Took this shot using my phone with HDR photo mode on.


----------



## Fraizer

profundido said:


> Hey, I noticed that too when Starcraft (a very old game) was still set to a lower-than-4K resolution. Since it didn't scale automatically (using factory default settings), I assumed it wasn't worth investigating at the time. I solved it by setting the correct 4K resolution in the game, which is the best option of course, but I get that some of your older games may simply not support 4K, so you are forced to fall back on scaling. I haven't investigated this (yet), but I can tell you where to look in order to solve it. You need to try different settings in two different places and see how the combination of them behaves:
> 
> 1. in the monitor OSD: "aspect ratio" vs "no scaling", or maybe even more options
> 
> 2. In the nvidia control panel
> 
> http://i.imgur.com/Z74V3qc.png
> 
> Logically, a combination of options named something like "fullscreen" or "stretching/scaling" should be what you are looking for.
> 
> Finally, we must consider the possibility that no combination will make the monitor scale, and that you might have to wait for the new firmware later this year or for future Nvidia driver updates. In that case this feature would indeed seem 'broken' for now.



Thank you!

Unfortunately I am the only gamer ^^ and the shops in Paris don't accept to do that when you didn't buy the monitor from them... :/


----------



## Fraizer

Oh, beautiful!!

Can you share your complete settings please?





kx11 said:


> Just tested Planet Earth 4K HDR using MPC-HC with madVR. Once I open the file the monitor turns HDR on, which is awesome. Amazing PQ.
> 
> Took this shot using my phone with HDR photo mode on


----------



## kx11

Fraizer said:


> Oh, beautiful!!
> 
> Can you share your complete settings please?





sure


----------



## bmgjet

deadchip12 said:


> HDR reduces framerate? https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/


I can confirm that there is a 15fps drop with HDR on in BF1 with 1080 Tis.
It looks so nice with it on, but I just can't take that FPS hit, since it drops me into 80fps territory under heavy action, and I didn't build this rig to drop quality settings.


Decided to give the 144Hz mode a try. Is it meant to only show as 143Hz?
The GPUs never drop out of the high-performance state with it on and end up with a 1560MHz idle clock (102W idle). At 120Hz they at least dropped back to 866MHz (63W idle).

Also, everywhere I've read it's meant to drop back to 8-bit mode. Mine just always says 10-bit when I have my DVI screen's overclock on.
Maybe a pixel clock thing, since I'm running a 480MHz OC pixel clock for DVI-D to work at 120Hz.


----------



## deadchip12

bmgjet said:


> deadchip12 said:
> 
> 
> 
> HDR reduces framerate? https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/
> 
> 
> 
> I can confirm that there is a 15fps drop with HDR on in BF1 with 1080 Tis.
> It looks so nice with it on, but I just can't take that FPS hit, since it drops me into 80fps territory under heavy action, and I didn't build this rig to drop quality settings.
Click to expand...

***. HDR is supposed to have a negligible impact on performance.

Maybe there are some issues with the Nvidia driver, since based on the chart AMD is not impacted. Is your GPU driver the latest?


----------



## bmgjet

deadchip12 said:


> ***. HDR is supposed to have a negligible impact on performance.
> 
> Maybe there are some issues with the Nvidia driver, since based on the chart AMD is not impacted. Is your GPU driver the latest?


Yup, have the latest.


----------



## CallsignVega

Their tests show FPS loss not because of HDR, but because they are using reduced chroma (4:2:2) in their HDR tests and RGB in their SDR test. Reducing the chroma causes the performance loss (overhead), not HDR.
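For context on why reviewers pair HDR with 4:2:2 at all: halving the chroma samples cuts the required link rate by a third, which is what makes 4K HDR at high refresh fit through the cable. A back-of-the-envelope sketch in Python (illustrative arithmetic only; it ignores blanking intervals and link encoding overhead):

```python
# Per-second pixel data for RGB/YCbCr 4:4:4 vs YCbCr 4:2:2 at 4K, 10 bpc, 144Hz.
# Illustrative only: real DisplayPort timings add blanking and encoding overhead.
width, height, refresh, bpc = 3840, 2160, 144, 10

bits_per_pixel_444 = 3 * bpc  # three full-resolution channels per pixel
bits_per_pixel_422 = 2 * bpc  # chroma sampled at half the horizontal rate

gbps_444 = width * height * refresh * bits_per_pixel_444 / 1e9
gbps_422 = width * height * refresh * bits_per_pixel_422 / 1e9
print(f"4:4:4 10-bit 144Hz: {gbps_444:.1f} Gbit/s")  # ~35.8
print(f"4:2:2 10-bit 144Hz: {gbps_422:.1f} Gbit/s")  # ~23.9
```

The point above stands either way: in those tests the fps hit comes from the extra conversion work the GPU does for subsampled output, not from HDR itself.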


----------



## deadchip12

CallsignVega said:


> Their tests show FPS loss not because of HDR, but because they are using reduced chroma (4:2:2) in their HDR tests and RGB in their SDR test. Reducing the chroma causes the performance loss (overhead), not HDR.


Hmm why does reduced chroma cause fps loss?


----------



## Morkai

CallsignVega said:


> Their tests show FPS loss not because of HDR, but because they are using reduced chroma (4:2:2) in their HDR tests and RGB in their SDR test. Reducing the chroma causes the performance loss (overhead), not HDR.


I suspect they render in full 4:4:4 and then apply a lossless version of the same encoding Shadowplay uses, instead of rendering directly in 4:2:2.
Wouldn't that be perfectly in line with the performance loss numbers you got in benchmarks (5-7%)?


----------



## hnizdo

deadchip12 said:


> HDR reduces framerate? https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/


[email protected] RGB444, Zotac 1080 (+150core - 1936 MHz stable, +400ram)
AC: Origins (4k, max settings, no AA): HDR ON: 42fps, HDR off: 43fps
Destiny2: (4k, max settings, no blur, no grain, no AA): HDR ON: 50fps, HDR off: 55fps
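Turning those two data points into relative numbers (just the arithmetic on the fps values quoted above; two games is far too small a sample to generalize from):

```python
# Percentage fps cost of enabling HDR, from the results above.
results = {
    "AC: Origins": (42, 43),  # (HDR on, HDR off) in fps
    "Destiny 2": (50, 55),
}
for game, (fps_hdr_on, fps_hdr_off) in results.items():
    drop_pct = (fps_hdr_off - fps_hdr_on) / fps_hdr_off * 100
    print(f"{game}: {drop_pct:.1f}% drop with HDR on")  # 2.3% and 9.1%
```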


----------



## kx11

I wonder if you guys feel the colors are a little blown out in app icons. The red in the Firefox/Chrome icons is so strong it looks blown out, like MS Paint red.


----------



## Glerox

Does somebody have a how-to guide on removing the AG filter? I'm thinking more and more about removing it. I asked Blackvette but no answer yet.


----------



## KGPrime

https://hardforum.com/threads/guide...e-ag-coating-from-a-dell-u2312hm-lcd.1674033/

Same basic thing, though tearing it down will surely be different. Make sure to get a large basket for them big ass balls if you are doing it for the first time on a 2k dollar monitor. Maybe try it on a test subject first, non working or cheap lcd.


----------



## Glerox

KGPrime said:


> https://hardforum.com/threads/guide...e-ag-coating-from-a-dell-u2312hm-lcd.1674033/
> 
> Same basic thing, though tearing it down will surely be different. Make sure to get a large basket for them big ass balls if you are doing it for the first time on a 2k dollar monitor. Maybe try it on a test subject first, non working or cheap lcd.


lol!! thanks it will take a basket for sure.


----------



## Bloodmosher

Bloodmosher said:


> Well, mine went back today. I'm going to try another unit. The white consistency problem on the desktop was driving me mad. Top half of email/web pages were white, lower half noticeably yellow. For 2K I should be able to read email as well as play HDR games.


While I'm waiting for the next PG27UQ I decided to try the Predator X27. So far some interesting results:
1. It has the same issue with white consistency on the desktop as the PG27UQ, but it is a little less noticeable. To see this, take an app like chrome with a view of something like a slashdot comment thread or excel with an empty workbook or outlook with rows of email subjects and make it half the height of the screen. Drag it from the top to the bottom of the monitor. As you go toward the bottom notice that the white becomes more yellow-ish as you enter the lower half of the panel.
2. I found that I was able to more closely match the desktop white balance and brightness of my other two monitors (two PG27AQs) than the PG27UQ was. However I still need to enable HDR to achieve the same white brightness as the other two monitors. The current Chrome beta solves the issue of Chrome not working in HDR so I'm happy with running HDR on the desktop.
3. The joystick is in a better position for me than the Asus; it is closer to the bottom and since I have a triple monitor display, I need to reach from the bottom to make adjustments.
4. I think I prefer the ambient lighting on the Acer... this isn't a big selling point for me on either but...
5. It was MUCH simpler to remove the mount/arm and put it on a custom mount than the PG27UQ - much simpler to remove the plastic.
6. Both have some issues that I occasionally notice on the desktop; flashlight effect, light bleed showing yellow on the edge of windows, etc. 

So far I am inclined to stick with the X27 but I intend to try one more Asus before deciding.


----------



## acmilangr

Glerox said:


> Does somebody has a how-to guide on how to remove the AG filter? I'm thinking more and more about removing it. I asked Blackvette but no answers yet.


I did, but I don't recommend it. The polarizer on this monitor is more sensitive.

After removing the anti-glare coating you can NEVER clean the screen, so a protective film needs to be installed.

But the monitor is much better as glossy: better blacks, clearer and more vivid colors.


----------



## deadchip12

Guys, this Chinese site says if we buy the PG27UQ before Aug 8th, we get a free PS4 Pro?
https://ccc.technews.tw/2018/07/23/asus-rog-swift-pg27uq-ps4-pro-for-free/


----------



## kot0005

hnizdo said:


> [email protected] RGB444, Zotac 1080 (+150core - 1936 MHz stable, +400ram)
> AC: Origins (4k, max settings, no AA): HDR ON: 42fps, HDR off: 43fps
> Destiny2: (4k, max settings, no blur, no grain, no AA): HDR ON: 50fps, HDR off: 55fps


He was using SLI: two 1080 Tis. I have one 1080 Ti and the performance drop with HDR off/on is negligible, 1-5fps. Within the margin of error.


----------



## deadchip12

Seems like microLED TVs are rolling out next year. Fck! TV tech moves so fast. And to think that a mini-LED monitor is not even near us yet.  https://www.zdnet.com/article/samsung-to-launch-30-millimetre-thick-luxury-microled-tv-next-year/


----------



## Malinkadink

deadchip12 said:


> Seems like microLED TVs are rolling out next year. Fck! TV tech moves so fast. And to think that miniled monitor is not even near us yet.  https://www.zdnet.com/article/samsung-to-launch-30-millimetre-thick-luxury-microled-tv-next-year/


microLED > miniLED, in case that part isn't clear. It's actually really silly that we have microATX cases that are larger than miniITX cases, even though micro is a smaller unit than mini.

Anyway, it really is sickening how much faster TVs move along compared to monitors, but I guess it's because the majority of sales are indeed TVs, whereas monitors make up a very small share of overall sales volume.

Because of this, a TV is usually a better value than a monitor at a similar price. It's the same thing with cars: high-volume cars are sold at a cheaper price, while for a low-volume vehicle you'll generally pay more.


----------



## acmilangr

Just so you know, a small monitor is much more difficult to manufacture than a TV.

Everything must become smaller: pixels, FALD zones, everything.


----------



## Ford8484

I said this in the X27 thread as well: anyone new to this monitor, check out Ni no Kuni 2 if you can find a good deal on it, even if you're not into JRPGs. The HDR in this game is simply incredible.


----------



## kx11

Ford8484 said:


> I said this in the X27 thread as well: anyone new to this monitor, check out Ni no Kuni 2 if you can find a good deal on it, even if you're not into JRPGs. The HDR in this game is simply incredible.



It is incredible when the magic attack effects are on, truly amazing.

The first boss battle in the sewers is amazing too.


----------



## kot0005

Linus says it's true 10-bit? He also says that it doesn't use dithering.

I feel like I am living in an illusion now..


----------



## CallsignVega

deadchip12 said:


> Seems like microLED TVs are rolling out next year. Fck! TV tech moves so fast. And to think that miniled monitor is not even near us yet.  https://www.zdnet.com/article/samsung-to-launch-30-millimetre-thick-luxury-microled-tv-next-year/


LOL. There is a reason it is a ridiculous 146 inches. That's how small they can make the LEDs at this time. And for six figures...

microLED TVs are 5+ years away, maybe 8+ years to be affordable at consumer prices. microLED at computer monitor sizes and resolutions may NEVER happen.



kot0005 said:


> Linus says it's true 10-bit? He also says that it doesn't use dithering.
> 
> I feel like I am living in an illusion now..


LOL, are you getting your specs from Linus?? The panel is 10-bit dithered, which means 8-bit + FRC. That has been known for quite some time. It's even on the ASUS spec sheet.


----------



## deadchip12

CallsignVega said:


> LOL, are you getting your specs from Linus?? The panel is 10-bit dithered, which means 8-bit + FRC. That has been known for quite some time. It's even on the ASUS spec sheet.


Actually, on the Asus spec site it says *Display Colors : 1.07b (10 bit with dithering)*. Shouldn't it be *Display Colors : 1.07b (8 bit with dithering)*?


----------



## CallsignVega

No, because 10-bit with dithering means it is 8-bit + FRC, just as 8-bit with dithering would mean 6-bit + FRC.

http://dlcdnet.asus.com/pub/ASUS/LCD Monitors/PG27V/ROGSwift_PG27U_English.pdf

Page 3-8, 16.7 million colors = 8-bit.
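The color counts on that spec sheet fall straight out of the bits-per-channel arithmetic; a quick sketch:

```python
# Total displayable colors for a given bit depth per channel (R, G, B).
def total_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"{total_colors(8):,}")   # 16,777,216 -> the "16.7 million colors" (native 8-bit)
print(f"{total_colors(10):,}")  # 1,073,741,824 -> the "1.07b" figure (10-bit via 8-bit + FRC)
```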


----------



## kx11

One of the things I like about this monitor is that when I play a game at a resolution lower than 4K, it doesn't look blurry at all.


----------



## kot0005

CallsignVega said:


> LOL. There is a reason it is a ridiculous 146 inches. That's how small they can make the LEDs at this time. And for six figures...
> 
> microLED TVs are 5+ years away, maybe 8+ years to be affordable at consumer prices. microLED at computer monitor sizes and resolutions may NEVER happen.
> 
> 
> 
> LOL, are you getting your specs from Linus?? The panel is 10-bit dithered, which means 8-bit + FRC. That has been known for quite some time. It's even on the ASUS spec sheet.


That TV is gonna cost a lot, probably as much as a really good sports car. I am surprised that they are even mass-producing these. These customers are gonna be beta testing them, probably with dead microLEDs?


----------



## animeowns

Started playing with my ROG PG27UQ. I am happy to report I have no dead pixels, with variable backlight on or off, and no backlight bleed or haloing. As for the fan in the monitor, my PC is louder; I don't hear the fan at all. My model does have DisplayPort sleep turned on by default. I am keeping my panel and getting an extended warranty on it, 5+ years. I got very lucky.


----------



## acmilangr

animeowns said:


> Started playing with my ROG PG27UQ. I am happy to report I have no dead pixels, with variable backlight on or off, and no backlight bleed or haloing. As for the fan in the monitor, my PC is louder; I don't hear the fan at all. My model does have DisplayPort sleep turned on by default. I am keeping my panel and getting an extended warranty on it, 5+ years. I got very lucky.


There is no way you don't have haloing.


----------



## Ford8484

kx11 said:


> One of the things I like about this monitor is that when I play a game at a resolution lower than 4K, it doesn't look blurry at all.


Yeah, I noticed that too. The upscaling works really well on this monitor; there are some pros to having "only" a 27-inch 4K screen. It makes the monitor more versatile in some ways.


----------



## kx11

Does this monitor have DP 1.4? If yes, why can't it run true 10-bit HDR?!


----------



## acmilangr

kx11 said:


> Does this monitor have DP 1.4? If yes, why can't it run true 10-bit HDR?!


It is a panel limitation, not a bandwidth limitation.


----------



## Malinkadink

acmilangr said:


> It is a panel limitation, not a bandwidth limitation.


While the panel is indeed 8-bit + FRC, it is also bandwidth-limited, which is why it cannot run 10-bit HDR above 98Hz if you want 4:4:4. 4:2:2 at 120Hz 10-bit works, but 144Hz does not, as they didn't implement DSC on this monitor. Second-gen monitors should be better, but the price they're charging for these is really off-putting. I like the mango monitor being sold for $1300-ish, but it's a risky purchase, plus 43 inches is too big for me; 32 inches would be nice. Ideally a 4K 144Hz 32-inch monitor for $1,000 would be really nice, but we won't see that price and spec, at least with G-Sync, for a looooooong time.
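A sketch of the bandwidth arithmetic behind that 98Hz ceiling (pixel-rate only; real DisplayPort timings add blanking overhead, so treat the cutoffs as approximate). DP 1.4 over HBR3 carries 32.4 Gbit/s raw, about 25.92 Gbit/s of payload after 8b/10b encoding:

```python
# Which 4K 4:4:4 10-bit modes fit in DP 1.4 without DSC (approximate).
DP14_PAYLOAD_GBPS = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b encoding overhead
WIDTH, HEIGHT = 3840, 2160

def required_gbps(refresh_hz: int, bpc: int, channels_per_pixel: int = 3) -> float:
    return WIDTH * HEIGHT * refresh_hz * channels_per_pixel * bpc / 1e9

for hz in (98, 120, 144):
    need = required_gbps(hz, bpc=10)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"4:4:4 10-bit @ {hz}Hz: {need:.1f} Gbit/s -> {verdict}")
# Only the 98Hz mode fits, matching the monitor's behavior.
```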


----------



## kx11

BTW, my unit's fan isn't loud or noisy enough to be noticeable.


----------



## animeowns

acmilangr said:


> There is no way you don't have haloing.


I don't. Maybe I got lucky; I got my model from Micro Center online for $1870.


----------



## deadchip12

animeowns said:


> I don't. Maybe I got lucky; I got my model from Micro Center online for $1870.


Haloing is inherent to local dimming tech. You will have it, but it's probably subtle enough that you don't notice it, which is a good thing. Maybe try the LG Chess Demo in a pitch-black room and see if you can spot the haloing.
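For a sense of why haloing is unavoidable at this zone count: 384 zones behind 8.3 million pixels means thousands of pixels share each backlight level. A quick sketch, assuming the zones form a uniform 24x16 grid (the actual layout isn't published, so the grid shape here is a guess that happens to fit 384 and the panel's proportions):

```python
# Approximate FALD zone size on a 3840x2160 panel with 384 dimming zones,
# assuming (hypothetically) a uniform 24x16 grid of zones.
WIDTH, HEIGHT = 3840, 2160
ZONES_X, ZONES_Y = 24, 16
assert ZONES_X * ZONES_Y == 384

zone_w = WIDTH // ZONES_X   # 160 px
zone_h = HEIGHT // ZONES_Y  # 135 px
print(f"each zone ~{zone_w}x{zone_h} px "
      f"({zone_w * zone_h:,} pixels per backlight zone)")  # 21,600 pixels
```

So a single bright cursor on black forces a roughly 160x135-pixel region of backlight on, which is exactly the halo a chess-piece demo makes visible.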


----------



## acmilangr

TFT Central got the monitor for review, and it has the latest firmware update.


----------



## Ford8484

acmilangr said:


> TFT Central got the monitor for review, and it has the latest firmware update.


I wonder why the Asus monitor gets more attention overall. There are a ton of reviews for it and not the Acer... better marketing perhaps? They're identical except for the build and aesthetics. Personally I think the Acer looks better because of the smaller bezels... to each their own though.


----------



## animeowns

deadchip12 said:


> Haloing is inherent to local dimming tech. You will have it, but it's probably subtle enough that you don't notice it, which is a good thing. Maybe try the LG Chess Demo in a pitch-black room and see if you can spot the haloing.


My initial testing was done at night in a pitch-black room, but I didn't try the chess demo. It's bubba from the ROG forums BTW, deadchip. I'll give it a go tonight.


----------



## animeowns

Ford8484 said:


> I wonder why the Asus monitor gets more attention overall. There are a ton of reviews for it and not the Acer... better marketing perhaps? They're identical except for the build and aesthetics. Personally I think the Acer looks better because of the smaller bezels... to each their own though.


It's possibly because it has the VESA certification and the Acer model doesn't.


----------



## Glerox

Ford8484 said:


> I wonder why the Asus monitor gets more attention overall. There are a ton of reviews for it and not the Acer... better marketing perhaps? They're identical except for the build and aesthetics. Personally I think the Acer looks better because of the smaller bezels... to each their own though.


Yup the ROG marketing is way better. They announced it a lot more. Plus the certifications.


----------



## Ford8484

animeowns said:


> It's possibly because it has the VESA certification and the Acer model doesn't.


Yep: marketing. It's an identical panel with the same picture quality. https://displayhdr.org/certified-products/


----------



## drfouad

RGB galore... please, enough of the RGB [email protected]# with everything..
Pretty soon we will have RGB lights on PSUs, mouse cords, etc.


----------



## KGPrime

drfouad said:


> RGB galore... please, enough of the RGB [email protected]# with everything..
> Pretty soon we will have RGB lights on PSUs, mouse cords, etc.


waddya mean pretty soon...


----------



## bmgjet

Ford8484 said:


> I wonder why the Asus monitor gets more attention overall. There are a ton of reviews for it and not the Acer... better marketing perhaps? They're identical except for the build and aesthetics. Personally I think the Acer looks better because of the smaller bezels... to each their own though.


The Acer isn't even released in my country yet. Still 2 months away, and an extra $500 on top of the Asus's $3999 price tag.


----------



## kot0005

I was really looking forward to Monster Hunter: World. Apparently it can only do 45fps at 1440p with a GTX 1070.. that sounds so bad.


----------



## deadchip12

kot0005 said:


> I was really looking forward to Monster Hunter: World. Apparently it can only do 45fps at 1440p with a GTX 1070.. that sounds so bad.


Not a 1070. It's below 60fps at 1440p with a GTX 1080, I think.

The game seems to be terribly optimized, and it has Denuvo on top, making the problem worse.


----------



## kx11

kot0005 said:


> I was really looking forward to Monster Hunter: World. Apparently it can only do 45fps at 1440p with a GTX 1070.. that sounds so bad.



Yeah, games running slow on a non-game-ready driver is a bad thing.


----------



## Ford8484

The game doesn't even look that good... it looks OK, but that framerate with a 1080 is crazy. I wonder how the 1080 Ti will do with 4K... probably just run it at high settings and maybe downscale the resolution a bit... at least the framerate is uncapped.


----------



## Malinkadink

The fact that the game is even being released on PC is a good thing. Performance fixes will come; I'm just glad a proper MH game on PC is here.


----------



## fleggy

Hello everybody, now I have another problem: the variable backlight is somehow broken after leaving certain games. There are areas with a cyan tint on the desktop, and they change as the mouse (or another object) moves. E.g. MGS V: The Phantom Pain always invokes this issue; just load your saved position and return to the desktop. The game and desktop are set to 4K. No HDR, just SDR 8-bit colors. The only solution (AFAIK) is to unplug/replug the unit (the cable, not the power button). Could someone try it, please? Thanks

EDIT: I've just noticed that when the unit is in this strange state I cannot turn HDR on in Display Settings (the switch is present). And the unit is limited to 82Hz (only 24, 30, 60 and 82Hz available in the Nvidia CP). Obviously bugged firmware.


----------



## animeowns

kot0005 said:


> I was really looking forward to Monster Hunter: World. Apparently it can only do 45fps at 1440p with a GTX 1070.. that sounds so bad.


Done gaming until the new video cards (GTX 11 series) release.


----------



## kot0005

kx11 said:


> Yeah, games running slow on a non-game-ready driver is a bad thing.


You don't even know if Nvidia will be releasing game-ready drivers for this game. This game should be running at 100+ fps at 1080p on a GTX 1070.


But right now it's only doing 60fps. I'll wait for the magical 40fps increase with drivers, I guess.

Did anyone try Shadow of War? It has HDR. It works great on the PS4 Pro.


----------



## kx11

kot0005 said:


> You don't even know if Nvidia will be releasing game-ready drivers for this game. This game should be running at 100+ fps at 1080p on a GTX 1070.
> 
> 
> But right now it's only doing 60fps. I'll wait for the magical 40fps increase with drivers, I guess.
> 
> Did anyone try Shadow of War? It has HDR. It works great on the PS4 Pro.





I'm willing to bet they will release game-ready drivers for this game.


EDIT: I tried SOW in HDR and it looks very good.


----------



## deadchip12

Guys, I promised to do a review of the PG27UQ versus my OLED C7, but there's a problem. The PG27UQ is about to arrive in my country at a price tag of $3000. *3 THOUSAND FREAKING DOLLARS!!!!* To put this in perspective, that's a 50% increase over the price in the US and more than twice the price I paid for the OLED C7.

As much as I want to own this god-tier 384-zone FALD 4K 144Hz G-Sync HDR monitor, sadly there's a limit to how far I'm willing to open my wallet. Maybe I would pay $3000 for a no-compromise Dolby Vision microLED monitor, but for this monitor $2000 is already a stretch. I think I'm going to have to settle for a TV, though I really don't want to, due to input lag (I'm a PC gamer), tearing, and the big TV size hurting my head.

Any recommendations for a 55-inch 4K HDR TV that costs less than $2000, ideally $1500? I have had my eyes on the Sony X930E for a long time (the price is 1400 USD here), but the measly 60 dimming zones worry me a bit. I don't want to game on OLED because of burn-in.


----------



## Fanu

deadchip12 said:


> Any recommendations for a 55-inch 4K HDR TV that costs less than $2000, ideally $1500? I have had my eyes on the Sony X930E for a long time (the price is 1400 USD here), but the measly 60 dimming zones worry me a bit. I don't want to game on OLED because of burn-in.


A famous quote applies in this case:
"the technology just isn't there yet"

Gaming monitors suck due to image quality issues.
TVs suck for fast gaming due to bandwidth restrictions + no adaptive sync (yeah, it's coming..).

You either wait and see what gets released on the market over the next year or more,
or buy whatever is available now and compromise on certain things.

If I were you and had money to spend, I would just buy a nice 34 or 38" ultrawide with a >60Hz refresh rate and not think about what could have been. You'll still get an immersive gaming experience at decent picture quality instead of paying through your ass for HDR that is a ***** to set up.


----------



## profundido

fleggy said:


> Hello everybody, now I have another problem: the variable backlight is somehow broken after leaving certain games. There are areas with a cyan tint on the desktop, and they change as the mouse (or another object) moves. E.g. MGS V: The Phantom Pain always invokes this issue; just load your saved position and return to the desktop. The game and desktop are set to 4K. No HDR, just SDR 8-bit colors. The only solution (AFAIK) is to unplug/replug the unit (the cable, not the power button). Could someone try it, please? Thanks
> 
> EDIT: I've just noticed that when the unit is in this strange state I cannot turn HDR on in Display Settings (the switch is present). And the unit is limited to 82Hz (only 24, 30, 60 and 82Hz available in the Nvidia CP). Obviously bugged firmware.


I had it too. Whenever you get into this buggy state, switch to a different FALD mode in the OSD of your monitor to fix it.


----------



## fleggy

profundido said:


> I had it too. Whenever you get into this buggy state, switch to a different FALD mode in the OSD of your monitor to fix it.


Unfortunately, as soon as I switch FALD back to Fast, the problem returns. Next time I'll try toggling the overclock (120/144); I think it could help.
BTW, what card do you have?


----------



## kot0005

I just can't believe deadchip12 is still on this thread asking (trolling??) more questions, and about TVs, in a thread that's specifically about the PG27UQ..


----------



## kot0005

fleggy said:


> Unfortunately, as soon as I switch FALD back to Fast, the problem returns. Next time I'll try toggling the overclock (120/144); I think it could help.
> BTW, what card do you have?


Is this issue at 120 or 144Hz?


----------



## profundido

fleggy said:


> Unfortunately, as soon as I switch FALD back to Fast, the problem returns. Next time I'll try toggling the overclock (120/144); I think it could help.
> BTW, what card do you have?


Titan Xp here (latest firmware). Nvidia Control Panel set to 8-bit RGB, 120Hz.


----------



## fleggy

1080ti, 8-bit in nvcp, SDR, 120Hz


----------



## pez

deadchip12 said:


> Guys, I promised to do a review of the PG27UQ versus my OLED C7, but there's a problem. The PG27UQ is about to arrive in my country at a price tag of $3000. *3 THOUSAND FREAKING DOLLARS!!!!* To put this in perspective, that's a 50% increase over the price in the US and more than twice the price I paid for the OLED C7.
> 
> As much as I want to own this god-tier 384-zone FALD 4K 144Hz G-Sync HDR monitor, sadly there's a limit to how far I'm willing to open my wallet. Maybe I would pay $3000 for a no-compromise Dolby Vision microLED monitor, but for this monitor $2000 is already a stretch. I think I'm going to have to settle for a TV, though I really don't want to, due to input lag (I'm a PC gamer), tearing, and the big TV size hurting my head.
> 
> Any recommendations for a 55-inch 4K HDR TV that costs less than $2000, ideally $1500? I have had my eyes on the Sony X930E for a long time (the price is 1400 USD here), but the measly 60 dimming zones worry me a bit. I don't want to game on OLED because of burn-in.


I have the US version of that TV... it rates at a higher input latency than most Samsung models. It's not unbearable, and it's fine for me for console gaming, but if you're sensitive coming from your C7's input latency, the Sony might bother you too. Also, go for the X930F (or whatever the 2018 model is named in your locale): supposedly it has a better CPU for the TV's base OS (Android TV) and has Dolby support.

There are some Samsung models with FreeSync now, but... yeah... this is mostly a discussion for a different thread.


----------



## Morkai

deadchip12 said:


> Guys. I promised to do a review of the PG27UQ versus my OLED C7 but there's a problem. The PG27UQ is about to arrive at my country at a price tag of $3000. *3 THOUSAND FREAKING DOLLARS!!!!* To put this in perspective, it's 50% increase compared to the price in the US and more than twice the price I paid for the OLED C7.


The USA lists prices without tax, and your country most likely lists them including tax?
But yes, for some reason it costs about 25% more in the EU than the currency conversion suggests it should, and no shop seems to drop the price (meaning the Asus/Acer distributors probably just set a higher price for the EU for whatever reason).
I paid $3200 (minus a roughly 4% discount plus some cashback points in Norway - deducting the points and discount it was about $2850).


----------



## deadchip12

Morkai said:


> deadchip12 said:
> 
> 
> 
> Guys. I promised to do a review of the PG27UQ versus my OLED C7 but there's a problem. The PG27UQ is about to arrive at my country at a price tag of $3000. *3 THOUSAND FREAKING DOLLARS!!!!* To put this in perspective, it's 50% increase compared to the price in the US and more than twice the price I paid for the OLED C7.
> 
> 
> 
> The USA lists prices without tax, and your country most likely lists them including tax?
> But yes, for some reason it costs about 25% more in the EU than the currency conversion suggests it should, and no shop seems to drop the price (meaning the Asus/Acer distributors probably just set a higher price for the EU for whatever reason).
> I paid $3200 (minus a roughly 4% discount plus some cashback points in Norway - deducting the points and discount it was about $2850).
Click to expand...

Yes, that includes customs tax and retail markup. Normally it's only around 25% higher than the US price, but for this monitor that suddenly jumps to 50%, and only a very limited quantity is available.

Meanwhile, TVs here are normally the same price as in the US, or even cheaper. My OLED C7 cost $1200, probably because they are clearing warehouses for the next model. That drop in price never happens to high-end monitors and PC components in general.


----------



## kx11

delete


----------



## deadchip12

Sorry can someone tell me what is the max brightness in SDR mode with FALD turned on?


----------



## l88bastar

deadchip12 said:


> Sorry can someone tell me what is the max brightness in SDR mode with FALD turned on?


300


----------



## deadchip12

l88bastar said:


> 300


Thanks. How are you enjoying the monitor so far? I'm waiting for the TFT Central review to decide whether to finally pull the trigger and empty my bank account.


----------



## Bloodmosher

Bloodmosher said:


> While I'm waiting for the next PG27UQ I decided to try the Predator X27. So far some interesting results:
> 1. It has the same issue with white consistency on the desktop as the PG27UQ, but it is a little less noticeable. To see this, take an app like chrome with a view of something like a slashdot comment thread or excel with an empty workbook or outlook with rows of email subjects and make it half the height of the screen. Drag it from the top to the bottom of the monitor. As you go toward the bottom notice that the white becomes more yellow-ish as you enter the lower half of the panel.
> 2. I found that I was able to more closely match the desktop white balance and brightness of my other two monitors (two PG27AQs) than the PG27UQ was. However I still need to enable HDR to achieve the same white brightness as the other two monitors. The current Chrome beta solves the issue of Chrome not working in HDR so I'm happy with running HDR on the desktop.
> 3. The joystick is in a better position for me than the Asus; it is closer to the bottom and since I have a triple monitor display, I need to reach from the bottom to make adjustments.
> 4. I think I prefer the ambient lighting on the Acer... this isn't a big selling point for me on either but...
> 5. It was MUCH simpler to remove the mount/arm and put it on a custom mount than the PG27UQ - much simpler to remove the plastic.
> 6. Both have some issues that I occasionally notice on the desktop; flashlight effect, light bleed showing yellow on the edge of windows, etc.
> 
> So far I am inclined to stick with the X27 but I intend to try one more Asus before deciding.


Wow, what a difference! I received my 2nd PG27UQ today. No issues with white color consistency on the desktop, and I can easily match the white of my PG27AQ in SDR. So far so good, and it looks like I'll be returning the X27 in favor of this one.


----------



## l88bastar

deadchip12 said:


> Thanks. How do you enjoy the monitor so far? I'm waiting for tftcentral review to decide whether to finally pull the trigger and empty my bank account


----------



## profundido

deadchip12 said:


> Thanks. How do you enjoy the monitor so far? I'm waiting for tftcentral review to decide whether to finally pull the trigger and empty my bank account


Very happy here with my X27 so far. Love the style and the stand, and have zero issues with anything. White can be matched perfectly to my previous Asus PG27AQ. I also use this monitor as a TV to view Netflix and the like. I tried such movie content on my previous IPS monitor again and noticed I suddenly couldn't stand the glow on the black parts and letterbox bands anymore. FALD really is an improvement for me in movie content as well. The only thing I do want is a better future graphics card for 4K games, but hey, the GTX 1180 is coming soon.


----------



## stefxyz

Quick question: can these monitors with FALD be calibrated with an X-Rite Pro like every other monitor?


----------



## deadchip12

stefxyz said:


> Quick question. Can these monitors with FALD be calibrated with an xrite pro like every other monitor?


I thought these monitors are pre-calibrated by Asus before shipping out?


----------



## stefxyz

Even the best monitors change color over time, which is why you have to recalibrate a monitor from time to time. For very critical tasks people calibrate every 2 weeks. Mostly it's enough 3 to 4 times a year.


----------



## deadchip12

stefxyz said:


> Even the best monitors change color over time, which is why you have to recalibrate a monitor from time to time. For very critical tasks people calibrate every 2 weeks. Mostly it's enough 3 to 4 times a year.


Hmm, I see. Do you think periodic calibration is really necessary if all I do is gaming and watching movies? I honestly don't want to spend time or money calibrating or anything.


----------



## Ford8484

deadchip12 said:


> Thanks. How do you enjoy the monitor so far? I'm waiting for tftcentral review to decide whether to finally pull the trigger and empty my bank account


I have the X27 and it's fantastic. Don't be put off by the "small" size either - 27 inches is more than adequate. There's virtually no IPS glow when FALD is enabled - the difference from previous panels is pretty huge there. Honestly the only issue - if you want to call it that - is the lack of GPU power for certain games...but obviously that will come with time. Also, unless you're super OCD with it or only play at OLED contrast levels - the bloom is there - BUT, it's negligible. I only really noticed it in RE7 - but the trade-off for the HDR in that game is worth it.


----------



## Malinkadink

deadchip12 said:


> Hmm, I see. Do you think periodic calibration is really necessary if all I do is gaming and watching movies? I honestly don't want to spend time or money calibrating or anything.


Most people who buy a TV and get it calibrated keep it that way for as long as they own it, which is at least a few years. The amount of drift is pretty minimal even over longer periods of time, especially on an LCD, so if you're just using a display for entertainment then it's fine to calibrate once and be done with it. You'll also probably be running a brightness level that is suited for a dark room, and lower brightness levels will prolong the monitor's life and not affect a calibration as much over time either.


----------



## deadchip12

Ford8484 said:


> I have the X27 and it's fantastic. Don't be put off by the "small" size either - 27 inches is more than adequate. There's virtually no IPS glow when FALD is enabled - the difference from previous panels is pretty huge there. Honestly the only issue - if you want to call it that - is the lack of GPU power for certain games...but obviously that will come with time. Also, unless you're super OCD with it or only play at OLED contrast levels - the bloom is there - BUT, it's negligible. I only really noticed it in RE7 - but the trade-off for the HDR in that game is worth it.


I am definitely not put off by the 27-inch size. I love gaming on a monitor way more than on a big TV. But hearing some people say 384-zone IPS is inferior to VA with far fewer zones is worrisome, as this monitor will cost me $3000 instead of just <$1500 for a 60-zone VA X930E or OLED C7. G-Sync, low input lag & 144Hz are nice extras, but as a non-competitive gamer I'm willing to sacrifice those for better image quality.


----------



## stefxyz

TFT Central review is up and I am checking my bank account....


----------



## deadchip12

The only review that matters is here folks: http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm

I think Asus intentionally hand-picked and sent TFT Central the best sample in their whole factory lol:

Low glow
No backlight bleed 
110% DCI-P3 (w.t.f?)

Contrast measured is a bit low at ~3600:1 in SDR mode with FALD on (lower than the native contrast of VA TVs) & ~30000:1 in HDR mode with FALD on (below the advertised 50,000:1).


----------



## Fanu

deadchip12 said:


> The only review that matters is here folks: http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm
> 
> I think Asus intentionally hand-picked and sent TFT Central the best sample in their whole factory lol:
> 
> Low glow
> No backlight bleed
> 110% DCI-P3 (w.t.f?)
> 
> Contrast measured is a bit low at ~3600:1 in SDR mode with FALD on (lower than the native contrast of VA TVs) & ~30000:1 in HDR mode with FALD on (below the advertised 50,000:1).


Hence why these monitor reviews mean little to nothing.

There are too many variances between same-model monitors - these reviews should ideally be done on like 10 units that were all bought from different stores,
and not on a single sample provided by the manufacturer that has probably been handpicked because it was a golden sample with almost no issues.


----------



## kx11

deadchip12 said:


> The only review that matters is here folks: http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm
> 
> I think Asus intentionally hand-picked and sent TFT Central the best sample in their whole factory lol:
> 
> Low glow
> No backlight bleed
> 110% DCI-P3 (w.t.f?)
> 
> Contrast measured is a bit low at ~3600:1 in SDR mode with FALD on (lower than the native contrast of VA TVs) & ~30000:1 in HDR mode with FALD on (below the advertised 50,000:1).





Mine doesn't have the backlight bleed problem. I don't understand nor care about the other stuff you mentioned; this monitor is awesome.


----------



## stefxyz

Is there a list of properly supported HDR games somewhere?


----------



## kx11

stefxyz said:


> Is there a list of properly supported HDR games somewhere?



Hopefully this one has all of them.


https://pcgamingwiki.com/wiki/Speci...Feature-2Fintro/outrotemplate=Feature-2Foutro


----------



## Glerox

Ladies and gentlemen, I present to you :

http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm

Have a good read!


----------



## acmilangr

Glerox said:


> Ladies and gentlemen, I present to you :
> 
> http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm
> 
> Have a good read!


Thanks, but someone already mentioned it.

Great review for a great monitor.


----------



## Glerox

Oops hehe.

I've read the whole review. There is one thing that differs from my experience with the monitor.

They say that peak SDR brightness is >500 nits. My unit is clearly less than 400 nits. My guess is that the firmware update boosts the SDR brightness, which would make me really happy!

I guess that because they say the default SDR brightness is set to 60, but we all know our units came with a default of 80.


----------



## deadchip12

Glerox said:


> Oups hehe.
> 
> I've read the whole review. There is one thing that differs from my experience with the monitor.
> 
> They say that peak SDR brightness is >500 nits. My unit is clearly less than 400 nits. My guess is that the firmware update boost the SDR brightness, which I would be really happy!
> 
> I guess that because they say the default SDR brightness is set to 60 but we all know our units came with a default of 80.


Yeah, that would be nice if the update boosts SDR brightness close to 600 nits. But does it matter, though? Using 600 nits in SDR mode would be painful to the eyes, no?


----------



## Glerox

600 yes, but I need 400 for daylight, and now it's less to my eyes.

Another thing I found interesting is the monitor mapping a desired 1000-nit HDR brightness to around 600 nits in their testing.
They needed to output a desired 2000 nits to get around 1000 nits on the monitor.

I wonder if this is also part of the firmware update, because in my testing with the VESA display app, I cannot see a difference once the desired output is 1000 nits or above (1000 nits or above seems to really give the maximum brightness of 1000 nits).


----------



## deadchip12

Glerox said:


> 600 yes but I need 400 for daylight and now it's less to my eyes.
> 
> Another I found interesting is the monitor mapping the 1000nits HDR desired brightness to around 600 nits in their testing.
> They needed to output a desired 2000nits to get around 1000nits on the monitor.
> 
> I wonder if this is also part of the firmware update because in my testing with the VESA display app, I cannot see a difference once the desired output is 1000nits or above (1000nits or above seems to really give the maximum brightness of 1000nits).


Seems like you can increase the reference white setting so 1000-nit content will be shown at 1000 nits of brightness.

I wonder whether most of the HDR games and movies out there are mastered at 1000, 2000 or 4000 nits.


----------



## kx11

Glad to see Netflix 4k working


----------



## deadchip12

So should we increase the reference white in the OSD from 80 to the max? 1000-nit content will then be shown at 1000 nits properly instead of 600 nits, like what was mentioned in the TFT Central review.


----------



## acmilangr

deadchip12 said:


> So should we increase the reference white in the OSD from 80 to the max? 1000-nit content will then be shown at 1000 nits properly instead of 600 nits, like what was mentioned in the TFT Central review.


TFT Central recommends leaving it at the default of 80.


----------



## deadchip12

acmilangr said:


> deadchip12 said:
> 
> 
> 
> So should we increase the reference white in the OSD from 80 to the max? 1000-nit content will then be shown at 1000 nits properly instead of 600 nits, like what was mentioned in the TFT Central review.
> 
> 
> 
> TFT Central recommends leaving it at the default of 80.
Click to expand...

Yes, but I don't understand why, though. Keeping it at 80 will tone-map 1000-nit content to 600 nits of brightness, leading to lost detail. They say increasing it to 168 will make the monitor output 1000 nits for 1000-nit content, with the downside of content below 600 nits getting brighter than it should, but there is no HDR content mastered below 600 nits, so this is not an issue, right? Could someone explain?


----------



## Malinkadink

So have there been any announcements for displays to expect in 2019 that are 4K 144Hz? I know 32" 4K is coming late this year, but it'll be the same as these monitors, just bigger, and I have my issues with these displays - mainly the bandwidth constraints putting a limit on them - and I'm not cool with paying $2k just to be limited by the bandwidth of DP 1.4.

DP 1.5 won't be here next year, or even 2020 I don't think. HDMI 2.1 can start appearing in displays in 2019 if the manufacturers want it to, but I don't see Nvidia developing a G-Sync module to work with HDMI 2.1 so we can make use of 4K 144Hz HDR 4:4:4 with no compromise. Basically it looks like there won't be a display that I will actually be willing to buy until 2020 at the earliest. Pretty crappy circumstances, but I'm damn happy with my OLED C7 for multimedia use, and only need to fall back on the S2417DG for competitive use. Still, I want to have my cake and eat it too, meaning I want a monitor that's actually a perfectly viable competitive display, while also looking really good with high contrast etc.


----------



## Glerox

deadchip12 said:


> Yes but I don't understand why though. Keep it at 80 will tone map 1000 nits content to 600 nits brightness, leading to lost details. They say increase it to 168 will make the monitor output 1000 nits for 1000 contents with the downside of < 600 nits contents getting brighter than they should, but there is no hdr content mastered at < 600 nits so this is not an issue right? Could someone explain?


HDR content is mastered in absolute nits, from 0 up to whatever maximum nits you set in the in-game settings. So you definitely don't want to lose the details of intended dark areas below 600 nits, and that's why you should stick to the default of 80.

To my understanding, if you want the monitor to display its maximum range of 0 to 1000 nits, you have to increase the in-game maximum HDR brightness setting (e.g. in BF1) to 2000 nits or above, according to the data on TFT Central. However, this might be true only for the upcoming firmware update, because I feel my unit really displays 1000 nits when a 1000-nit signal is given in the VESA HDR display app.

They probably lowered the brightness in the upcoming firmware because some games are really too bright on large areas (e.g. Far Cry 5) to be played 2 feet away. 1000 nits should be reserved for really small areas like flashes, fire, sun etc.
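A toy way to picture the mapping being described (my own linear simplification, not Asus's actual tone-mapping curve; the 600-nit default mapping and the 168 reference-white setting are the figures from this discussion):

```python
# Toy model of the OSD "reference white" behavior discussed above.
# Assumption: at the default setting of 80, a 1000-nit signal is shown
# at ~600 nits; raising reference white scales the whole curve up
# until it clips at the panel's 1000-nit peak.
def display_nits(content_nits, ref_white=80, peak=1000):
    gain = 0.6 * (ref_white / 80)  # 0.6 = 600/1000 at the default
    return min(content_nits * gain, peak)

print(display_nits(1000))        # default 80: 1000-nit content -> 600.0
print(display_nits(1000, 168))   # at 168 it reaches the 1000-nit peak
print(display_nits(500, 168))    # but 500-nit content now overshoots (~630)
```

Which matches the trade-off above: 168 restores full-range highlights at the cost of mid-brightness content coming out brighter than it was mastered.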


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> 
> Yes but I don't understand why though. Keep it at 80 will tone map 1000 nits content to 600 nits brightness, leading to lost details. They say increase it to 168 will make the monitor output 1000 nits for 1000 contents with the downside of < 600 nits contents getting brighter than they should, but there is no hdr content mastered at < 600 nits so this is not an issue right? Could someone explain?
> 
> 
> 
> HDR content is mastered in absolute nits, from 0 up to whatever maximum nits you set in the in-game settings. So you definitely don't want to lose the details of intended dark areas below 600 nits, and that's why you should stick to the default of 80.
> 
> To my understanding, if you want the monitor to display its maximum range of 0 to 1000 nits, you have to increase the in-game maximum HDR brightness setting (e.g. in BF1) to 2000 nits or above, according to the data on TFT Central. However, this might be true only for the upcoming firmware update, because I feel my unit really displays 1000 nits when a 1000-nit signal is given in the VESA HDR display app.
Click to expand...

What about 4K HDR movies? Lots of them are mastered at 1000 nits. So if I set reference white to 80, dark areas will not lose detail, but anything above 600 nits will be tone-mapped, and so bright highlights are lost?


----------



## Glerox

deadchip12 said:


> What about 4K HDR movies? Lots of them are mastered at 1000 nits. So if I set reference white to 80, dark areas will not lose detail, but anything above 600 nits will be tone-mapped, and so bright highlights are lost?


yes it seems so... unfortunately.


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> 
> What about 4K HDR movies? Lots of them are mastered at 1000 nits. So if I set reference white to 80, dark areas will not lose detail, but anything above 600 nits will be tone-mapped, and so bright highlights are lost?
> 
> 
> 
> yes it seems so... unfortunately.
Click to expand...

This is just freaking weird. What was Asus thinking?


----------



## acmilangr

Glerox said:


> They probably lowered the brightness in the upcoming firmware because some games are really too bright on large areas (eg Farcry 5) to be played 2 feet away. 1000nits should be reserved for really small areas like flashes, fire, sun etc.


this.


----------



## acmilangr

from the review...
"When you enable the HDR mode in the OSD menu..."

Do we have any option for enabling this?


----------



## fleggy

No, just Wide Gamut for Display SDR Input.
BTW I've successfully reproduced the "broken Fast FALD" issue in MGS V: The Phantom Pain on my son's rig. The report has already been sent to Asus support. I hope they will be able to fix it in some upcoming FW.
If anybody is interested - set Fast FALD in the OSD, start the game, in its options change the screen mode from Borderless Fullscreen to Fullscreen, and restart the game.


----------



## stefxyz

I would not get too hung up on the less-than-1000-nits thing. LG OLEDs also push 700 max, and still, in a dark room I prefer their HDR to the 1000+ nit Samsungs due to superior contrast. Also, 1000 nits is extremely bright. Even 600 will flash you quite a bit, especially if you sit as close as you do to a PC monitor. What I personally don't like is the fact that this is HDR10, which is already kind of outdated on TVs. HDR10 has no dynamic metadata, so brightness is graded once for the entire piece of content, while Dolby Vision and now HDR10+ can adjust it for every scene. This makes a huge difference actually....


----------



## kx11

Why do people care so much about eye-blinding brightness?

My SDR brightness setting is never above 40, mostly 37.


----------



## Glerox

kx11 said:


> why do people care so much about eye blinding brightness ?
> 
> 
> 
> my SDR brightness settings are never above 40 mostly 37


Sometimes I play in daylight with the sun directly on my monitor, so I need higher brightness or I don't see anything.


----------



## deadchip12

Glerox said:


> kx11 said:
> 
> 
> 
> why do people care so much about eye blinding brightness ?
> 
> 
> 
> my SDR brightness settings are never above 40 mostly 37
> 
> 
> 
> Sometimes, I play in daylight with the sun directly on my monitor so I need higher brightness or i don't see anything.
Click to expand...

Hey man, you said earlier that in the VESA app 1000-nit content is shown correctly at 1000 nits? How do you check?


----------



## Aristotelian

Came here to post the tftcentral link, and glad it is already shared here.

Not surprised that the review was immediately dismissed by people, though they don't seem to understand the severity of their claims (that Asus would send a cherry-picked sample to a reviewer). 

Also not surprised that of the hundreds of pages in this thread, at least 100 were devoted to "nobody's going to buy it, it's too small, it's too expensive", and now that even a comprehensive review is out, the coping mechanism becomes "yeah, it's a cherry-picked sample". Very, very productive.

Thanks to CallSignVega and others who actually own the monitor for giving their impressions here rather than the theorycrafting naysayers. I think I'll pick up this monitor early next year - I'll try to time it around the release of the 1180 Ti, but let's see if I ever stop 'waiting for the next gen', which I have been for a while, to completely replace my current rig.


----------



## tinykitten

I got a flawless panel (to my eye at least) on my second attempt. I assume the fan blades were touching a cable on my unit, which caused a very annoying clicking sound. That obviously would have been a reason to return yet another monitor; thankfully the issue seems to have corrected itself after a few knocks on the back wherever the fan is located. The clicking sound hasn't returned for a few weeks now, so far so good.


After using the monitor for a while now I have a few questions.
- Considering TFT Central's monitor with the new firmware: has there been any update on when the firmware update will be available to existing owners? 
- Is it normal that the exhaust around the area where you plug in the cables gets super hot? I removed the cable shroud just in case, so the hot air isn't as concentrated around the cables and whatnot. The fan is definitely running, for what it's worth. 
- I tested 144Hz to see if I got a model with the updated firmware; that wasn't the case, but no big deal. I switched back to 120Hz in the Nvidia Control Panel and minded my own business. Eventually I started a game and thought, wait a second, the colors don't seem right at all. Turns out that when starting the game the monitor switched back to 144Hz mode. A workaround for me was obviously to disable the 144Hz overclock, which leaves everything at a comfortable 120Hz without black crush, but this got me a little curious. I tried this at 98Hz 10-bit as well: whenever I would load a game, the monitor would switch to 120Hz 8-bit. Is it normal behavior that the monitor tries to switch to the highest possible refresh rate (98Hz -> 120Hz; 120Hz -> 144Hz, assuming overclock enabled)? Is there any way to control this?


----------



## Malinkadink

Aristotelian said:


> Came here to post the tftcentral link, and glad it is already shared here.
> 
> Not surprised that the review was immediately dismissed by people, though they don't seem to understand the severity of their claims (that Asus would send a cherry picked sample to a reviewer).
> 
> Also not surprised that of the hundreds of pages in this thread, at least 100 were devoted to "nobody's going to buy it, it's too small, it's too expensive" and now that even a comprehensive review out the coping mechanism becomes "yeah it's a cherry picked sample". Very, very productive.
> 
> Thanks to CallSignVega and others who actually own the monitor for giving their impressions here rather than the theorycrafting naysayers. I think I'll pick up this monitor early next year - I'll try to time it around the release of the 1180Ti, but let's see if I ever stop 'waiting for the next gen' which I have been for a while to completely replace my current rig.


My fundamental issue with this monitor is how crippled it is purely from having DP 1.4, and therefore not having enough bandwidth for 4K 144Hz 10-bit HDR with 4:4:4. They didn't add DSC, and they didn't try to work out a solution using dual DP 1.4 to provide enough grunt to make this monitor worth it in my eyes. Yeah, the price is a little high, but I'd have gone for one if it managed to deliver a more complete experience. Next year, when or if they release a monitor that satisfies 4K 144Hz 10-bit HDR 4:4:4, I'll be all over that, but at that point I'd also want it to at least have miniLED; if it's still a normal LED LCD I'd probably pass, but only because I have my OLED for when I want to look at something pretty, and a proper gaming monitor for when I want to get serious.

EDIT: Ooooo, forgot to add it's also 8-bit + FRC, not even a native 10-bit panel, and they're looking to get $2000 for them? Cost cutting at its finest, from the 8-bit panel to the small cooling fan for the G-Sync module instead of a slightly bigger enclosure housing a larger passive heatsink.


----------



## Morkai

tinykitten said:


> I tried this on 98hz 10 bit as well whenever I would load a game the monitor would switch to 120hz 8bit. Is it normal behavior that the monitor tries to switch to the highest possible refresh rate (98hz -> 120hz; 120hz -> 144hz (assuming overclock enabled))? Is there any way to control this?


Mine does not switch anything automatically. There is an Nvidia driver setting for preferred refresh rate; yours is probably set to "Highest available" - change it to "Application-controlled"?


----------



## Glerox

deadchip12 said:


> Hey man. You said earlier in the VESA app you can see 1000 nits content are shown correctly at 1000 nits? How do you check?


You just run the test with HDR enabled in Windows, and one of the tests is a black screen with multiple little rectangles at different nit levels. For me, 1000 nits and above all seem to correctly display the maximum brightness of the monitor.


----------



## l88bastar

tinykitten said:


> I got a flawless panel (to my eye atleast) on my second attempt. I assume the fan blades were touching a cable on my unit which caused a very annoying clicking sound. That obviously would have been a reason to return yet another monitor, thankfully that issue seems to have corrected itself after a few knocks on the back wherever the fan is located. That clicking sound hasn't returned for a few weeks now, so far so good.


I fixed the fan cable clicking sound with this sweet futuristic looking solution!


----------



## kx11

Glerox said:


> Sometimes, I play in daylight with the sun directly on my monitor so I need higher brightness or i don't see anything.





Maybe in your case; normally gamers are most active during night time.


----------



## deadchip12

I just came across this research paper: https://www.osapublishing.org/Direc...6-13-16572.pdf?da=1&id=390602&seq=0&mobile=no

Basically, the result says the number of dimming zones required for unnoticeable halo effects depends greatly on the native contrast ratio of the panel. The exact words: *"If a LCD with CR≈1000:1, then even 10,000 local dimming zones is still inadequate. For a LCD with CR≈2000:1, (e.g., fringing-field switching (FFS) LCD), the required local dimming zones is reduced to 3000. If a LCD with CR = 5000 (e.g. MVA), then an unnoticeable halo effect can be achieved at ~200 local dimming zones)"*

So based on this research, it seems like the IPS panel really limits this monitor's 384-dimming-zone potential. The upcoming 512-zone VA will be better, though with CR ~2000-3000:1 it will still only be on the same level as <100-zone VA TVs with CR ~6000:1.
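A rough back-of-the-envelope illustration of why native contrast matters so much here (my own simplification, not the paper's model): a black pixel inside a zone that must stay lit for an adjacent highlight can only be attenuated by the panel's native contrast ratio, so the halo floor scales inversely with native CR:

```python
# Luminance of a "black" pixel in a backlight zone held at full power
# for an adjacent 1000-nit highlight; the LC layer can only attenuate
# the backlight by the panel's native contrast ratio.
def halo_black_nits(peak_nits, native_cr):
    return peak_nits / native_cr

for name, cr in [("IPS ~1000:1", 1000), ("FFS ~2000:1", 2000), ("MVA ~5000:1", 5000)]:
    print(f"{name}: black next to a 1000-nit highlight ≈ {halo_black_nits(1000, cr):.1f} nits")
```

The higher-CR panel starts with a 5x dimmer halo, so it needs far fewer zones before the remaining glow drops below visibility.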


----------



## Glerox

Good news: TFT Central updated their review on the HDR brightness mapping problem.

Their previous results were via HDMI. They retested with DisplayPort, and now a 1000-nit signal correctly outputs 1000 nits on the monitor (which corresponds to what my unit does, as I said).
Interestingly, they say "We carried out these measurements first of all with the default 80 'white reference' setting but found that the content targets were being exceeded by quite a lot and the screen was basically too bright. Content mastered at 400 cd/m2 was being shown at around 650 cd/m2. You will probably want to lower the white reference setting to 52 in the OSD menu, and that then produced the results shown above which were nice and accurate."

Will try that!


----------



## acmilangr

Really good news. So at 52 it will be accurate.

But I have a question. 
So the brightness is higher in SDR (about 500 nits) than in HDR with a 400-nit signal?


----------



## deadchip12

For those of you who play Assassin's Creed Origins, what are your settings for in-game brightness, maximum luminance and paper white? And what is your setting for reference white in the monitor OSD?

What a mess this whole HDR thing is.


----------



## Glerox

acmilangr said:


> Really good news. So at 52 will be accurate.
> 
> But i have a question.
> So the brightness is higher on SDR (about 500 nits) than HDR with 400nits Signal?


Yes


----------



## fleggy

deadchip12 said:


> For those of you who play Assassin's Creed Origins, what are your settings for in-game brightness, maximum luminance and paper white? And what is your setting for reference white in the monitor OSD?
> 
> What a mess this whole hdr thing is.


I have the same problem. I tried many different combinations of desktop color depth, refresh rate, and SDR/HDR, and just once I got the correct HDR image in the game (with the in-game HDR options on default values). I gave up.


----------



## kx11

I suggest you guys watch the movie Dunkirk in 4K HDR, lots of very dark scenes with bright spots



good test for this monitor


----------



## deadchip12

kx11 said:


> i suggest you guys watch the movie Dunkirk 4k hdr , lost of very dark scenes with bright spots
> 
> 
> 
> good test for this monitor


Will do. My favorite movie.


----------



## acmilangr

Don't forget that TFTCentral has a monitor with the latest firmware, so on ours the 52 setting may give a different result.


----------



## profundido

deadchip12 said:


> Will do. My favorite movie.


So funny. That movie was shot barely 80 km away from my house and I haven't even seen it yet. I guess I should


----------



## kx11

deadchip12 said:


> Will do. My favorite movie.





I loved the directing by Nolan


----------



## deadchip12

kx11 said:


> i loved the directing by Nolan


I love Nolan


----------



## kot0005

I knew TFTCentral was missing something..

They updated the review:
SDR contrast ratio is 25,000+:1 and HDR is 60,000+:1.

Now show me another monitor with 20,000:1 SDR contrast..


----------



## deadchip12

kot0005 said:


> I knew Tftcentral was missing something..
> 
> they updated the review.
> SDR contrast ratio is 25,000+ :1 and HDR is 60,000+ :1
> 
> now show me a monitor with 20,000:1 SDR contrast..


I think TFTCentral's method overestimates the contrast? They calculate the contrast ratio using one bright sample and the black sample furthest away from it, while RTINGS uses a checkerboard pattern
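The difference between the two methods can be sketched numerically. All the luminance readings below are invented for illustration; only the shape of the calculation matters:

```python
# Sketch of the two contrast-measurement approaches being compared.
# All luminance readings (in nits) are hypothetical example values.

def contrast_ratio(white_nits, black_nits):
    """Ratio of a white luminance reading to a black one."""
    return white_nits / black_nits

# TFTCentral-style: one bright patch vs. the darkest black measured far
# away from it, so the black patch sits in its own fully dimmed zone.
isolated = contrast_ratio(white_nits=980.0, black_nits=0.04)

# RTINGS-style ANSI checkerboard: average the 8 white and 8 black patches
# of a 4x4 grid. Blooming from the lit zones raises the black average.
whites = [950.0] * 8
blacks = [0.25] * 8
ansi = contrast_ratio(sum(whites) / 8, sum(blacks) / 8)

print(round(isolated), round(ansi))  # the isolated method reads far higher
```

With a zoned backlight, the isolated method measures the best case a dimming zone can deliver, while the checkerboard captures the blooming penalty, which is why the two reviews can both be "right" yet differ by an order of magnitude.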


----------



## MistaSparkul

deadchip12 said:


> I think TFTCentral's method overestimates the contrast? They calculate the contrast ratio using one bright sample and the black one furthest away while Rrings use checkerboard pattern


20,000:1 is when you have the brightness cranked up to max. At 120 nits the contrast is 6,000:1 which is still very good. Beats out every VA monitor without any of the drawbacks. Maybe the checkerboard pattern will lower it a bit but the contrast should still at the very least match a decent VA panel.
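Those two figures are consistent with a simple fixed-black-floor model. The model is my assumption, not something either review states: if the FALD leaves a roughly constant residual black level, measured contrast scales with the white level you calibrate to.

```python
# Fixed-black-floor model (an assumption for illustration, not from the
# reviews): if the FALD backlight leaves a roughly constant residual
# black level, contrast grows in proportion to the calibrated white.
calibrated_white = 120.0      # nits, the calibrated SDR figure quoted above
calibrated_contrast = 6000.0  # the contrast quoted at that brightness

black_floor = calibrated_white / calibrated_contrast   # about 0.02 nits

# The same floor at higher SDR brightness predicts the bigger ratios
# being quoted (around 20,000-25,000:1 at 400-500 nits).
for white_nits in (400.0, 500.0):
    print(white_nits, round(white_nits / black_floor))
```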


----------



## Ford8484

Holy ****, I didn't know the contrast was that high in SDR... that's impressive. I thought it was around VA levels in SDR with FALD on.


----------



## acmilangr

This monitor only suffers from blooming. Apart from that, it's a great monitor. Actually the best monitor out there.


----------



## Ford8484

acmilangr said:


> This monitor only suffers from blooming. Except that this is a great monitor. Actually the Best monitor out there.


Yeah, that's its only con... not nearly as bad as older IPS panels with BLB and IPS glow. Even then it's only really noticeable in dark games (or dark areas). Speaking from experience with the X27, but I figure that's relevant enough to this thread, lol.


----------



## Babryn25

Anybody here with iTunes on Windows? Strangest thing: when I click the Buy button on a movie in iTunes, the display connection crashes and everything goes blank. No problems with my old monitor.


----------



## bee144

To those who, like me, had issues when using Battlefield 1 with G-Sync HDR and SLI....

NVIDIA responded saying the July Battlefield 1 update resolved the issue for them. I performed my own test on 398.86 and 398.82 and do not agree.

I still have low FPS (70 FPS) and low GPU usage on each card.

I asked NVIDIA to clarify. Can anyone verify whether or not the issue was resolved?


----------



## Malinkadink

bee144 said:


> To those who had issues with me when using Battlefield 1 with G-Sync HDR and SLI....
> 
> NVIDIA responded saying the July Battlefield 1 update resolved the issue for them. I performed my own test and do not agree when using 398.86 and 398.82.
> 
> I still have low FPS (70 FPS) and low GPU usage on each card.
> 
> I asked NVIDIA to clarify. Can anyone verify on whether or not the issue was resolved/not resolved.


SLI is dead imo, best to just get the single most powerful gpu u can, way less headaches.


----------



## Ford8484

Agreed. Speaking of which, I'm really curious to see what this "1180" will do. Nvidia has virtually no competition save at the mid-range level, so it's curious what they'll do... I think I'll stick with my 1080 Ti until the 1180 Ti, though, unless the 1180 is substantially better.


----------



## kx11

SLi is truly dead since Titan XP came out


----------



## Glerox

I fear the 1180 will have a new kind of SLI because of the new connector seen in PCB shots...

I even wonder if it will have a "hardware" SLI, meaning two GPUs can work as one without any game drivers needed (like they did with the new Quadros).

If that's the case, then the "old" SLI support is truly dead... and in the best-case scenario the support will continue to suck.

I have two Titan XPs and this is my last SLI setup unless they put out a "hardware" SLI that would work for all games. I hope not, because I don't want to buy two more GPUs lol.


----------



## Malinkadink

Ford8484 said:


> Agree- speaking of which, really curious to see what this "1180" will do. Nvidia has virtually no competition save the mid-range level so its curious what they'll do....I think I'll stick with my 1080ti though until 1180ti, unless 1180 is substantially better.


With the release of monster hunter world I'm actually now looking forward to an upgrade for my 1080. I can run the game at 1440p 60fps at ultra fairly consistently with my 4.8ghz 7700k and 1080, but it does dip into the 50s at times and also go into 60-70 at other moments. I could alter some settings a little to hit a solid 60fps at all times, but I'm actually looking to run the game at 4k 60hz so i can play on the OLED with a gamepad, and will need more grunt for that, something like an 1180Ti should do.


----------



## bmgjet

Never get the non-Ti cards, and if you do, wait for the Ti to come out; otherwise Nvidia just milks it too hard.
I've used SLI and CF since the Voodoo days and it's just gone downhill majorly for Nvidia since the 9xx series. I promised myself I wouldn't get it again after only getting 20% scaling with 2x 980 Ti, but I stupidly fell for it again with the 1080 Ti.
Only one game that I currently play works with it, which is BF1. And with HDR on it only has 8% scaling; with HDR off it's 10-15% scaling. AMD CF still has a bit of support, but then they are sort of stuck with it since it's the only way they can get 1080 Ti performance.


----------



## profundido

Fresh reading, guys!

https://www.guru3d.com/articles-pages/asus-rog-swift-pg27uq-monitor-review,1.html


----------



## Bloodmosher

Bloodmosher said:


> Wow what a difference! I received my 2nd PG27UQ today. No issues with white color consistency on the desktop and I can easily match the white of my PG27AQ in SDR. So far so good, and it looks like i'll be returning the X27 in favor of this one.


Finally, monitor #4 gets it right. I received another X27 yesterday, and it has the same great picture as my 2nd PG27UQ (no white issues, etc), but the fan is SOOOO much better. I also prefer its ambient lighting, and lower joystick position. This one must have a different firmware because the menu looks different than the last one. Given the fan issue, I think for me the one to keep is the X27.


----------



## profundido

Bloodmosher said:


> Finally, monitor #4 gets it right. I received another X27 yesterday, and it has the same great picture as my 2nd PG27UQ (no white issues, etc), but the fan is SOOOO much better. I also prefer its ambient lighting, and lower joystick position. This one must have a different firmware because the menu looks different than the last one. Given the fan issue, I think for me the one to keep is the X27.


I would be very interested in hearing anything that you notice to be changed in this firmware


----------



## Bloodmosher

profundido said:


> I would be very interested in hearing anything that you notice to be changed in this firmware


I think it might just be the max nits in SDR- I am pretty sure my first one only went to 300, whereas this one goes up to 500. Also if anyone cares, my 2nd PG27UQ had an option to turn off that annoying HDR warning message, which I believe the first did not have.


----------



## Morkai

Bloodmosher said:


> I think it might just be the max nits in SDR- I am pretty sure my first one only went to 300, whereas this one goes up to 500. Also if anyone cares, my 2nd PG27UQ had an option to turn off that annoying HDR warning message, which I believe the first did not have.


Nice, that message is really annoying. Could the 2nd one start SDR games in HDR without problems? (If I start an SDR game, say WoW, with HDR enabled in Windows, my monitor gets really angry and spams the HDR message every few seconds.)


----------



## Bloodmosher

Morkai said:


> Nice, that message is really annoying. Could the 2nd start sdr games in hdr without problems? (If I start a sdr game, say WoW, with hdr enabled in windows, my monitor gets really angry and spams the hdr message every few sec).


Hmm haven't tried that on either X27 or PG27UQ. I haven't found a reason to leave the desktop in HDR yet (I don't watch HDR video on this thing often enough).


----------



## kot0005

Welp, MHW is a trash port.. the magical Nvidia driver does nothing. Only 40fps for an okay-ish looking game on a 1080 Ti...


----------



## fleggy

Bloodmosher said:


> ... Also if anyone cares, my 2nd PG27UQ had an option to turn off that annoying HDR warning message, which I believe the first did not have.


Where is this option in your OSD? And what about the FW version - did they add this information to System Setup? Thanks


----------



## Foxrun

kot0005 said:


> Whelp MHW is a trash port..magical nvidia driver does nothing. Only 40fps for okish looking game on 1080ti...


Haha, it's not a trash port. There is a setting, diffuse etc., that should be reduced to med or low. It will help a lot with fps.


----------



## Bloodmosher

fleggy said:


> Where is this option in your ODS? And what about FW version - did they add this information to System Setup? Thanks


I've now sent the PG27UQ back, so I cannot verify, but I think it was under System Setup...


----------



## kx11

kot0005 said:


> Whelp MHW is a trash port..magical nvidia driver does nothing. Only 40fps for okish looking game on 1080ti...



turn off volumetric lighting


----------



## bmgjet

Bloodmosher said:


> I've now sent the PG27UQ back, so I cannot verify, but I think it was under System Setup...


Why have you sent it back this time? It seems every time you're questioned about something, you don't have the screen on you because it's been sent back.
This would be your 5th time if you look back on your previous posts.

----

Enjoying my 1st one so much I got a 2nd, so now I'm running 2x 4K and 1x 1440p screens.
I seem to have hit a bandwidth problem with my 1080 Ti, where my first screen and the 1440p screen have lost 20Hz of overclock. They used to run 120Hz / 144Hz.
Max speeds I can get now are: 1440p 100Hz / 4K 119Hz / 4K 119Hz.


----------



## acmilangr

https://www.guru3d.com/index.php?ct=articles&action=file&id=42891

Look at the difference in the OSD


----------



## Bloodmosher

bmgjet said:


> Why have you sent it back this time? It seems every time your questioned about something you dont have the screen on you because its been sent back.
> This would be your 5th time if you look back on your previous posts.
> 
> ----
> 
> Enjoying my 1st one so much I got a 2nd so now im running 2X 4K and 1X 1440p screens.
> I seem to of hit a bandwidth problem with my 1080ti. Where my first screen and 1440p screen have lost 20hz overclock. Use to run 120hz / 144hz.
> Max speeds I can get now are, 1440p 100hz / 4K 119hz / 4K 119hz.


Just to make it clear, here's my history with these:
1. PG27UQ: returned due to inconsistent white on the desktop. Top half white, bottom half had a yellow tone to it. Very noticeable when reading documents/email/spreadsheets.
2. X27: better white consistency but not perfect, returned for this reason
3. PG27UQ: great white consistency and overall pleased with the panel, but the fan was whiny and loud. Returned for this reason.
4. X27: great panel, quiet fan. This is the one I have kept.


----------



## kot0005

kx11 said:


> turn off volumetric lighting





Foxrun said:


> Haha it's not a trash port. There is a setting, diffuse etc, that should be reduced to med or low. It will help alot with fps.


It is with volumetric off. TFTCentral has a review up..
Digital Foundry did an analysis.. it really is a bad port. Can't even get a stable 60fps on a 1070 at 1080p.. It has some bugs in textures.

Running it at 1600p on this monitor atm. Only getting around 70fps on my 1080 Ti with volumetric off.


----------



## Foxrun

kot0005 said:


> It is with volumetric off. TFT central has a review up..
> Digital Foundry did an analysis..it really is a bad port. Cant even get stable 60fps on 1070 at 1080p.. Has some bugs in textures.
> 
> Running it at 1600p on this monitor atm. Only get around 70fps on my 1080ti with volumetric off



It's most certainly diffuse for me. I can get 4K 60+ with diffuse set to mid.


----------



## deadchip12

Based on the TFTCentral review, HDR content via HDMI that is mastered at 1000 nits can only be shown at 600 nits if the reference white setting is set to 80. To properly display it at 1000 nits, we need to either increase or decrease the setting. However, decreasing it to 52 will make 100-400 nit content darker than it should be, and increasing it to 168 will make that content brighter than it should be. Can anyone here with a game console test what the optimal reference white is so that all luminance levels are correctly displayed? It's weird this does not happen over DisplayPort, where setting reference white to 52 displays everything as intended.
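A toy model makes the dilemma concrete. Assuming (my assumption, not TFTCentral's) that the reference white setting acts as a single linear gain on the whole signal, each luminance band needs a different gain, so no one setting can satisfy all of them at once over HDMI:

```python
# Toy model of the HDMI behavior described above. Assumption (mine):
# the reference white setting applies one linear gain to the signal.
# Displayed values at the default setting of 80, per the observations:
shown = {1000: 600, 400: 400, 100: 100}   # mastered nits -> displayed nits

# Gain each band would need in order to display correctly:
needed_gain = {level: level / out for level, out in shown.items()}

# Highlights need roughly 1.67x while the 100-400 nit range needs 1.0x,
# so under this model no single setting can correct every level at once.
print(needed_gain)
```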


----------



## HyperMatrix

Anyone else have damage to their screen (bubbling/wrinkling), or is it just me so far? The spot where the FPGA is gets so hot that it can hit close to 50 degrees Celsius. In this pic it's at 43.9 degrees, while other areas of the screen are 30-35 degrees. This is in an air-conditioned 21-degree room.

https://imgur.com/8dd0PzT









As you can see, it's caused those horizontal lines/wrinkles in the polarizer in that spot, so it's basically useless. Talking to ASUS now, but they say if you've had the monitor serviced/opened anywhere, they void the warranty. I told them about the Magnuson-Moss Warranty Act, but I'm in Canada, so there are essentially no consumer protection measures here. They said they'll get back to me. At this point I'm thinking of using Mastercard purchase assurance to claim physical damage, but will see how it pans out.

From stories I've heard in the past about ASUS warranty, it's supposed to be one of the worst experiences you can have, so I'm not expecting them to cooperate at all. I was also looking at paying to have the screen polarizer replaced with one bonded with a glossy coating when the damage first started to pop up a few days ago, but considering the excessive heat this unit is showing in that zone, that won't be an option, because it'll repeat. So I'm curious what could be causing it. Like I said, the room is air conditioned to between 68-72 degrees Fahrenheit, so 20-22 Celsius. I doubt it's a fan issue unless they didn't put any overheat protection in the monitor, so that the fan runs but doesn't care how hot the FPGA gets. And that would be pretty idiotic. It could also be the lack of a heat shield directly between the panel and the FPGA.


Either way, not impressed with the engineering on this monitor. Now I know why it was delayed for over a year.


----------



## Glerox

HyperMatrix said:


> Anyone else have damage to their screen (bubbling/wrinkling) or is it just me so far? The spot where the FPGU is gets so hot, that it can hit close to 50 degrees Celsius. In this pic it's at 43.9 degrees, while the other areas of the screen are 30-35 degrees. This is in an air conditioned 21 degrees room.
> 
> https://imgur.com/8dd0PzT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, it's caused those horizontal lines/wrinkles in the polarizer in that spot so it's basically useless. Talking to ASUS now but they say if you've had the monitor serviced/opened anywhere, they void the warranty. I told them about the Magnusson-moss warranty act. But I'm in Canada. So. I have a giant vagine for a prime minister, and 0 consumer protection measures. They said they'll get back to me. At this point thinking of using Mastercard purchase assurance to claim physical damage. But will see how it pans out.
> 
> From stories I've heard in the past about ASUS warranty, it's supposed to be one of the worst experiences you can have. So I'm not expecting them to cooperate at all. I was also looking at paying to have the screen polarizer replaced with one bonded with a glossy coating when the damage first started to pop up a few days ago. But considering the excessive heat this unit is showing in that zone, that won't be an option. Because it'll repeat. So I'm curious what could be causing it. Like I said the room is air conditioned to between 68-72 degrees Fahrenheit, so 20-22 Celsius. I doubt it would be a fan issue unless they didn't put any overheat protection mechanisms in the monitor so the fan works, but doesn't care how hot the FPGU gets. And that would be pretty idiotic. It could also be a lack of some sort of heat shield directly between the panel and the FPGU.
> 
> 
> Either way, not impressed with the engineering on this monitor. Now I know why it was delayed for over a year.


For what reason did you have the monitor serviced/opened already?


----------



## moonbogg

HyperMatrix said:


> Anyone else have damage to their screen (bubbling/wrinkling) or is it just me so far? The spot where the FPGU is gets so hot, that it can hit close to 50 degrees Celsius. In this pic it's at 43.9 degrees, while the other areas of the screen are 30-35 degrees. This is in an air conditioned 21 degrees room.
> 
> https://imgur.com/8dd0PzT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, it's caused those horizontal lines/wrinkles in the polarizer in that spot so it's basically useless. Talking to ASUS now but they say if you've had the monitor serviced/opened anywhere, they void the warranty. I told them about the Magnusson-moss warranty act. But I'm in Canada. So. I have a giant vagine for a prime minister, and 0 consumer protection measures. They said they'll get back to me. At this point thinking of using Mastercard purchase assurance to claim physical damage. But will see how it pans out.
> 
> From stories I've heard in the past about ASUS warranty, it's supposed to be one of the worst experiences you can have. So I'm not expecting them to cooperate at all. I was also looking at paying to have the screen polarizer replaced with one bonded with a glossy coating when the damage first started to pop up a few days ago. But considering the excessive heat this unit is showing in that zone, that won't be an option. Because it'll repeat. So I'm curious what could be causing it. Like I said the room is air conditioned to between 68-72 degrees Fahrenheit, so 20-22 Celsius. I doubt it would be a fan issue unless they didn't put any overheat protection mechanisms in the monitor so the fan works, but doesn't care how hot the FPGU gets. And that would be pretty idiotic. It could also be a lack of some sort of heat shield directly between the panel and the FPGU.
> 
> 
> Either way, not impressed with the engineering on this monitor. Now I know why it was delayed for over a year.



I don't see any horizontal lines except the ones that are the same as the diagonal lines. Is that part of the game? I don't see which lines you are talking about. Did you remove the AG coating?


----------



## HyperMatrix

Glerox said:


> For what reason did you already had the monitor serviced/opened ?


AG film removal



moonbogg said:


> I don't see any horizontal lines except the ones that are the same as the diagonal lines. Is that part of the game? I don't see which lines you are talking about. Did you remove the AG coating?


Yes, I did remove the AG film. There was no issue with that process and it's been working perfectly for a month. But a few days ago this happened, and I saw it's a result of the excess heat from the FPGA. 50 degrees Celsius is far too high for such thin plastic. I've circled the lines I'm talking about in the picture. They're literally wrinkle lines caused by expansion/contraction from the excessive heat, and it only happened in the area with such high heat present, nowhere else. Either way, Mastercard purchase assurance covers the damage even if it's deemed "physical damage" by ASUS (which would be illegal for them to claim under US law). But the problem here is that there's a potentially serious flaw in the thermal design of this monitor if the surface can reach as high as 50 degrees Celsius. That's 122 degrees Fahrenheit for my American friends. Even a properly cooled CPU doesn't get that hot.

https://imgur.com/a/fqZtP6J


----------



## bmgjet

Just checked my screen: around the middle it's 45°C, the rest of the screen 20-30°C, and up the vent hole it's 52°C.
No issues at all on mine, but I guess it's something I should keep an eye on now.


----------



## Malinkadink

bmgjet said:


> Just checked my screen and around middle its 45C, rest of screen 20-30c, Up the vent hole its 52c.
> No issues at all on mine but I guess its something I should keep a eye on now.


Don't remove the AG coating and you'll be fine. Removing the AG effectively removes an extra layer that acts like a heat soak for the FPGA. During heating/cooling cycles the films expand and contract as a single unit, and since the AG is adhered to the polarizer, it won't let that lower layer wrinkle unless the AG were to wrinkle itself, which I doubt it will, as it's a much tougher film.

I'm not sure what the warranty terms are exactly, but I'd file this under user-inflicted physical damage and not cover it. I've removed AG films before and paid the price on one where the LCD actually ended up cracking. If I were OP, I'd go through with the glossy polarizer replacement and hope that the glossy layer is resilient enough not to wrinkle under the heat.


----------



## acmilangr

I have also removed the anti-glare coating, and I have issues with the polarizer. I damaged the polarizer because I cleaned it with water, and it is sensitive to water. I didn't know that.

I have ordered some polarizers (glossy and matte) from China and will let you know how that goes. I have also ordered a protective film to put in front of the polarizer.


----------



## kot0005

YES!! Do not send your monitors in for service/repair!!! The techs will destroy your monitor. Get ready to find lots of scratches, dents, and other crap. My brand new Acer X34 was in horrible shape and they didn't even fix the issue.


----------



## bmgjet

kot0005 said:


> YES!! do not send ur monitors for service/repair!!! The techs will destroy your monitor. Get ready to find lots of scratches dents and other crap. My brand new Acer x34 was in a horrible shape and they didnt even fix the issue.


This, so much. Sent a screen in for BLB and got a screen back with the same BLB, but with chipped-up plastics from where they pried it open, and the power cord hole loose since it wasn't clipped back in properly.


----------



## HowHardCanItBe

Cleaned and reopened. Folks, please don't accuse anyone of committing any sort of fraud publicly. If you feel that someone is not doing the right thing, please report it and move on. 

Cheers


----------



## deadchip12

Damn, RTINGS just destroyed the Acer X27: https://www.rtings.com/monitor/reviews/acer/predator-x27

I doubt the PG27UQ can fare better.


----------



## acmilangr

Trust TFTCentral. Nothing else.


----------



## CallsignVega

deadchip12 said:


> Damn Rtings just destroyed Acer X27: https://www.rtings.com/monitor/reviews/acer/predator-x27
> 
> Doubt PG27UQ can fare better


An 8.0 rating is "destroyed"? Hyperbole much?

And the review has massive flaws. A 1500:1 contrast ratio with FALD enabled? That is literally impossible. And then they test "black uniformity" with the FALD off (a mode no one would ever use) and give that section a 3.7.

And then at the bottom, for HDR gaming, they say it is barely better than the Samsung CHG70. Barely better than a fake-HDR cheap garbage monitor? LOL. Totally clueless; RTINGS has lost a lot of respect in my eyes. Refer to TFTCentral's review of the ASUS, which is an identical panel with identical NVIDIA firmware.

TFTCentral: "Overall the PG27UQ (same as X27) is the best gaming screen we've tested to date all-round. If you can afford the very high price point and want the Rolls Royce of gaming monitors then get it!"


----------



## kx11

This flickering is so bad, I hate it. Good thing it doesn't appear in games:

https://www.youtube.com/watch?v=ZPcisNbIRPM


----------



## NewType88

@kx11 do you think 1000 zones in the upcoming 32” panel will still have that effect, just in smaller squares? Or do you think that many zones is enough to smooth/blend the zones out of flickering?

That’s the main reason I didn’t get it: it was very obvious at Micro Center. Granted, you said it’s good where it counts, but that’s a lot of money for me to be annoyed while doing everything else.


----------



## animeowns

kx11 said:


> this flickering is so bad i hate it , good thing it doesn't appear in games
> 
> 
> 
> https://www.youtube.com/watch?v=ZPcisNbIRPM


I don't have that problem with mine


----------



## animeowns

NewType88 said:


> @kx11 you think 1000 zones in the upcoming 32”panel will still have that effect just in smaller squares ? Or do you think that many zones is enough to smooth/blend the zones from flickering ?
> 
> That’s the main reason I didn’t get it because it was very obvious at micro center, granted you said it’s good where it counts, but that’s a lot of money for me to be annoyed while doing everything else.


I got my unit from microcenter online while it was on sale new for $1870 and I have not had any problems with my display


----------



## NewType88

animeowns said:


> I got my unit from microcenter online while it was on sale new for $1870 and I have not had any problems with my display



Well, the "flickering" is just the zones lighting up, so yours should do that too, yes? Yeah, my Micro Center has had it on sale for 1799 for a while. I check the online store for my location every day, and for the past week or so they have had 2 units, then 1, then the next day 2 again (X27). It's happened like three times. I just wonder if those are returns?

Thought about haggling with them to get it to 1500. Still might do that and tell the guy I'll buy the RTX 20** series from them too, see if they'll go for it.


----------



## dboythagr8

This is a large thread so sorry for asking what may have been answered. I'm considering getting this today from Microcenter, and a 2080 Ti whenever it is announced. I've been hearing something about a firmware update needed for this monitor? What's that about, and does it affect the Acer X27 as well? Also is anybody using this monitor for HDR console gaming? If so how does it fare?


----------



## acmilangr

dboythagr8 said:


> This is a large thread so sorry for asking what may have been answered. I'm considering getting this today from Microcenter, and a 2080 Ti whenever it is announced. I've been hearing something about a firmware update needed for this monitor? What's that about, and does it affect the Acer X27 as well? Also is anybody using this monitor for HDR console gaming? If so how does it fare?


Yes. There will be a firmware update (a Windows app) that they will release by the end of the year. You can also send the monitor to Asus for free to have it done now.

It has some fixes, like the black crush at 144Hz. According to TFTCentral they measured about 500 nits; there is no way ours reach that much, so it seems they fixed that too.

Sorry for my bad English


----------



## profundido

kx11 said:


> this flickering is so bad i hate it , good thing it doesn't appear in games
> 
> 
> 
> https://www.youtube.com/watch?v=ZPcisNbIRPM


I recognize that problem since I had it too on my X27: a wrong FALD mode set in the OSD.


----------



## kot0005

Raytracing + HDR on this monitor...I cannot wait...


----------



## ToTheSun!

kot0005 said:


> Raytracing + HDR on this monitor...I cannot wait...


Ray traced HDR in BF V should be the absolute best experience anyone can have in gaming right now.


----------



## boredgunner

ToTheSun! said:


> Ray traced HDR in BF V should be the absolute best experience anyone can have in gaming right now.


BF5 will probably be the best-looking game technically (easily), but meh... I am more visually wowed by incredible art design. Can't wait to play Metro Exodus, though I won't have this monitor since I'm waiting for a 32" counterpart with blur reduction (and more powerful GPUs lol). I can't use an LCD without blur reduction anymore.


----------



## stefxyz

It's 1799 CHF at Digitec in Switzerland right now. Just ordered mine.


----------



## Fanu

ToTheSun! said:


> Ray traced HDR in BF V should be the absolute best experience anyone can have in gaming right now.


You are in for a rude awakening if this video is anything to go by:

www.pcgameshardware.de/Grafikkarten...ormance-in-Shadow-of-the-Tomb-Raider-1263244/

The German PC magazine tested Tomb Raider with an RTX 2080 Ti at Full HD with ray tracing on: 30-60 FPS.

Ray tracing seems to be a massive resource hog, and a gimmick if this is the level of performance players can expect.

More info on BFV and ray tracing:

https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on

Also poor performance.


----------



## sblantipodi

kot0005 said:


> Raytracing + HDR on this monitor...I cannot wait...


Ray tracing + a 4K 100+Hz monitor?
Completely useless, since no GPU will be capable of that, not even in multi-GPU.

Hope to see 4K 60Hz HDR soon.


----------



## CallsignVega

Yeah, I'm not expecting to be able to use RTX on the world's most demanding gaming monitor for a few GPU generations.


----------



## ToTheSun!

Fanu said:


> you are in for a rude awakening if this video is anything to go by


My comment was regarding the image quality, not the performance. Whether or not RTX Ultra will be playable enough for most people or not after the cards are running on proper drivers and efficient in-game implementations of the tech is a story to be told after Sept 20.


----------



## Fanu

ToTheSun! said:


> My comment was regarding the image quality, not the performance. Whether or not RTX Ultra will be playable enough for most people or not after the cards are running on proper drivers and efficient in-game implementations of the tech is a story to be told after Sept 20.


No, you wrote ray tracing should give players the best experience anyone can have in gaming right now.

Yeah, if by "experience" you mean being content with watching a slideshow.

And I don't see how proper drivers and game optimizations will raise those numbers from 30-60fps at 1080p to 100+fps at 4K - that's an insane performance jump to be gained on optimizations alone.
That would require entirely new hardware (much more powerful than these new cards have..).

I expected Nvidia to at least show these cards running the new 4K 144Hz HDR monitors effortlessly, or their new BFGD HDR displays - alas, they didn't show that, and it's probably because there isn't that much of a performance difference compared to Pascal cards (still unable to achieve 100+fps in 4K+HDR).


----------



## ToTheSun!

Fanu said:


> no, you wrote ray tracing should give players best experience anyone can have in gaming right now
> 
> yeah, if by "experience" you mean being content with watching a slideshow


Because my wording was ambiguous in my initial post, I explained what I meant more clearly. Feel free to put words in my mouth and debate conjecture, though.


----------



## kot0005

Dammit, so much for the RTX hype... Tomb Raider is only running at 30-45 FPS on a 2080 Ti with RTX on, and this is at 1080p resolution lol.

https://twitter.com/VideoCardz/status/1031840054606548993


----------



## kot0005

CallsignVega said:


> Ya I'm not expecting to be able to use RTX on the worlds most demanding gaming monitor for a few GPU generations.


I wonder if it will at least be faster than a Titan V for raster rendering, so I can get 100 FPS at 4K?!

On stage, Jensen said the card can do 80 FPS with DLAA at 4K, while the 1080 Ti can only do 30 FPS with DLAA.


----------



## bee144

kot0005 said:


> dammit, so much for RTX Hype... Tomb Raider is only running at 30-45fps on a 2080ti with RTX and this is at 1080p resolution lol.
> 
> https://twitter.com/VideoCardz/status/1031840054606548993


so much for 4k Ray Tracing. And we all know SLI is trash.


----------



## bmgjet

Everything I'm seeing is:
2080 = 1080 Ti on DX11/10
2080 Ti = just below Titan V on DX11/10
Meanwhile the tensor cores just sit there chewing a bit of power, which would explain the TDP being higher despite fewer CUDA cores and lower voltage.

Then there's that leaked overclocking preview: the 2080 maxed out at 1700 MHz boost, and with an overclock he got it to 1886 MHz, scoring less than a 1080 Ti @ 2 GHz in 3DMark.
He did mention there is no voltage control or memory overclocking yet, and that might make quite a difference.
The video has been removed from YouTube, though.


----------



## kx11

kot0005 said:


> dammit, so much for RTX Hype... Tomb Raider is only running at 30-45fps on a 2080ti with RTX and this is at 1080p resolution lol.
> 
> https://twitter.com/VideoCardz/status/1031840054606548993





the game is still not ready


----------



## Glerox

bmgjet said:


> Everything im seeing is.
> 2080 = 1080ti on DX11/10
> 2080ti = just below TitanV on DX11/10
> While the tensor cores just sit there chewing a bit of a power which would explain the TDP being higher with less cuda cores and lower voltage.
> 
> Then that little bit of a leaked overclocking preview. 2080 maxed out at 1700mhz boost. And with Overclock he got it to 1886mhz, Scoring less then a 1080ti @ 2ghz in 3dmark.
> He did mention there is no voltage control or memory overclocking yet and that might make quite a difference.
> Video has been removed from youtube tho.


The way I see it, it's a Titan V for 1200 USD instead of 3000 USD
Don't care (yet) about the Nvidia GameWorks crapshow. Just give us single-card 4K at >100 fps!


----------



## bmgjet

Glerox said:


> The way I see it, it's a Titan V for 1200 USD instead of 3000 USD
> Don't care (yet) for Nvidia gameworks crapshow. Just give us single card 4k>100 fps!



Yeah, that's what I'm waiting for as well. I was hoping these new cards would be enough of an upgrade to replace 1080 Ti SLI, but it looks like I'll be waiting one more year for 7nm.


----------



## deadchip12

An off-topic comment on OLED: I'm finally back in my home country and watching an OLED C7 for the very first time. Predictably impressed with the black level; the screen looks like it's off. No burn-in or banding so far, so that's good. Can't wait to compare HDR on this vs. the PG27UQ when I get my hands on one.

However, I feel like the blacks are somehow crushed in movies. Watching Dunkirk, for example, it's hard to make out details on the general's suit. Or Luke Cage season 2 episode 1: things overall seem too dark. Misty's hair, for example, is just a big black blob with no texture. Is that how things are supposed to look? I also tried the Lagom test and cannot distinguish the first and most of the second row of squares.

The flash from gunfire is freaking blinding though, holy ****. That is nuts. Subtitles are also too bright, probably because the screen is completely dark, which creates such big contrast.


----------



## profundido

deadchip12 said:


> An off-topic comment on oled: Finally back to my home country and watch oled c7 for the very first time. Predictably impressed with the black level. The screen is like off. No burn in or banding so far so that's good. Can't wait to compare the hdr on this vs PG27UQ when I get my hands on it.
> 
> However I feel like the black is somehow crushed in movies. Watching Dunkirk for example; hard to make out details on the general's suit. Or Luke Cage season 2 episode 1; things overall seem too dark. Misty's hair for example; it's just a big black blob with no texture. Is that how things are supposed to look? I also try lagom test and cannot distinguish the first and most of the second row of squares
> 
> The flash from gunfire is freaking blinding though holy ****. That is nuts. Subtitles are also too bright. Probably because the screen is completely dark so creates such big contrast.


After you tune the basics in the monitor's OSD, you should be able to see a difference on all tiles in the Lagom test, from first to last; I clearly see it on my X27. You'll notice that as soon as you set gamma too high or too low, either the first row or the last becomes indistinguishable, indicating your settings are wrong. One thing to note: on these PC monitors, obtaining the right setting requires a certain minimum brightness level, while on an OLED such as the C7 it doesn't matter, because of OLED's very nature.
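To illustrate why the gamma setting decides which near-black Lagom tiles are distinguishable, here is a rough sketch of my own (an idealized pure power-law model; real panels add black-level offsets, ABL, and dithering, so treat the numbers as illustrative only):

```python
# Relative luminance of the darkest Lagom-style tiles (8-bit code values
# 1..8) under two gamma settings, using an idealized power-law response.

def rel_luminance(code_value, gamma):
    """Map an 8-bit code value to relative luminance via a pure power law."""
    return (code_value / 255.0) ** gamma

for gamma in (2.2, 1.9):
    steps = [rel_luminance(v, gamma) for v in range(1, 9)]
    print(f"gamma {gamma}: " + ", ".join(f"{s:.2e}" for s in steps))

# Lower gamma lifts near-black output, so adjacent dark tiles differ by
# larger amounts and become easier to tell apart -- consistent with the
# report that dropping gamma from 2.2 to 1.9 made most of the first row visible.
```

In this toy model the step between tile 1 and tile 8 is roughly three times larger at gamma 1.9 than at 2.2, which is why a small gamma change has such a visible effect on the darkest rows.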


----------



## deadchip12

profundido said:


> deadchip12 said:
> 
> 
> 
> An off-topic comment on oled: Finally back to my home country and watch oled c7 for the very first time. Predictably impressed with the black level. The screen is like off. No burn in or banding so far so that's good. Can't wait to compare the hdr on this vs PG27UQ when I get my hands on it.
> 
> However I feel like the black is somehow crushed in movies. Watching Dunkirk for example; hard to make out details on the general's suit. Or Luke Cage season 2 episode 1; things overall seem too dark. Misty's hair for example; it's just a big black blob with no texture. Is that how things are supposed to look? I also try lagom test and cannot distinguish the first and most of the second row of squares
> 
> The flash from gunfire is freaking blinding though holy ****. That is nuts. Subtitles are also too bright. Probably because the screen is completely dark so creates such big contrast.
> 
> 
> 
> after you tune the basics in the monitor's OSD You should be able to see difference on all tiles in the Lagom test from first till last. I clearly see it on my X27. You'll notice that as soon as you set too much or too low gamma either the first row or the last becomes indistinguishable indicating your settings are wrong. One thing to note is that on these PC monitors obtaining the right setting requires a certain minimum brightness level while on an OLED such as the C7 it doesn't matter because of oled's very nature.
Click to expand...

On the OLED, even after tinkering with settings like gamma, brightness, contrast, etc., I still cannot see the first 2 rows properly.

Edit: actually, when I lower gamma from 2.2 to 1.9, only the first 3 squares in the first row are indistinguishable; the rest look OK. However, I cannot change gamma in HDR mode, so black may still be crushed.


----------



## profundido

deadchip12 said:


> On the oled even after tinkering with the settings like gamma, brightness, contrast etc. I still cannot see the first 2 rows properly
> 
> Edit: actually when I lower gamma from 2.2 to 1.9, only the first 3 squares in the first row are undistinguishable, the rest look ok. However, I cannot change gamma in hdr mode, so black may still be crushed.


That's exactly the sort of setting you need to tweak the first time. You're getting closer, but on such a good OLED as yours I believe you should be able to get even the first 3 to become distinguishable. Ideally (if the monitor/TV supports it) you can enable HDR and get it perfect just by changing settings in the OSD. If your monitor/TV does not support that level of fine-tuning, you can optionally leave HDR mode off in Windows and additionally tweak your Windows and NVIDIA driver settings (NVIDIA Control Panel) on the PC until you get it perfect.


----------



## deadchip12

Has anyone watched Dunkirk on the PG27UQ? Somehow when I watch it in HDR on the OLED, things that are supposed to be blindingly bright, like the sun or explosions, seem no brighter than in SDR. (I watched Luke Cage and was shocked by how bright the flashes from the guns are.)


----------



## mmms

deadchip12 said:


> Anyone watched Dunkirk on the PG27UQ? Somehow when I watch it in hdr on the Oled, things that are supposed to be blindingly bright like the sun or explosions seem not any brighter than sdr. I watched Luke Cage and was shocked by how bright the flashes from the guns are).


Do you mean you prefer the PG27UQ over OLED for HDR content?


----------



## profundido

deadchip12 said:


> Anyone watched Dunkirk on the PG27UQ? Somehow when I watch it in hdr on the Oled, things that are supposed to be blindingly bright like the sun or explosions seem not any brighter than sdr. I watched Luke Cage and was shocked by how bright the flashes from the guns are).


Are you sure you meet the proper requirements for playing that content truly in HDR? If it's Netflix you used for Luke Cage, not only must HDR be enabled, but the HEVC extensions from the Microsoft Store must also be downloaded and installed. Same for YouTube HDR content through Edge or Firefox.

As for Dunkirk, you didn't mention what software you used to play it or whether all prerequisites were met. It sounds like it really just played in SDR.


----------



## dboythagr8

Can someone explain the chroma 4:2:2 issues?

The way I understand it, I cannot play at 144Hz + HDR with 4:4:4, correct? Does that apply to 120Hz as well? I'm reading that 98Hz lets you experience 4:4:4 and HDR, but of course you're reduced to 98Hz. What about standard, non-HDR gaming? Looking for some clarification from monitor owners.


----------



## kot0005

Nice, even the 2080 is doing 60+ FPS at 4K: https://blogs.nvidia.com/blog/2018/08/22/geforce-rtx-60-fps-4k-hdr-games/?linkId=100000003301637

Def upgrading to a 2080 Ti.


----------



## deadchip12

mmms said:


> deadchip12 said:
> 
> 
> 
> Anyone watched Dunkirk on the PG27UQ? Somehow when I watch it in hdr on the Oled, things that are supposed to be blindingly bright like the sun or explosions seem not any brighter than sdr. I watched Luke Cage and was shocked by how bright the flashes from the guns are).
> 
> 
> 
> Do u mean u prefer PG27UQ on OLED for HDR content ?
Click to expand...

No, I haven't had one to test yet; I'm just asking those who already own the monitor. Since the monitor is capable of higher peak brightness, maybe the brighter stuff is more impressive.


----------



## deadchip12

profundido said:


> deadchip12 said:
> 
> 
> 
> Anyone watched Dunkirk on the PG27UQ? Somehow when I watch it in hdr on the Oled, things that are supposed to be blindingly bright like the sun or explosions seem not any brighter than sdr. I watched Luke Cage and was shocked by how bright the flashes from the guns are).
> 
> 
> 
> Are you sure you meet the proper requirements for playing that content truly in HDR ? If it is netflix You used for Lukas Cage not only HDR must be enabled but also the HEVC extensions from the Microsoft Windows store must be downloaded and installed. Same for Youtube HDR content through Edge or Firefox
> 
> As for Dunkirk you did not mention what software you used to play it and whether all prerequisites were met. It sounds like it just really played in SDR
Click to expand...

The HDR logo popped up on the OLED C7 when Dunkirk or Luke Cage were played, so I think HDR was working fine. Luke Cage is via Netflix, while Dunkirk is via Blu-ray.


----------



## Fanu

kot0005 said:


> Nice even the 2080is doing 60+fps at 4k https://blogs.nvidia.com/blog/2018/08/22/geforce-rtx-60-fps-4k-hdr-games/?linkId=100000003301637
> 
> Def upgrading to 2080Ti


Wait for 3rd-party benchmarks.

Or don't; it's not like you have a choice if you want a faster card, regardless of price/perf value...


----------



## Vegtro

Does anyone know whether, if I purchased this monitor now, it would have the updated firmware?


----------



## profundido

Vegtro said:


> Anyone knows if I were to purchase this monitor presently that it'll have the updated firmware?


All of the monitors being sold now should have the new firmware. There was even an active STOP period (no monitors in stock allowed to be sold by vendors) right at release time, to get all monitors in shops updated firmware-wise, and that was over a month ago.


----------



## Astreon

kot0005 said:


> Nice even the 2080is doing 60+fps at 4k https://blogs.nvidia.com/blog/2018/08/22/geforce-rtx-60-fps-4k-hdr-games/?linkId=100000003301637
> 
> Def upgrading to 2080Ti


Upgrading before you even know what settings those were? (4K at low settings isn't that much of a problem for the 1080 Ti already.)


----------



## kot0005

Astreon said:


> Upgrading before you even know what settings were those? (4K low settings isn't that much of a problem for 1080Ti already).




Preordered anyway lol. It's sold out now, literally everywhere...
You don't get charged until it ships, so no harm in preordering. Also, my estimated shipping date is November 1st, plenty of time for benchmarks. It's going to be an issue for people who placed preorders for the September 20th batches.


----------



## MiniZaid

I just saw that the Acer X27 has DisplayHDR 1000 certification too:
https://displayhdr.org/certified-products/

Too bad. I think I would have gone with that one if they had priced it properly for Canada; the X27 was $200 CAD more expensive than the Asus PG27UQ.
But now Newegg Canada has it for the same price.

Although it appears the Asus one uses a higher-quality fan. But I hate the stand on the PG27UQ. The triangle formation is just lame: it takes up too much space on the desk, and I can't push the monitor toward the end of the desk, closer to the wall.


----------



## stefxyz

I just love this monitor so much. Best impulse buy ever. BF1 in 4K HDR on this monitor is the best I've ever seen visually, and it runs north of 65 FPS on ultra all the time on a watercooled Titan Xp at 2000 MHz.

Also, the blacks, even in SDR, are a huge step up and very respectable for an IPS LCD. I don't like the fan noise at all on my near-silent extreme watercooled setup, but I happily ignore this fault given these extremely satisfying visuals.

Do you guys know if The Witcher 3 supports HDR now? I think on consoles it does.


----------



## Ford8484

stefxyz said:


> I just love this monitor so much. Best impulsive buy ever. BF1 in 4k HDR on this monitor is the best visual wise I have ever seen and it runs north of 65 fps on ultra all the time on watercooled titan xp at 2000 mhz.
> 
> Also the balcks even in sdr are a huge step up and very respectable for an ips lcd. I dont like the fan noise at all on my near silent extreme watercooled setup but i happpily ignore this fault based on these extremely satisfying visuals.
> 
> Do u guys know if witcher 3 shows hdr now? I think on consoles it does.


Nah, not yet. Hopefully eventually, but I think all their focus is on Cyberpunk (for good reason).


----------



## sblantipodi

It seems that I called it:
4K, G-SYNC, 144Hz, HDR400, NO FALD, at a whopping 1750 USD with taxes.

What an idiot market.

https://www.anandtech.com/show/13294/acer-unveils-predator-xb273k-4kp144-displayhdr-400-gaming-lcd


----------



## KGPrime

^ It says $1299.00 US. Still, it's laughable.


----------



## MistaSparkul

So where are all the people who were so confident we would see a FreeSync equivalent of the X27 for half the price? The non-FALD FreeSync version is already $900, so if there IS a FALD FreeSync version coming, it would most definitely NOT cost $1000.


----------



## deadchip12

sblantipodi said:


> it seems that I called it.
> 4K, GSYNC, 144Hz, HDR400, NO FALD at a whopping 1750USD with taxes.
> 
> what a idiot market.
> 
> https://www.anandtech.com/show/13294/acer-unveils-predator-xb273k-4kp144-displayhdr-400-gaming-lcd


I just want a 60Hz VA equivalent of the PG27UQ. 100+ FPS is way too hard to achieve now with more graphically demanding stuff like ray tracing, and IPS causes too much blooming.


----------



## MiniZaid

So for HDR, FALD is pretty important, right? Especially for dark scenes with some lighting, for example.
The new monitor supports HDR400, but no FALD means color bleed.

And how do I turn FALD off on the PG27UQ? I just want to see the difference.


----------



## Malinkadink

MistaSparkul said:


> So where are all the people who were so confident we would see a Freesync equivalent X27 for half the price? The Non FALD Freesync version is already $900 so if there IS a FALD Freesync version coming, it would most definitely NOT cost $1000.


I'd say I'm one of those people, and I think $900 for this monitor is pretty good; I didn't expect it to have FALD either. Apparently the G-Sync module for 4K 144Hz costs $500. That would mean it costs an additional $700 for FALD, or is it $500, if you look at Microcenter selling these monitors for $1800?


I wonder what the profit margins are like on these monitors. 25%? 50%? Let's take the $2k MSRP and a 25% profit margin; then it takes them $1500 to make the FALD version with G-Sync. Chop off G-Sync and it now costs $1000 to make a FALD one with FreeSync. A 25% profit margin on $1k would be $250, so they'd likely sell it for $1250-1300.

A $1300 FALD FreeSync version would be really nice, but to be honest I'd still probably just buy the $900 one. I just wish AMD would release a good GPU for that kind of monitor already.

Hoping Microcenter chops an additional $100 off the $900 one. $799.99 for 4K 144Hz + FreeSync, even with the backlight bleed/glow, would be nice to sit on until microLED.
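The margin arithmetic above can be sanity-checked in a few lines. Note that every input here is the poster's assumption ($2k MSRP, 25% margin, a rumored $500 G-Sync module), not real bill-of-materials data, and the post uses 25% two ways: as a fraction of MSRP for the cost estimate, then as a markup on cost for the resale price.

```python
# Back-of-the-envelope check of the speculated pricing. All inputs are
# assumptions from the post above, not actual bill-of-materials figures.

MSRP_FALD_GSYNC = 2000   # assumed MSRP of the FALD + G-Sync model
MARGIN = 0.25            # assumed profit margin
GSYNC_MODULE = 500       # rumored cost of the 4K 144Hz G-Sync module

# Step 1: margin taken as a fraction of MSRP -> build cost
cost_fald_gsync = MSRP_FALD_GSYNC * (1 - MARGIN)        # $1500

# Step 2: drop the G-Sync module for a hypothetical FreeSync model
cost_fald_freesync = cost_fald_gsync - GSYNC_MODULE     # $1000

# Step 3: the post applies 25% as a markup on cost for the resale price
price_fald_freesync = cost_fald_freesync * (1 + MARGIN)  # $1250

print(cost_fald_gsync, cost_fald_freesync, price_fald_freesync)
```

Which reproduces the $1250 figure (before the extra $50-100 of retail rounding the post allows for).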


----------



## MistaSparkul

Malinkadink said:


> MistaSparkul said:
> 
> 
> 
> So where are all the people who were so confident we would see a Freesync equivalent X27 for half the price? The Non FALD Freesync version is already $900 so if there IS a FALD Freesync version coming, it would most definitely NOT cost $1000.
> 
> 
> 
> I'd say i'm one of those people, and i think $900 for this monitor is pretty good, i didn't expect it to have FALD either. Apparently the Gsync module for 4k 144hz costs $500. That would mean it costs an additional $700 for FALD, or is it $500 if you look at Microcenter selling these monitors for $1800.
> 
> 
> I wonder what the profit margins are like on these monitors. 25%? 50%? Lets take $2k MSRP and 25% profit margin, then it takes them $1500 to make the FALD version with gsync. Chop off gsync and it now costs $1000 to make a FALD one with freesync. 25% profit margin on 1k would be $250 so they'd likely sell it for $1250-1300.
> 
> A $1300 FALD Freesync version would be really nice, but to be honest i'd still probably just buy the $900 one, just wish AMD would release a good gpu for that kind of monitor already.
> 
> Hoping Microcenter chops off an additional $100 off the $900 one. $799.99 for 4k 144hz + Freesync even with the backlight bleed/glow would be nice to sit on until microLED.
Click to expand...

Honestly, I feel like AMD has thrown in the towel when it comes to the high end. It seems like we're all better off putting our hopes in Intel instead, since they've got the money for R&D.


----------



## sblantipodi

deadchip12 said:


> I just want a 60Hz VA equivalent of the PG27UQ. >100fps is way too hard to achieve now with more graphical demanding stuffs like ray tracing, and IPS causes too much blooming


Same here, ideally at €1000, since that price would be more than reasonable for a monitor like that.


----------



## saltedham

So, has anyone else's monitor messed up? When I turned it on today, the entire screen was flashing various colors like red, white, and green. Then this popped up.

Now, whenever I press the nub/joystick in to bring up the menu, it brings up that service menu, and I have to pick Exit to get to the regular monitor options menu.

**** my ass, why did I get this monitor.


----------



## CallsignVega




----------



## sblantipodi

CallsignVega said:


>


2500€, in Italy we say: "grazie al cazzo"


----------



## profundido

sblantipodi said:


> 2500€, in Italy we say: "grazie al cazzo"


my translator says: "thanks to the dick" ??


----------



## stefxyz

AC Origins hates me. I can't get HDR working no matter what I do. BF1, Battlefront, Far Cry 5: no issues.


----------



## kx11

Last time I tried it, the game enabled HDR on its own, without enabling it from the desktop.


----------



## Vegtro

What's up with the stock issues with this monitor in the US? The only place that has them in stock is Microcenter; Amazon's listing is from a third party, and Newegg was backordered and then OOS.


----------



## CallsignVega

They are selling well...


----------



## toncij

profundido said:


> my translator says: "thanks to the dick" ??


"Thanks to a c*ck" is an idiom that can't be directly translated. In English it would be "It's about f*cking time".


----------



## kx11

Vegtro said:


> What's up with the stock issue with this monitor in the US? The only place that has them in stock is Microcenter. Amazon is from a third party and NewEgg was back ordered and then OOS.





I want someone to explain this to me.


----------



## toncij

kx11 said:


> i want someone to explain this to me


Explain what?


----------



## JackCY

Vegtro said:


> What's up with the stock issue with this monitor in the US? The only place that has them in stock is Microcenter. Amazon is from a third party and NewEgg was back ordered and then OOS.


They make only so few, hence the ridiculous price and low stock. Stock in e-shops is meaningless; it's often distributor stock anyway, and the shops don't have units in their warehouse, the distributor does. Order with an expected delivery date, and don't worry about immediate stock.

They sell well? Not really, unless offering 10 units and selling 10 units counts as selling well, LOL. It's a far cry from the volumes and profits of other monitors.

There are 3 units in stock here at the biggest shop in the country, with no stock issues. Check back in two weeks and those 3 units will probably still be sitting there.

Even the PG279Q is outside the top 50 sold monitors here, let alone the PG27UQ.


----------



## kx11

toncij said:


> Explaint what?





The Asus PG27UQ is $2,400.
The Acer X27 is $1,999.


Same panel, AFAIK.


----------



## toncij

kx11 said:


> Asus pg27uq is 2400$
> Acer X27 is 1999$
> 
> 
> same panel AFAIK


I wasn't aware the prices were different. Thanks!


----------



## Aristotelian

sblantipodi said:


> 2500€, in Italy we say: "grazie al cazzo"


For those wondering, it essentially means "thanks for nothing". And this poster meant that for EUR 2500 it is way too expensive to be praised.

Either way I did read the prad review where it gets an overall 4.1/5. 

Still on the fence about this. An upcoming move...and I'll probably try to see one of these in person first...


----------



## skingun

Oops. My finger slipped on the buy button.


----------



## bwana

I saw the Asus at the Microcenter in Cambridge, MA. It was being driven by a pair of 1080 Tis in SLI. In HDR mode it was nice. Not as crisp as an IPS monitor; hard to believe it was 4K. Maybe the matte screen was dropping the contrast. Plus the store is bright, and who knows if the display was calibrated (actually, I don't think there is a calibration standard for HDR displays). But when you set it to 144Hz, it turns off HDR mode (takes 30s to make the switch) and it looks terrible. Probably a driver issue not adjusting for the decreased hardware bit depth. In terms of tearing, stutter, etc., there was none, but I didn't test it with a game, just by flicking a window around.


----------



## kx11

Can you guys get audio through HDMI? I'm playing PS4 on this monitor and audio isn't passing through.


----------



## Malinkadink

kx11 said:


> can you guys get audio through HDMI ? playing PS4 with this monitor and audio isn't passing through


Make sure volume is turned up in OSD.


----------



## kx11

Malinkadink said:


> Make sure volume is turned up in OSD.





It's set to 85.


----------



## Malinkadink

kx11 said:


> it's set t 85


What are you trying to pass the audio through to, exactly? Headphones? Something else?


----------



## kx11

Malinkadink said:


> What are you trying to pass through to exactly? headphones? Something else?



NVM, I just remembered it's got a headphone jack.


----------



## skingun

Had my screen for 2 days. It's brilliant...

...apart from the fan. But with headphones on I don't notice it.

Guess that FPGA is working hard!


----------



## toncij

skingun said:


> Had my screen for 2 days. It's brilliant...
> 
> ...apart from the fan. But with headphones on I don't notice it.
> 
> Guess that FPGA is working hard!


How do you like the mode-switching speed?
How does its 98Hz compare with 144Hz on normal-resolution screens of other types?
How is the scaler? Can you run 1440p or 1080p scaled up, and is it too blurry? Does it have a scaler at all? If not, have you tried driver scaling on the GPU?


----------



## Ford8484

toncij said:


> How do you like the mode switching speed?
> Can you compare the 98Hz with 144Hz of normal-resolution screens of other type?
> How is the scaler? Can you run 1440 or 1080 scaled up and is it too blurred? Does it have a scaler at all? If not, have you tried driver scaling on the GPU?


For the Acer X27 (which is the same panel, of course), it scales really well to 1440p. Of course it depends on the game's AA as well, but it's really close to native 27-inch 1440p, just slightly blurrier, naturally. But the panel also has poppier colors and much better contrast with the FALD, so in some ways that makes up for the loss in sharpness.


----------



## deadchip12

There are a total of 4 of these monitors imported into my country. The one near my place has scratches and dead pixels, so now I need to wait at least around 4 more weeks for a new batch. Just absolutely disappointing, man.


----------



## acmilangr

I still have the HDR issues in Forza Motorsport and other games: the colors are washed out. When the game starts, it's fine for about 2 seconds, then the colors become washed out. I have tried every combination of 4:2:2/4:4:4 and 10-bit/8-bit with no success.

I also have the latest NVIDIA drivers. What else can I do?


----------



## toncij

acmilangr said:


> I still have the HDR issues on forza motorsport and other games. The colors are washed out. When the game starts it is fine for about 2 seconds then the colors become washed out. I have tried every combination on 422/444 and 10bit/8bit with no success.
> 
> I have also the latest nvidia drivers. What else to do?


Have you reduced refresh rate? Maybe it's keeping you locked at 4:2:0.


----------



## Glerox

Any news on when we should be able to update our PG27UQ firmware?


----------



## acmilangr

toncij said:


> Have you reduced refresh rate? Maybe it's keeping you locked at 4:2:0.


I tried, with no luck.
The output dynamic range is set to "Limited" for every combination except when I choose the output color format "RGB". Is this correct?


----------



## toncij

acmilangr said:


> I tried, with no luck.
> The output dynamic range is set to "Limited" for every combination except when I choose the output color format "RGB". Is this correct?


RGB is what you want, yes.


----------



## acmilangr

toncij said:


> RGB is what you want, yes.


Still the same problem.
Are you sure FM7 works fine for you? Have you tried without HDR to see the difference in colors? For example, on a red car.


----------



## badjz

acmilangr said:


> toncij said:
> 
> 
> 
> RGB is what you want, yes.
> 
> 
> 
> Still the same problem.
> Are you sure FM7 works fine for you? Have you tried without HDR to see the difference in colors? For example, on a red car.
Click to expand...

There is nothing wrong with your monitor; this is the HDR implementation in Windows. Download VibranceGUI, add the game executables, and increase vibrancy to 65. For games like Forza that require HDR to be set on in Windows first, you need to be running at 144Hz and set the digital vibrance in the NVIDIA Control Panel.

Enjoy bursting colours now, mate.


----------



## acmilangr

badjz said:


> Their is nothing wrong with your monitor, this is HDR implementation in windows. Download vibrancegui, add game executables & increase vibrancy to 65. For games like forza that require HDR to be set on in windows firstly, you need to be running in 144hz & set the digital vibrancy in nvidia control panel.
> 
> Enjoy bursting colours now mate


Thanks for the answer.
For FM7 it's hard to find the exe file! You can't even get access to its folder; Microsoft doesn't let you...


----------



## badjz

acmilangr said:


> badjz said:
> 
> 
> 
> Their is nothing wrong with your monitor, this is HDR implementation in windows. Download vibrancegui, add game executables & increase vibrancy to 65. For games like forza that require HDR to be set on in windows firstly, you need to be running in 144hz & set the digital vibrancy in nvidia control panel.
> 
> Enjoy bursting colours now mate
> 
> 
> 
> Thanks for the answer.
> FM7 is Hard to find the exe file! You cannot even have access to his folder! Microsoft doesnt let you...
Click to expand...

Yeah, for Windows Store games you need to set DV in the NVIDIA Control Panel, but you also need to be at 144Hz, otherwise it will revert back to washed-out colours for some reason once the game loads.


----------



## Glerox

I guess we still have to wait for the updated firmware!

Also, I tried reducing the resolution to 2560x1440 to test whether I can get 144Hz @ 4:4:4 RGB Full, but for some reason the monitor always detects a 4K signal in the OSD.
Has anybody tried this? Do you know if it's possible to lower the resolution to get 144Hz without chroma subsampling?


----------



## animeowns

Glerox said:


> Any news on when we should be able to update our PG27UQ firmware?


Keep an eye on the ASUS support page for the PG27UQ. I'm asking ASUS in chat now whether they can add an option for that in the next firmware update, Glerox.

https://www.asus.com/us/support/FAQ/1036750


----------



## Glerox

animeowns said:


> keep on track on the asus service webpage for the pg27uq I am asking asus in chat now can they add an option for that in the next firmware update gelrox
> 
> https://www.asus.com/us/support/FAQ/1036750


thanks!


----------



## dboythagr8

I just got this monitor. A few questions. I thought this was the situation with chroma:

98Hz, RGB (4:4:4), 10-bit color depth
120Hz, RGB (4:4:4), 8-bit color depth
144Hz, YCbCr 4:2:2, 8-bit color depth

And here are the supported modes and settings for HDR content:
98Hz, RGB (4:4:4), 10-bit color depth
120Hz, YCbCr 4:2:2, 10-bit color depth
144Hz, YCbCr 4:2:2, 10-bit color depth
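Those mode limits fall out of DisplayPort 1.4 bandwidth. Here is a rough sanity check of my own, using HBR3's approximate payload rate and ignoring blanking overhead, so the figures are approximate rather than exact cable math:

```python
# Approximate video data rates for the PG27UQ's modes vs. DisplayPort 1.4.
# HBR3: 4 lanes x 8.1 Gbit/s, with 8b/10b encoding leaving ~25.92 Gbit/s
# of payload. Blanking overhead is ignored, so numbers are approximate.

DP14_GBPS = 25.92

def signal_gbps(width, height, hz, bits_per_channel, subsampling="4:4:4"):
    """Uncompressed data rate in Gbit/s; 4:2:2 averages 2 samples per pixel."""
    samples = {"4:4:4": 3, "4:2:2": 2}[subsampling]
    return width * height * hz * bits_per_channel * samples / 1e9

for hz, bpc, sub in [(98, 10, "4:4:4"), (120, 8, "4:4:4"),
                     (144, 10, "4:4:4"), (144, 10, "4:2:2")]:
    rate = signal_gbps(3840, 2160, hz, bpc, sub)
    verdict = "fits" if rate <= DP14_GBPS else "exceeds DP 1.4"
    print(f"{hz}Hz {bpc}-bit {sub}: {rate:.1f} Gbit/s ({verdict})")
```

144Hz at 10-bit 4:4:4 needs roughly 35.8 Gbit/s, well over the ~25.92 Gbit/s available, which is why the monitor falls back to 4:2:2 at the top refresh rates while 98Hz 10-bit and 120Hz 8-bit fit at full 4:4:4.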


I'm in SDR now. I've set my monitor to 98Hz in the display adapter properties, and it shows this:

[image]

How do I get 10-bit color depth to show up? Also, I bought mine in-store at Microcenter. How do I know whether I have the new firmware?


----------



## dboythagr8

badjz said:


> Yeah for windows store games u need to set DV in the nvidia control panel, but you also need to be at 144hz otherwise it will revert back to washed out colours for some reason once the game loads.


Wait, for Windows Store games you have to run them in 144Hz HDR? So you have to put up with YCbCr 4:2:2?


----------



## badjz

animeowns said:


> Glerox said:
> 
> 
> 
> Any news on when we should be able to update our PG27UQ firmware?
> 
> 
> 
> keep on track on the asus service webpage for the pg27uq I am asking asus in chat now can they add an option for that in the next firmware update gelrox
> 
> https://www.asus.com/us/support/FAQ/1036750
Click to expand...

Please keep us posted on what they say; it's getting quite comedic that we still don't have a fully functioning monitor.


----------



## dboythagr8

Does this support HDR on consoles via HDMI? It appears not? My One X shows that it can display 4k fine on the monitor, but not HDR. And I'm using the same HDR cable that I use for the system on my OLED...


----------



## Vegtro

dboythagr8 said:


> Does this support HDR on consoles via HDMI? It appears not? My One X shows that it can display 4k fine on the monitor, but not HDR. And I'm using the same HDR cable that I use for the system on my OLED...


Is there an option to turn on HDR HDMI mode on the monitor like the TVs?


----------



## acmilangr

badjz said:


> Yeah for windows store games u need to set DV in the nvidia control panel, but you also need to be at 144hz otherwise it will revert back to washed out colours for some reason once the game loads.


What is DV?


----------



## profundido

acmilangr said:


> What is DV?


I think he's referring to Digital Vibrance.


----------



## acmilangr

profundido said:


> I think he refers to Digital Vibrance


Yes, I think so too now.
But how do I set it in Forza Motorsport? I can't load the exe file.


----------



## profundido

acmilangr said:


> Yes, I think so too now.
> But how do I set it in Forza Motorsport? I can't load the exe file.



You just set Digital Vibrance in the Nvidia Control Panel to adjust the color intensity, and it automatically affects the Windows UI as well as all games that run on top of it, as long as HDR mode is not enabled in Windows. I can't imagine what you mean by exe loading, or its relevance to this setting.


----------



## acmilangr

profundido said:


> you just set digital vibrance in nvidia control panel to adjust the color intensity and it automatically affects windows ui as well as all games that run on top of it as long as HDR mode is not enabled in windows. I can't imagine what you mean by exe loading or the relevance to this setting


Sorry, but I didn't understand you.


----------



## profundido

acmilangr said:


> Sorry but i didnt understand you


To clarify: you configure Digital Vibrance in the Nvidia Control Panel. This single action automatically sets your desired color intensity for the Windows user interface (i.e., Windows itself) as well as your games.


----------



## dboythagr8

profundido said:


> you just set digital vibrance in nvidia control panel to adjust the color intensity and it automatically affects windows ui as well as all games that run on top of it as long as HDR mode is not enabled in windows. I can't imagine what you mean by exe loading or the relevance to this setting


I am confused again....

I get the DV in NVCP part, but then you say it automatically affects games and the Windows UI as long as HDR mode is not enabled within Windows itself. Well... you have to enable HDR in Windows to get MS games to work; I just tried it last night with the Horizon demo. So going by the above, if you set HDR in Windows, then DV doesn't work? And then there's the whole 144Hz deal with MS Store titles? Could you guys clarify some of this? Jumping into this thread mid-stream is a bit overwhelming as a new owner who wants to get the most out of this expensive monitor.



Vegtro said:


> Is there an option to turn on HDR HDMI mode on the monitor like the TVs?


Not from what I can see. I hope that some existing owners can chime in on some of my own questions and questions like yours...


----------



## acmilangr

Increasing the DV from the Nvidia panel actually seems to work, but I don't like doing something that isn't 100% correct. Anyway.

Also, I had a strange thing in FM7: G-Sync doesn't seem to work properly. I had some small stutters (I get about 100-120 fps in the game).
The solution was to enable V-Sync in the FM7 options... That's weird. So it needs both G-Sync and V-Sync to be smooth.


----------



## profundido

dboythagr8 said:


> I am confused again....
> 
> I get the DV in NVCP part, but then you say it automatically effects games and windows ui as long as HDR mode is not enabled within Windows itself. Well...you have to enable HDR in Windows to get MS games to work. I just tried it last night with the Horizon demo. So going by the above, if you set HDR in Windows, then DV doesn't work? And then there's the whole 144hz deal with MS Store titles? Could you guys clarify some of this? Jumping into this thread and this and that, is a bit overwhelming as a new owner who wants to get the most out of this expensive monitor.
> 
> 
> 
> Not from what I can see. I hope that some existing owners can chime in one some of my own questions and questions like yours.....


OK, I just read all the comments from the last two pages again, and I can see it's really hard to drop into this thread having missed all the previous information entirely. Make sure you use the search function to look up the specific parts you missed, e.g. with the keyword "vibrancegui". Before you start searching the thread, here's the recap, like at the beginning of a new episode in a series:

1. Without HDR enabled in Windows, DV controls vibrancy in the Windows UI and games as it always has. Nothing new here.
2. HDR is a brand-new feature and has been terribly implemented by Microsoft in its current form. It's one big mess, and there's no single golden rule to get all games working properly, only a combination of several.
3. Many games have a direct way to override Windows settings when you enable HDR inside them, and thus do not require HDR to be activated in Windows.
4. Some games need HDR activated in Windows or do not work, such as the Windows Store games you tested.
5. When HDR is active in Windows, it overrides your monitor's OSD settings as well as the Nvidia Control Panel SDR digital vibrance sliders, according to a 'standard' setting that Microsoft 'enforces' on your monitor, even when the result ends up looking "washed out" on this specific monitor while looking perfect on some other HDR monitor with different specs.
6. Since the HDR color settings Microsoft enforces are off, or "wrong", for this specific monitor, people here on the forum use a third-party program called "VibranceGUI" to override those defaults for specific programs, by specifying the program's .exe in VibranceGUI and then their desired overriding vibrancy settings for that program. This way they can work around the limitations of Windows and its messy implementation of HDR.

I hope this brings everyone up to speed. For the details, please use the thread search function or ask the people who made specific suggestions.


----------



## profundido

acmilangr said:


> Increasing the DV from nvidia panel actually seems to work. But i dont like to do something that it is not 100% correct. Anyway
> 
> Also i had a strange thing on FM7. gsync seems not to work fine. I had some small stuterins(i have about 100-120fps on the game).
> The solution was to enable v-sync on the Fm7 options... That is weird.. So it needs gsync and Vsync to be Smooth



In order to avoid stuttering in all games, there is a golden rule to follow:

1. Set V-Sync forced "on" in the Nvidia Control Panel for all profiles.
2. Enable "use G-Sync in fullscreen mode" in Windows.
3. Disable any V-Sync/G-Sync settings in-game, or set them to "off".
4. Configure an fps limit in RivaTuner 3 fps below the configured maximum of your monitor (117 for 120Hz or 141 for 144Hz in this case).
5. Benchmark/test and watch Afterburner and CPU performance. If GPU usage hits 100% too often you'll experience stuttering, which means the configured max fps limit is too high for your video card setup. Configure a lower one until your system can handle it.
6. If CPU usage hits 100% too often, consider giving "high" priority to the game's .exe process.
7. Enjoy butter-smooth gameplay without tearing/stuttering.
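The cap in step 4 is just the refresh rate minus a small margin. As a sketch (the helper name and the 3-fps default are my own generalization of the 117/141 numbers quoted above):

```python
def rtss_fps_cap(refresh_hz, margin=3):
    """Frame-rate limit to set in RTSS: a few fps under the monitor's
    refresh rate, so G-Sync stays engaged and the limiter, rather than
    V-Sync, paces the frames."""
    return refresh_hz - margin

# Matches the limits suggested above:
# rtss_fps_cap(120) -> 117, rtss_fps_cap(144) -> 141
```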


----------



## acmilangr

Actually, this 65% works very well in Forza Motorsport 7 and Injustice 2.

Next step: Assassin's Creed Origins... It is very dark when HDR is enabled. How does that work on yours?


----------



## badjz

acmilangr said:


> Actually this 65% works very well on forza motorsport 7 and injustice 2.
> 
> Nest step assassin's creed origins.... It is very dark when hdr is enabled. How that works on yours?


Yes, the DV works magic compared to standard.

Assassin's Creed HDR is broken in my opinion; it has never looked right to me.


----------



## dboythagr8

profundido said:


> ok I just read all comments again from last 2 pages and I can see it's really hard to drop into this thread while having missed all previous information in this thread entirely. Make sure you use the search function to look up the specific parts you missed like using keyword "vibrancegui". Before you start searching the thread here's the recap overview like in the beginning of a new episode in a series:
> 
> 1. Without HDR enabled in windows DV controls vibrancy in windows ui and games as it always has. Nothing new here
> 2. HDR is a brand new feature and has been terribly implemented by Microsoft in it's current form. It's 1 big mess and there's no 1 golden rule to get all games working properly but a combination of several.
> 3. Many games have a direct way to override windows settings when you enable HDR inside them and thus do not require HDR to be activated in windows
> 4. Some games need HDR activated in windows or do not work, such as for example those windows store games you tested
> 5. When HDR is active in windows it overrides your monitors OSD settings as well as the nvidia control panel SDR digital vibrancy sliders for colors and vibrancy, according to a 'standard' setting that microsoft 'enforces' towards your monitor, even when they end up turning "washed out" on this specific monitor while looking perfect on some other HDR monitor that has other specs.
> 6. Since for this specific monitor the HDR color settings microsoft enforces are off or "wrong" people here on the forum use a third party program called "Vibrancegui" to override those default settings for specific programs by specifying the program's .exe in vibrancegui and then specify their desired overriding vibrancy settings for that specific program. This way they can work around the limitations of windows and it's messy implementation of HDR.
> 
> I hope this brings everyone up to speed. For the details please use the thread search function or ask the people that made specific suggestions.



Thank you for taking the time to break this down. So for #4, is the answer to run the monitor in 144Hz mode? I didn't understand that. Or to use the VibranceGUI program?


----------



## dboythagr8

Checked my serial number as I was gathering it for the Black Ops 4 promotion...looks like my monitor was manufactured in June 2018.


----------



## skingun

What Black Ops promotion?


----------



## toncij

acmilangr said:


> Still the same problem.
> Are you sure it works fine in FM7 for you? Have you tried without HDR to see the difference in colors? For example on a red car.


Haven't tried on Fm7.  
Will try to test.


----------



## dboythagr8

skingun said:


> What Black Ops promotion?


https://promotion.asus.com/en/call-of-duty-black-ops-4-dominate-with-the-best


----------



## skingun

Damn. Bought my screen 2 September. I don't qualify 😞


----------



## bmgjet

skingun said:


> Damn. Bought my screen 2 September. I don't qualify 😞


I bought mine day 1, but still filled in all the info they needed: a picture of the serial number and a screenshot of the quote (accidentally cropped out the date), so let's see if they email me a code lol.


----------



## Glerox

Lol plz tell us if it's working, I will do the same!


----------



## sblantipodi

Is there any news on the upcoming 4K, 144Hz, G-Sync, HDR400 monitor from Asus?
When will we see it?


----------



## dboythagr8

Really enjoying the monitor so far, although I'm using it more for general browsing and the PS4 at the moment. Waiting on my 2080Ti to come in before really seeing what it can do. I tried a few games with my 1060 at 4k and liked what I saw, but I'd rather go in full force with the 2080ti.


----------



## Glerox

dboythagr8 said:


> Really enjoying the monitor so far, although I'm using it more for general browsing and the PS4 at the moment. Waiting on my 2080Ti to come in before really seeing what it can do. I tried a few games with my 1060 at 4k and liked what I saw, but I'd rather go in full force with the 2080ti.


Yup this monitor screams for more POWAAA lol


----------



## Glerox

Jesus Christ, Shadow of the Tomb Raider on this monitor is so freaking beautiful. I just wander around looking at the graphics lol.


----------



## profundido

Glerox said:


> Jesus Christ, Shadow of the Tomb Raider on this monitor is so freaking beautiful. I just wander around looking at the graphics lol.


WHOAAAAH YESS!! I just came in here to post the exact same thing, and it seems you beat me to it!

I bought it last Friday after I saw confirmation in a Reddit post that SLI and DX12 mGPU have really good support in this game. I decided to use it to further tweak, and perhaps finally be able to challenge, my system using the game's built-in benchmark and... ended up binge-playing this game all weekend long!!

I just keep going back from zone to zone admiring the graphics, but really... THIS is the future of gaming, and we have seen it. No screenshots can do it justice. Finally a game with a proper HDR implementation (instead of just constant overbright white flashes in your face), at a steady 90-100 fps at 4K (2x Titan X Pascal). My god, this is the 163 ppi shining at its best, every time you see those faces up close, like when Lara is underwater and briefly surfaces to gasp for air. Every single time I am mesmerized by how realistic it looks. I can no longer detect any sort of separate pixels or typical imperfections. I just can't stop watching the screen...

It's been a long time since I've been this excited about a computer gaming experience, but this weekend I felt it again, like with my very first computer game! On top of that, the story and the game itself are amazingly well done as well. There's only one big minus here, and it has to be said, so let me call out the elephant in the room: after that experience all weekend long, I now have a really, really hard time going back to any other game...


----------



## acmilangr

Yeah, SOTTR is amazing. Also, G-Sync works fantastically: 55-65 fps and zero stuttering, zero tearing.


----------



## tinykitten

It looks fantastic on this monitor.


I encountered a weird bug, I believe, or I'm stupid and missing something: every time I open the map or a menu, or click on a camp, my monitor goes black for a second and comes back with the HDR On overlay. It's somewhat annoying, to be honest; not sure if I'm the only one with this problem. I'm using the latest Nvidia drivers, old PG27UQ firmware, 98Hz 10-bit. HDR on in Windows before I start the game; Exclusive Fullscreen on or off didn't seem to make a difference. Any ideas or pointers on what I might be missing here? I'm assuming this isn't intended.


----------



## profundido

tinykitten said:


> It looks fantastic on this monitor.
> 
> 
> I encountered a weird bug though I believe or I'm stupid and am missing something; everytime I open the map or menu or click on a camp my monitor goes black for a second and comes back with the HDR On overlay. It's somewhat annoying to be honest, not sure if I'm the only one with this problem. I'm using the latest Nvidia drivers, old PG27UQ firmware, 98hz 10bit. HDR on in Windows before I start the game; Exclusive Full Screen on or off didn't seem to make a difference. Any ideas or pointers on what I might be missing here? I'm assuming that this isn't intended.


I can only confirm that I have the same thing whenever I use TAB to go to the map. For the record, I have the X27, so although the panel and base Nvidia firmware are the same, the resulting vendor-specific firmware is a bit different, though not by much. You can clearly see that the game's graphics engine decides to switch to another mode, which gets treated exactly like that by the monitor: the DP input is detected all over again for a second (its icon appears on screen) and then your input signal gets accepted. I didn't think much of it, even though it's annoying, yes. I assume it's caused either by the game engine itself or perhaps by me running G-Sync in SLI. I didn't bother to look into it; I was too busy recalibrating my watercooling fan curves to handle this 'new' sustained thermal load. I have the monitor input fixed on DP instead of auto-select.

Also note that in non-exclusive fullscreen mode (= windowed), when you alt-tab to Windows, all resources freeze 100% as if the game is no longer running. That is quite amazing, and every game should do this, but it goes to show that most likely the game's graphics engine itself stops sending a signal when you switch modes, such as going to the map. Curious what other people find regarding this.


----------



## acmilangr

I see this blinking on the monitor even when there's just a notification, for example from Origin.


----------



## Glerox

tinykitten said:


> It looks fantastic on this monitor.
> 
> 
> I encountered a weird bug though I believe or I'm stupid and am missing something; everytime I open the map or menu or click on a camp my monitor goes black for a second and comes back with the HDR On overlay. It's somewhat annoying to be honest, not sure if I'm the only one with this problem. I'm using the latest Nvidia drivers, old PG27UQ firmware, 98hz 10bit. HDR on in Windows before I start the game; Exclusive Full Screen on or off didn't seem to make a difference. Any ideas or pointers on what I might be missing here? I'm assuming that this isn't intended.


I will try that and report.

Are you guys using the recommended HDR reference white of 52 nits for the PG27UQ from the TFTCentral review (instead of the default 80)?
Apparently it maps the intended brightness more accurately.

I'm using it and it looks good so far.


----------



## acmilangr

Glerox said:


> I will try that and report.
> 
> Are you guys using the recommended HDR reference nits of 52 for the PG27UQ in the tftcentral review? (instead of the default 80)
> Apparently it's mapping the intended brightness more accurately.
> 
> I'm using it and looks good so far.


Don't forget that TFTCentral had the latest firmware, with a lot of differences from ours. So it's not certain yet whether this setting is better for our monitors.


----------



## Glerox

acmilangr said:


> Dont forget that tftcentral had the latest firmware with alot differences than ours. So it is not sure if this settings is better for our monitor yet


True. But my feeling is still that the default reference white of 80 is way too bright. We'll see if that changes with the firmware update.


----------



## dboythagr8

How can you verify what firmware your monitor is on?


----------



## Glerox

dboythagr8 said:


> How can you verify what firmware your monitor is on?


Look at this page in SDR mode at 144Hz:
http://www.lagom.nl/lcd-test/black.php

If you have the old firmware, you will have black crush (some squares will be all black).
If you have the new firmware, it will work as intended (all the squares will have a slightly different grey tint, even the darkest ones).
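If you'd rather generate your own near-black test strip than rely on the lagom page, here's a stdlib-only sketch in the same spirit (function names are mine; it writes a plain binary PPM you can open fullscreen in SDR at 144Hz):

```python
# Near-black test pattern: 16 squares at gray levels 1..16, written as a
# binary PPM (no third-party libraries). With black crush, the darkest
# squares merge into pure black; on a correct display every square is
# faintly distinguishable from its neighbor.

def black_level_rows(levels=16, square=120):
    """Yield pixel rows for a one-row strip of near-black gray squares."""
    width = levels * square
    row = [((x // square) + 1,) * 3 for x in range(width)]  # gray 1..16
    for _ in range(square):
        yield row

def write_ppm(path, rows):
    """Write rows of (r, g, b) tuples as a binary (P6) PPM file."""
    rows = list(rows)
    height, width = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P6 %d %d 255\n" % (width, height))
        for row in rows:
            f.write(bytes(v for px in row for v in px))

if __name__ == "__main__":
    write_ppm("black_level.ppm", black_level_rows())
```

Note this only checks the display path end to end; it can't tell you the firmware version directly, only whether the black-crush symptom is present.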


----------



## axiumone

Is Destiny 2's HDR broken? Just got the display. Andromeda, SOTTR, and BF1 all look fine in [email protected] Destiny 2 is crushing the blacks massively; there's absolutely no detail in the picture. The calibration seems to be off: white balance works fine, but when adjusting the black limit I can't seem to see the middle symbol even at the highest setting.


----------



## skingun

Strange. It works for me.


----------



## kx11

axiumone said:


> Is Destiny 2 HDR broken? Just got the display. Andromeda, SOTR, BF1 all look fine in [email protected] Destiny 2 is crushing the blacks massively. There's absolutely no detail in the picture. The calibration seems to be off, white balance works fine, but adjusting the black limit I can't seems to see the middle symbol even at the highest setting.





sometimes unplugging the DP cable fixes some problems


----------



## Glerox

tinykitten said:


> It looks fantastic on this monitor.
> 
> 
> I encountered a weird bug though I believe or I'm stupid and am missing something; everytime I open the map or menu or click on a camp my monitor goes black for a second and comes back with the HDR On overlay. It's somewhat annoying to be honest, not sure if I'm the only one with this problem. I'm using the latest Nvidia drivers, old PG27UQ firmware, 98hz 10bit. HDR on in Windows before I start the game; Exclusive Full Screen on or off didn't seem to make a difference. Any ideas or pointers on what I might be missing here? I'm assuming that this isn't intended.


Hey I don't know what I'm doing differently but I don't get the black screen when going in a camp or checking my map... it's all working well.


----------



## tinykitten

Glerox said:


> Hey I don't know what I'm doing differently but I don't get the black screen when going in a camp or checking my map... it's all working well.


I'm not sure if any OSD option could cause this. profundido also has this issue on an X27. Are you on the "new" firmware, by chance? I'm not sure if this can be attributed to the firmware differences; who knows.


----------



## Vegtro

Just got this monitor and I have a question, how do I turn off the monitor so the fan turns off?


----------



## badjz

Any news on the firmware update? This is getting ridiculous....


----------



## badjz

Vegtro said:


> Just got this monitor and I have a question, how do I turn off the monitor so the fan turns off?


Fans will turn off automatically, give it time.


----------



## sblantipodi

badjz said:


> Any news on the firmware update? This is getting ridiculous....


What improvements should the firmware upgrade bring?


----------



## acmilangr

sblantipodi said:


> what improvements the firmware upgrade should bring?


Fix black crush at 144Hz, and possibly much higher brightness in SDR mode.


----------



## acmilangr

I have a new Asus PG27UQ and I can confirm it has the new firmware:
-New option in the OSD for the HDR logo (enable or disable)
-Fixed black crush at 144Hz
-Much more SDR brightness; it seems to be 600 nits, as TFTCentral reported
-Better white, but maybe that's because of the higher brightness

Feel free to ask anything.


----------



## Glerox

acmilangr said:


> I have a New Asus pg27uq and i can confirm you it has the New firmware
> -New option on osd about HDR logo (enable or disable)
> -Fixed Blackcrush at 144hz
> -much more SDR brightness. It seems like to be 600nits as tftcentral reported
> -better White, but maybe it is becouse of higher brightness
> 
> Ask anything you like Free


Niiiiiiice! Can't wait to update mine! ASUS GIMME DA FIRMWARE


----------



## acmilangr

Actually, it's too bright now in SDR. I saw that the brightness was set to about 66 by default. If I set it to 100, it's too bright for my eyes in the dark.


----------



## CallsignVega

I've never seen a monitor when set to full brightness not be too bright in the dark...


----------



## axiumone

acmilangr said:


> I have a New Asus pg27uq and i can confirm you it has the New firmware
> -New option on osd about HDR logo (enable or disable)
> -Fixed Blackcrush at 144hz
> -much more SDR brightness. It seems like to be 600nits as tftcentral reported
> -better White, but maybe it is becouse of higher brightness
> 
> Ask anything you like Free


Where is the new option for HDR logo in osd?


----------



## acmilangr

axiumone said:


> acmilangr said:
> 
> 
> 
> I have a New Asus pg27uq and i can confirm you it has the New firmware
> -New option on osd about HDR logo (enable or disable)
> -Fixed Blackcrush at 144hz
> -much more SDR brightness. It seems like to be 600nits as tftcentral reported
> -better White, but maybe it is becouse of higher brightness
> 
> Ask anything you like Free
> 
> 
> 
> Where is the new option for HDR logo in osd?
Click to expand...

There is a "warning message HDR" on/off option.

Also, there is a new option called "DP SDR YCbCr sRGB Gamma".


----------



## acmilangr

CallsignVega said:


> I've never seen a monitor when set to full brightness not be too bright in the dark...


It is much brighter now.


----------



## fleggy

Please, could anybody test the game MGS V: The Phantom Pain on this monitor with the newer FW? Just choose Fast FALD, start the game, exit back to the desktop, and watch the wallpaper while moving the mouse. I can see strange colored stains/areas. Asus support (EMEA) suggested I RMA the monitor, which is nonsense, I believe. I had a second unit, and both units (1st FW release) have this bug. Thanks.

BTW, one must unplug the monitor to get rid of the bug. Just toggling power off/on is not enough.


----------



## axiumone

fleggy said:


> Please, could anybody test the game MGS 5: The Phantom Pain on this monitor with the newer FW? Just choose Fast FALD, start the game, exit back to desktop and watch the wallpaper while moving mouse. I can see strange colored stains/areas. Asus support (EMEA) suggested me to RMA the monitor what is nonsense, I believe. I had a second unit and both units (1st FW release) have this bug. Thanks
> 
> BTW one must unplug the monitor to get rid of the bug. Just toggle Power Off/On is not enough.


No issues on my end. New FW.


----------



## fleggy

axiumone said:


> No issues on my end. New FW.


Thank you, that's great 
Other settings on default? (SDR, sRGB, Race mode, etc...)


----------



## acmilangr

These days I'm playing Shadow of the Tomb Raider, and I think there's no other game where HDR works as well as it does here. There's no scene that blinds your eyes; they took care of everything. I was in a cave staring at the exit, and it was really bright outside (in about 15% of the scene). When I was exiting the cave, the brightness was gradually reduced to avoid blinding my eyes. Really exciting.

Sorry for my bad English.


----------



## skingun

Playing lots of Destiny 2 on mine 😁


----------



## profundido

fleggy said:


> Please, could anybody test the game MGS 5: The Phantom Pain on this monitor with the newer FW? Just choose Fast FALD, start the game, exit back to desktop and watch the wallpaper while moving mouse. I can see strange colored stains/areas. Asus support (EMEA) suggested me to RMA the monitor what is nonsense, I believe. I had a second unit and both units (1st FW release) have this bug. Thanks
> 
> BTW one must unplug the monitor to get rid of the bug. Just toggle Power Off/On is not enough.


I've seen this phenomenon on my X27, which still has the old firmware, as well. The FALD gets trapped in a "faulty" state, and after some troubleshooting I found that using the OSD to switch to another FALD mode and then back to the original one fixes the problem. You could check whether that's the case for you too. If so, the new firmware at the end of the year should most likely fix it.


----------



## kx11

skingun said:


> Playing lots of Destiny 2 on mine 😁



Man, it looks good in 4K HDR, even though I notice some sudden fps drops; most likely HDR is causing that.


----------



## fleggy

profundido said:


> I've seen this phenomena on my X27 which still has the old firmware as well. the FALD gets trapped into a "faulty" state and after some troubleshooting I found that using the OSD to switch to another FALD mode and then back to the same one fixes the problem. You could see if that's the case for you too. That means the new firmware at the end of the year should fix it most likely.



I tried everything, including this, with no success. Thanks for the hint anyway. I hope Asus will release its FW updater by the end of the year.


----------



## profundido

tinykitten said:


> It looks fantastic on this monitor.
> 
> 
> I encountered a weird bug though I believe or I'm stupid and am missing something; everytime I open the map or menu or click on a camp my monitor goes black for a second and comes back with the HDR On overlay. It's somewhat annoying to be honest, not sure if I'm the only one with this problem. I'm using the latest Nvidia drivers, old PG27UQ firmware, 98hz 10bit. HDR on in Windows before I start the game; Exclusive Full Screen on or off didn't seem to make a difference. Any ideas or pointers on what I might be missing here? I'm assuming that this isn't intended.


Now that I've finished the game, I've made some time to troubleshoot this. It turns out HDR (most likely in combination with the Nvidia drivers and my current old X27 firmware) causes this. When I turn HDR off in Windows and play the game, the issue is completely nonexistent, and my monitor also doesn't display that "DP input detected" mode-switching thing.

I suppose this too might get solved by the new firmware at the end of the year and/or new drivers.


----------



## kx11

Anyone got Shadow of the Tomb Raider?

If yes, please check out this issue:

Set the display refresh rate to 98Hz, then enable HDR to get HDR10. Run the game and turn HDR on. In-game, go to an open place where you can see the sky, move the camera around, and try to confirm that there's color banding happening only on the skybox.

I noticed it myself, and I'm 99% sure it's the game, not my display.


----------



## axiumone

fleggy said:


> Thank you, that's great
> Other settings on default? (SDR, sRGB, Race mode, etc...)



Yeah, all other stuff is default.



kx11 said:


> anyone got shadow of the tomb raider ??
> 
> 
> if yes please check out this issue
> 
> 
> 
> set the display refresh to 98hz then enable HDR to get HDR10 , run the game and turn HDR ON , in-game go to an open place where you can see the sky , move the camera around and try to confirm there's a color banding happening only on the sky box
> 
> 
> 
> 
> 
> i noticed it myself and i'm 99% sure it's the game not my display



Yep, I can confirm. I also see color banding on the skybox.


----------



## kx11

thank you sir


----------



## acmilangr

Has anyone finally succeeded in making Forza Motorsport 7 work in HDR with correct colors?


----------



## kx11

acmilangr said:


> Anyone finally succeed to make Forza motorsport 7 works in hdr with correct colors?





Just checked that out and it looked good, in HDR10 that is.

I had to turn HDR on in the game, then restart the game.


----------



## smushroomed

Hello all, I'm coming from an Acer X34 (3440x1440, 100Hz, G-Sync) to this monitor and I have a few questions.

I'm 70%/30% competitive FPS vs. triple-A gaming, with general desktop usage otherwise. I mostly play R6: Siege/OW and the latest games mixed in. I'm moving from a 1080 Ti to a 2080 Ti. I'm excited to have an HDR high-fps monitor.

I would want to keep the monitor in 4:2:2 for 144Hz 4K G-Sync HDR most of the time. Is 4:2:2 on a desktop that bad?
Is it a pain to switch from HDR and SDR gaming back to desktop?
Is it a pain to change settings on the monitor?
Do I need to keep switching settings in Windows 10?

How has HDR PC gaming output been?
Have most triple-A games since Mass Effect: Andromeda been HDR-capable?

Does it feel like a first-gen product? I purchased the X34 on release date and it was great until now (3 years old, no real problems).
I'm simply over ultrawide gaming; I vastly prefer high-fps gaming instead. Would this monitor be okay?


----------



## acmilangr

kx11 said:


> acmilangr said:
> 
> 
> 
> Anyone finally succeed to make Forza motorsport 7 works in hdr with correct colors?
> 
> 
> 
> 
> 
> 
> just checked that out and it looked good , on HDR 10 that is
> 
> 
> 
> i had to turn HDR on in the game then restart the game

Are you sure? Hdr works but the colors are not as vivid as without hdr. Check for example a red car.


----------



## animeowns

profundido said:


> Now that I finished the game I have made some time to troubleshoot this  Turns out HDR (most likely in combination with nvidia drivers and my current old X27 firmware) causes this. When I turn HDR off in windows and play the game this issue is completely nonexistent and my monitor also doesn't display that "DP input detected" mode-switching thing.
> 
> I suppose this too might get solved by the new firmware at the end of the year and/or new drivers.


ASUS has already stated they are releasing a downloadable firmware update sometime this year, but if you want the update right now you have to ship your panel to them. They won't charge you for the shipping, but I will be patient and wait for the download utility.


----------



## acmilangr

I have connected a PS4 Pro to the monitor and I have the same problem.

In Assassin's Creed Odyssey, when I activate HDR it has washed-out colors like many games on PC.


----------



## dboythagr8

You have to turn off RTSS for HDR to properly work in AC Odyssey, just an FYI for the thread.

I'm running into a different problem. I play in HDR @ 98Hz. I'm having an issue where it shows 98Hz on the desktop, but as soon as I get into AC Odyssey, the panel goes to 120Hz, even though I have 98Hz selected in AC:O. Has this happened to anybody else in a game?


----------



## acmilangr

dboythagr8 said:


> You have to turn off RTSS for HDR to properly work in AC Odyssey, just an FYI for the thread.
> 
> I'm running into a different problem. I play in HDR @ 98hz. I'm having an issue where it shows 98hz on the desktop, but as soon as I get into AC Odyssey, the panel goes to 120hz, even though I have 98hz selected in AC O? Has this happened to anybody else in a game?


What????? Was the RTSS the reason for washed out colors????? Really????


----------



## acmilangr

It works!!
Now Forza Motorsport 7 and Origins work great without washed-out colors. And yes, the reason was RTSS!!!!
Thanks mate


----------



## deadchip12

dboythagr8 said:


> You have to turn off RTSS for HDR to properly work in AC Odyssey, just an FYI for the thread.
> 
> I'm running into a different problem. I play in HDR @ 98hz. I'm having an issue where it shows 98hz on the desktop, but as soon as I get into AC Odyssey, the panel goes to 120hz, even though I have 98hz selected in AC O? Has this happened to anybody else in a game?


What is rtss?


----------



## acmilangr

deadchip12 said:


> dboythagr8 said:
> 
> 
> 
> You have to turn off RTSS for HDR to properly work in AC Odyssey, just an FYI for the thread.
> 
> I'm running into a different problem. I play in HDR @ 98hz. I'm having an issue where it shows 98hz on the desktop, but as soon as I get into AC Odyssey, the panel goes to 120hz, even though I have 98hz selected in AC O? Has this happened to anybody else in a game?
> 
> 
> 
> What is rtss?

RivaTuner Statistics Server


----------



## CallsignVega

acmilangr said:


> It works!!
> Now forza motorsport 7 and origins works great without washed out colors. And yes the reasons was RTSS!!!!
> Thanks mate


Oh really... I will have to investigate this. I wonder if some of the color issues I've been seeing with Win 10 HDR have been RTSS this whole time. I always use RTSS..


----------



## acmilangr

CallsignVega said:


> acmilangr said:
> 
> 
> 
> It works!!
> Now forza motorsport 7 and origins works great without washed out colors. And yes the reasons was RTSS!!!!
> Thanks mate
> 
> 
> 
> Oh really... I will have to investigate this. I wonder if some of the color issues I've been seeing with Win 10 HDR have been RTSS this whole time. I always use RTSS..

Forza Motorsport 7 (washed-out colors)
Assassin's Creed Origins (very dark colors)
Injustice 2 (washed-out colors)
Call of Duty WW2 (washed-out colors)

All of them have been fixed by just closing RTSS.


----------



## fleggy

Hi everyone, RTSS can be used, but you have to create a profile for AC:O with Stealth mode ON (maybe not necessary) and Custom Direct3D support ON. Then HDR works even while RTSS is running.
BTW, does anybody know where AC: Odyssey stores the HDR Luminance values? They revert to defaults every time the game starts.


----------



## acmilangr

I tried this on FM7 and it works. Thanks

Only one game now has a problem with HDR: Hitman. It gives a black screen on launch.


----------



## Morkai

dboythagr8 said:


> You have to turn off RTSS for HDR to properly work in AC Odyssey, just an FYI for the thread.
> 
> I'm running into a different problem. I play in HDR @ 98hz. I'm having an issue where it shows 98hz on the desktop, but as soon as I get into AC Odyssey, the panel goes to 120hz, even though I have 98hz selected in AC O? Has this happened to anybody else in a game?


In the NVIDIA panel you probably have the refresh rate globally set to "highest available"; that will override whatever you set in-game. Change it to application-controlled.


----------



## kx11

acmilangr said:


> I tried this on FM7 and it works. Thanks
> 
> Only one game now has problem with HDR. Hitman. It gives black screen on launch.



Yeah, HITMAN doesn't run that well with HDR; however, I heard playing it in borderless mode can help.


----------



## Fraizer

acmilangr said:


> I have a new ASUS PG27UQ and I can confirm it has the new firmware:
> - New option in the OSD for the HDR logo (enable or disable)
> - Fixed black crush at 144Hz
> - Much more SDR brightness; it seems to be 600 nits, as TFT Central reported
> - Better white, but maybe that is because of the higher brightness
> 
> Feel free to ask anything



Can you please tell me where you ordered it? Which country? If Amazon, was it from Amazon directly or a seller on Amazon?

Thanks for all your posts in this thread


----------



## Seyumi

Can anyone with SLI GTX 1000 series or SLI RTX 2000 series cards test GPU usage/FPS with G-Sync on/off? I'm reading there's been a problem for over 2 years, ever since the GTX 1000 series, where using SLI + G-Sync at the same time lowers overall GPU usage and results in lower FPS than with G-Sync off.

Since this monitor uses the newer G-Sync HDR module, I was wondering if this issue is still around or not. I'm about to upgrade to either the Predator X27 or the Asus version with SLI 2080Ti but I may not bother with the SLI if I'm going to get worse overall performance + the usual gripes & complaints with SLI.

This isn't the usual "well maybe the game isn't optimized for SLI" type of issue. It's straight up 99%/99% GPU usage on the same game with G-Sync off, then <90%/<90% GPU usage with G-Sync on type of issue.


----------



## axiumone

Seyumi said:


> Can anyone with SLI GTX 1000 series or SLI RTX 2000 series cards test GPU usage/FPS with G-Sync On/Off? I'm reading there's been a problem for over 2 years ever since the GTX 1000 series that using SLI + G-Sync on at the same time lowers overall GPU usage and results lower FPS than with G-Sync off.
> 
> Since this monitor uses the newer G-Sync HDR module, I was wondering if this issue is still around or not. I'm about to upgrade to either the Predator X27 or the Asus version with SLI 2080Ti but I may not bother with the SLI if I'm going to get worse overall performance + the usual gripes & complaints with SLI.
> 
> This isn't the usual "well maybe the game isn't optimized for SLI" type of issue. It's straight up 99%/99% GPU usage on the same game with G-Sync off, then <90%/<90% GPU usage with G-Sync on type of issue.


Actually, it looks like the issue may have been fixed on Turing. I just tested on the latest drivers and there doesn't seem to be a performance drop anymore with SLI + G-Sync.


----------



## jesyjames

axiumone said:


> Actually, it looks like the issue may have been fixed on turing. I just tested on latest drivers and there doesn't seem to be a performance drop anymore with sli gsync.


What about some of the SLI HDR issues people reported? Everything seem okay there?


----------



## bmgjet

bmgjet said:


> I bought mine day 1, but still filled in all the info they needed, a picture of the serial number and a screenshot of the quote (accidentally cropped out the date), so let's see if they email me a code lol.


Got code emailed today.


----------



## fleggy

Sorry, what code are you talking about?


----------



## bmgjet

fleggy said:


> Sorry, what code are you talking about?



COD BO4 Promotion for all ROG devices.


----------



## acmilangr

Newegg USA


----------



## acmilangr

Fraizer said:


> acmilangr said:
> 
> 
> 
> I have a new ASUS PG27UQ and I can confirm it has the new firmware:
> - New option in the OSD for the HDR logo (enable or disable)
> - Fixed black crush at 144Hz
> - Much more SDR brightness; it seems to be 600 nits, as TFT Central reported
> - Better white, but maybe that is because of the higher brightness
> 
> Feel free to ask anything
> 
> 
> 
> 
> Can you please tell me where you ordered it? Which country? If Amazon, was it from Amazon directly or a seller on Amazon?
> 
> Thanks for all your posts in this thread

Newegg USA


----------



## Glerox

Quick question for my fellow PG27UQ brothers.

Often when the monitor goes to sleep and I wake it up, it says no DP signal detected... and I have to hard restart my PC each time... 
Anyone had this issue and found a solution?

Thanks!


----------



## kx11

Glerox said:


> Quick question for my fellow PG27UQ brothers.
> 
> Often when the monitor goes to sleep and I wake it up, it says no DP signal detected... and I have to hard restart my PC each time...
> Anyone had this issue and found a solution?
> 
> Thanks!





This issue started with the October 2018 update for Windows 10; before that it wasn't happening.


----------



## Fraizer

acmilangr said:


> Newegg usa


Great, thank you

Can you tell me which version you have? Mine is J7, which is normally July 2018.

Compared to my 2 defective ones from June 2018, I have 2 new options in the settings but don't know what to do with them ^^


----------



## MiniZaid

Seyumi said:


> Can anyone with SLI GTX 1000 series or SLI RTX 2000 series cards test GPU usage/FPS with G-Sync On/Off? I'm reading there's been a problem for over 2 years ever since the GTX 1000 series that using SLI + G-Sync on at the same time lowers overall GPU usage and results lower FPS than with G-Sync off.
> 
> Since this monitor uses the newer G-Sync HDR module, I was wondering if this issue is still around or not. I'm about to upgrade to either the Predator X27 or the Asus version with SLI 2080Ti but I may not bother with the SLI if I'm going to get worse overall performance + the usual gripes & complaints with SLI.
> 
> This isn't the usual "well maybe the game isn't optimized for SLI" type of issue. It's straight up 99%/99% GPU usage on the same game with G-Sync off, then <90%/<90% GPU usage with G-Sync on type of issue.


G-Sync with HDR in Battlefield 1 gives dreadful performance. But using either one alone is fine.
Depends on the game.


----------



## acmilangr

Fraizer said:


> acmilangr said:
> 
> 
> 
> Newegg usa
> 
> 
> 
> great thank you
> 
> Can you tell me which version you have? Mine is J7, which is normally July 2018.
> 
> Compared to my 2 defective ones from June 2018, I have 2 new options in the settings but don't know what to do with them ^^

It says July 2018


----------



## Fraizer

We have the same one then 

Did you change the settings of those 2 new options in system setup?

For the brightness, do you put it at 100? For me that's too bright; I set it at 80....

As for the fan, I notice it is louder than on my screen from June 2018... :/

I am trying to find a way to replace it because it is louder than my computer.....


----------



## Fraizer

I have exchanged this monitor 3 times and so far I can't get more than 98Hz -_- -_-

I uninstalled all video drivers with a driver cleaner in safe mode, fully updated Windows 10 Pro, changed the DP cable, updated the DP firmware from NVIDIA for my EVGA Titan X (not the Xp), etc., etc.


Can someone tell me if the Titan X is capable of 120Hz or 144Hz?... :/

Oh, I forgot to say: when I go to the area where you can choose to enable or disable HDR, that option line just doesn't appear at all in Windows 10 Pro (latest update).


----------



## Glerox

kx11 said:


> this issue started with the october 2018 update for windows10 , before it it wasn't happening


Damn ok. I will update if I find a solution


----------



## acmilangr

Fraizer said:


> we have the same then
> 
> do you change the setting of those 2 new options in system setting ?
> 
> for the brightness you put it at 100 ? for me is too bright i set it at 80....
> 
> for the fan i notice is more loud than my screen of june 2018... 😕
> 
> i am trying to found a way to replace it because is more loud than my computer.....


What options?
During the day I put it at 60 or 70.
At night, about 50.

This monitor is really too bright!


----------



## Fraizer

In System Setup -> DP SDR YCbCr sRGB Gamma, do you leave it at the default ON or set it to OFF?


Yes, it is now very bright; I am at 80.


- Is the fan not too loud for you? For me, yessss.... :/

- When I turn off the computer, the fan of the monitor stays on even 3 hours later... for you too?


----------



## acmilangr

Fraizer said:


> on System setup -> DP SDR YCbCr sRGB Gamma you let by default ON or you put Off ?
> 
> 
> yes now is very bright i am at 80
> 
> 
> - the fan is not to louuuuud for you ? me yessss.... 😕
> 
> - when i turn off the computer the fan of the monitor stay On even 3 hours later... for you too ?


Actually, I don't know about this option either. I don't see any difference when I change it.

No, I don't hear the fans.


----------



## skingun

I hear the fan when the monitor is on but when I press the power button to turn it off, the fan stops after 5-10 minutes.


----------



## profundido

Fraizer said:


> i change this monitor 3 times and till now i cant have more than 98Hz -_- -_-
> 
> uninstall all video driver with display cleaner uninstall in safe mode, update all windows 10 pro.... chang the DP cable, i update the DP firmware 1.0 from nvidia for my EVGA Titan X (not the xp) etc... and etc...
> 
> 
> someone can tell me if the Titan X is compatible to have 120Hz or 144 hz ?... :/
> 
> oh i forget to say when i go to the area where we can choose to activate or disable the HDR this option ligne just dosent appear at all on windows 10 pro (last update windows 10 pro)


Yes Fraizer, it's perfectly compatible: I have the X27 monitor in combination with 2 Titan X (Pascal) in SLI, and selecting 144Hz works in the NVIDIA Control Panel, although I prefer not to since my old firmware gives black crush etc. at 144Hz, so I set mine at 120Hz 8-bit RGB color.

Try the following: select 8-bit color and RGB and press Apply, then OK to close the NVIDIA Control Panel. Open the NVIDIA Control Panel again and see if 144Hz is available for selection now.


----------



## profundido

Fraizer said:


> on System setup -> DP SDR YCbCr sRGB Gamma you let by default ON or you put Off ?
> 
> 
> yes now is very bright i am at 80
> 
> 
> - the fan is not to louuuuud for you ? me yessss.... :/
> 
> - when i turn off the computer the fan of the monitor stay On even 3 hours later... for you too ?


The loudness of the fan is hit and miss and seems to differ from monitor to monitor on both the Acer X27 and ASUS PG27UQ.

The firmware is supposed to keep the fan running until the temperature inside the monitor drops below a threshold, at which point it stops. This is usually within 10 min, never 30 min or longer, unless you have your monitor up against a heater or something lol


----------



## stefxyz

Fraizer said:


> i change this monitor 3 times and till now i cant have more than 98Hz -_- -_-
> 
> uninstall all video driver with display cleaner uninstall in safe mode, update all windows 10 pro.... chang the DP cable, i update the DP firmware 1.0 from nvidia for my EVGA Titan X (not the xp) etc... and etc...
> 
> 
> someone can tell me if the Titan X is compatible to have 120Hz or 144 hz ?... :/
> 
> oh i forget to say when i go to the area where we can choose to activate or disable the HDR this option ligne just dosent appear at all on windows 10 pro (last update windows 10 pro)


I had it running on the Pascal Titan X (not P) before the 2080 Ti with no issue. I can change between 120 and 98Hz (of course, 120 is not available if I set 10-bit and 4:4:4).

PS: The new Tomb Raider looks spectacular on this monitor.


----------



## Fanu

profundido said:


> the loudness of the fan is hit and miss and seems to be different from monitor to monitor on both the Acer X27 and Asus PG27UQ
> 
> The firmware is supposed to keep the fan running until the temperature inside the monitor drops below threshold when it stops. This is usually within 10min. never 30min or longer unless you have your monitor up against a heater or something lol


I don't understand why the fan has to keep working after the monitor turns off.

The monitor (or the G-Sync module being cooled) will cool naturally over time after turning off - it's not powered anymore, so why should electricity be wasted by actively cooling it?

The fan still running after the monitor shuts down would only make sense if the G-Sync module reached unsafe temps and needed to be cooled down to prevent hardware damage - but if the G-Sync module can reach unsafe temps in the first place, then that fan obviously isn't adequate.


----------



## CallsignVega

It is because the heat-sinks are very small/thin to fit in a monitor chassis. If the fan isn't actively spinning on a tiny heat-sink, the temperature will spike as the stored heat load is still there after the monitor is turned off. 

Plenty of engineered items in the world have fans that still run after the device is turned off.


----------



## Fraizer

profundido said:


> yes Fraizer, it's perfectly compatible: I have the X27 monitor in combination with 2 Titan X (Pascal) in SLI and it works to select 144hz in nvidia control panel although I prefer not to since my old firmware gives black crush etc at 144hz so I prefer to set mine at 120hz 8-bit RGB color.
> 
> Try the following: Select 8-bit color and RGB and press apply then ok to close the nvidia control panel. Open the nvidia control panel again and see if 144hz is available for selection now ?


Where in the NVIDIA Control Panel can I select 8-bit? I can't find it ^^

A screenshot of what I'm supposed to set before clicking Apply would be super great ^^


----------



## fleggy

Fraizer said:


> where in nvidia control panel i can select 8 bits i dont found this


IIRC you can find this under Desktop resolution (or a similar entry in the tree). First set the refresh rate to 98Hz (click Apply!), then switch colors from default to NVIDIA and finally choose the color depth (8/10).


----------



## Fraizer

As you can see, after I choose NVIDIA color I can't change any option, because there is only 1 choice per field, and 98Hz maximum :/

Like I said, I had this on 3 identical ASUS units.

Windows 10 fully updated, and I flashed my Titan X with the latest firmware from NVIDIA for the DisplayPort.

I changed the DP cable, the power supply of the monitor, and the DP port on the video card; latest NVIDIA driver...

My previous ASUS monitor, a 3D Vision one, did 120Hz over a DVI port without any issue.


----------



## skingun

Select RGB colour. You've got it set to sub-sampling.


----------



## sblantipodi

I don't want to be a basher, this monitor is surely an interesting one, but how do you manage the wide gamut on this monitor?
I am comparing my EIZO S2433W wide gamut, calibrated with a colorimeter, against the standard-gamut Acer XB271HK,
and most software that is not color-managed (YouTube, games, etc.) looks so off in colors on the wide-gamut monitor.

How do you manage the wide gamut on these monitors?


----------



## jesyjames

sblantipodi said:


> I don't want to be a basher, this monitor is surely an interesting one but how to manage the wide gamut on this monitor?
> I am comparing my EIZO S2433W wide gamut calibrated with a colorimeter with the Acer XB271HK standard gamut
> and most of the software that are not managed, youtube, games ecc looks so off in colors on the wide gamut monitor.
> 
> How you manage the wide gamut on these monitors?


You simply set the Display SDR input to sRGB or Wide Gamut. Does a good job.


----------



## sblantipodi

> You simply set the Display SDR input to sRGB or Wide Gamut. Does a good job.


I have seen a lot of sRGB presets on dozens of wide-gamut monitors and none came anywhere near reducing the oversaturation of the wide-gamut panel.
How can this one be different?

I have an EIZO calibrated with a colorimeter and I see terrible oversaturation in games, YouTube, videos and all color-unmanaged applications.


----------



## fleggy

No oversaturated colors here in sRGB mode. Same on my second monitor (HP LP2480zx Dreamcolor) in sRGB mode. I can see oversaturated colors in native (wide gamut) mode using sRGB ICC profile and unmanaged applications.


----------



## Fraizer

skingun said:


> Select RGB colour. You've got it set to sub-sampling.


Hm, I configured nothing; it was like that by default.

Where do I go to select RGB? Is it in the NVIDIA panel or on the monitor?
On the monitor I have -> Information -> EOTF = SDR-sRGB and Range = Limited range.

In the NVIDIA Control Panel, when I select NVIDIA color I can't modify anything; there is only a single choice for each of the 4 options = 32-bit, 8 bpc, YCbCr422, limited.

If what you are talking about is the area where it displays "YCbCr422", as you can see on my screenshot, I just can't change it; as I said, I can choose nothing else... I don't know why... maybe I need to unlock something somewhere?

And I have a problem with HDR: the option doesn't show up at all in the display settings; it just doesn't appear (I made a screenshot too).

I already applied this update a while ago: https://www.nvidia.com/object/nv-uefi-update-x64.html

EDIT:

I changed the resolution to 60Hz and now it displays RVB rather than RGB (I think the French translation turns the G for green into V for vert... my god...), plus 2 other options: YCbCr422 & YCbCr444. And I now have 8 bpc and 10 bpc, and limited or full.


----------



## Fraizer

profundido said:


> yes Fraizer, it's perfectly compatible: I have the X27 monitor in combination with 2 Titan X (Pascal) in SLI and it works to select 144hz in nvidia control panel although I prefer not to since my old firmware gives black crush etc at 144hz so I prefer to set mine at 120hz 8-bit RGB color.
> 
> Try the following: Select 8-bit color and RGB and press apply then ok to close the nvidia control panel. Open the nvidia control panel again and see if 144hz is available for selection now ?


OK, so I finally managed to select RGB (in French RVB... V = vert, green... crazy to translate that...), but to do that I had to set 60Hz.

Unfortunately, even after closing the control panel and launching it again, I can set a maximum of 98Hz, and when I do that I no longer have the choice of RGB; it automatically selects "YCbCr422".

I set 144Hz in the monitor's settings and turned it off again = same result...

I need 120Hz; I am not really interested in 144.

I hope with your help, guys, I can get my 120Hz... :/


----------



## fleggy

I would try a better DP cable. And did you update the DP firmware on your graphics card?


----------



## Fraizer

I changed the DP cable 3 times (2 from the 2 ASUS monitors I had and 1 expensive Lindy 1.4), and like I said, I updated the DP firmware with this: https://www.nvidia.com/object/nv-uefi-update-x64.html

I also changed the DP port on my Titan X.


----------



## fleggy

It looks like your Titan X supports DP 1.2. Your results match the DP 1.2 specification.


----------



## Fraizer

You mean 1.3?

@profundido said it is working well with the Titan X... and I updated the firmware for 1.4 and it looked successfully updated.

Is there an area or a piece of software that shows whether I am actually at DP 1.2 or 1.4?

Thank you

Oh, and like I said, the HDR option doesn't display at all.

I found this: "The Titan X is DisplayPort 1.2 certified, DisplayPort 1.3 and 1.4 ready enabling support for 4K display at 120hz, 5K displays at 60hz and 8k Displays at 60hz (with two cables)."


----------



## fleggy

I think you have the oldest Titan X model, which supports only DP 1.2 according to NVIDIA's specification. Check the wiki and you will see that your card acts as DP 1.2.
Could you test another card?


----------



## Fraizer

Hi

Where did you see that it is 1.2 on the wiki? Because yes, the card launched at 1.2 but was described as 1.4 capable.

I found this, where it is clearly written DP 1.3/1.4, and I bought mine like 6 months after the official launch of the Titan X: https://nvidia.custhelp.com/app/ans...e-update-for-displayport-1.3-and-1.4-displays

Unfortunately, this is the only gaming card I have; the others are too old and not gaming cards.


I found this too:
https://gtrusted.com/nvidia-titan-x

-->

""NVIDIA Titan X- First Graphics Card with DisplayPort 1.4 Support

The Titan X using NVIDIA's new Pascal GPU architecture is not only one of the most powerful graphics cards released, but it's one of the first products released in the market that support DisplayPort 1.4.

DisplayPort 1.4 supports 8.1 Gb/s per lane which at 4 lanes provides 32.4 Gb/s of throughput compared to the DisplayPort 1.2 which supports 5.4 Gb/s per lane. DisplayPort 1.4 also supports Display Stream Compression 1.2 to support 8K (7680×4320) at 60 hz with 10-bit color and HDR, or 4K (3840×2160) at 120 hz with 10-bit color and HDR. Don't expect DisplayPort 1.4 displays to come on line for a while but we'll be looking forward to test them with the NVIDIA Titan X.

The NVIDIA Titan X also supports HDMI 2.0 which only supports up to 4K at 60 hz resolutions and HDR.""
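The quoted bandwidth figures line up with what people are seeing in this thread. As a back-of-the-envelope sketch (ignoring blanking intervals, so real requirements are somewhat higher), you can compare each mode's uncompressed data rate against the usable DP payload after 8b/10b encoding:

```python
# Back-of-the-envelope DisplayPort bandwidth check (ignores blanking,
# so actual requirements are a bit higher than these numbers).
def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

# Usable payload: lane rate x 4 lanes x 0.8 (8b/10b encoding efficiency)
dp12_payload = 5.4 * 4 * 0.8   # HBR2: ~17.3 Gbps
dp14_payload = 8.1 * 4 * 0.8   # HBR3: ~25.9 Gbps

uhd_144_10bit = data_rate_gbps(3840, 2160, 144, 10)  # ~35.8 Gbps
uhd_120_8bit  = data_rate_gbps(3840, 2160, 120, 8)   # ~23.9 Gbps
uhd_98_10bit  = data_rate_gbps(3840, 2160, 98, 10)   # ~24.4 Gbps
```

So 4K 144Hz 10-bit RGB overshoots even DP 1.4's payload, which is why the panel falls back to YCbCr 4:2:2 there, while 120Hz 8-bit and ~98Hz 10-bit both squeeze under HBR3. None of these 4:4:4 modes fit a DP 1.2 link (~17.3 Gbps), which matches the Maxwell Titan X behaviour being discussed above.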


----------



## fleggy

Hi,

here you are: https://en.wikipedia.org/wiki/DisplayPort#Resolution_and_refresh_frequency_limits
Check your chip in GPU-Z. GM200-400 (first Titan X Maxwell) supports only DP 1.2


EDIT: you can check your current DP connection in GPU-Z on tab Advanced. For example my parameters (2080 ti, 4K, 120Hz, 8bit full RGB):

Link rate (current): 8.1Gbps
Link rate (max): 5.4Gbps
Lanes (current): 4
Lanes (max): 4
Color Format: RGB
Dynamic Range: VESA
Bit-per-Color: 8
etc...


----------



## Fraizer

Hi

On your link we can see frequencies, but nothing about the Titan X.

How is the second version of the Titan X written?

This is my screenshot


----------



## fleggy

Hi,

Your GPU is GM200 - the Maxwell version of the Titan X (the very first one). The newer Titan X (Pascal) and Titan Xp have GP102.
It is very simple:
GM200 - DP 1.2
GP102 - DP 1.4


----------



## sblantipodi

Is there someone who switched from an sRGB-gamut display to a PG27UQ who can tell me how to manage the oversaturation of the wide gamut?


----------



## Fraizer

oh ****...

thank you fleggy


----------



## Fraizer

Has anybody tried to change the loud fan? I have watercooling with a MO-RA3 rad with 18 fans and I don't hear them compared to the fan of the monitor...

https://www.cnews.cz/4k-144hz-gsync-hdr-asus-rog-pg278uq-rozborka


----------



## sblantipodi

Fraizer said:


> any body try to change the loud fan ? i have watercooling with MO-RA3 rad with 18fans and i dont hear them compare to the fan of the monitor...
> 
> https://www.cnews.cz/4k-144hz-gsync-hdr-asus-rog-pg278uq-rozborka


ridiculous design


----------



## Fraizer

Cheap ***** for a crazy expensive monitor... they made a duct with tape...

If somebody could do better and integrate it inside, it would give better cooling and be quieter.


----------



## Glerox

sblantipodi said:


> Is there someone that switched from a sRGB gamut display to a pg27uq that can tell me how to manage the over saturation of the wide gamut?


You have to set the SDR display input setting to sRGB, otherwise colors are wrong/oversaturated because Windows displays sRGB content.
Keep the wide gamut only for HDR content.
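A small sketch of why unmanaged sRGB content oversaturates in wide-gamut mode: the panel interprets sRGB code values against its own (roughly DCI-P3) primaries, so e.g. "full red" lands on the more saturated P3 red instead of the sRGB red. The chromaticity values below are the standard published ones; the distance-from-white measure is just a crude saturation proxy:

```python
# Sketch: unmanaged sRGB red on a wide-gamut (approx. DCI-P3) panel.
# CIE 1931 xy chromaticities (standard published values).
srgb_red = (0.640, 0.330)   # sRGB / Rec.709 red primary
p3_red   = (0.680, 0.320)   # DCI-P3 red primary
d65      = (0.3127, 0.3290) # D65 white point

def dist_from_white(xy):
    """Crude saturation proxy: chromaticity distance from the white point."""
    return ((xy[0] - d65[0]) ** 2 + (xy[1] - d65[1]) ** 2) ** 0.5

# The same (255, 0, 0) pixel, sent unmanaged, renders at the panel's
# native red primary - farther from white, i.e. visibly more saturated.
oversaturation = dist_from_white(p3_red) / dist_from_white(srgb_red)
```

The monitor's sRGB emulation mode exists precisely to remap those code values back to the sRGB primaries, which is why it kills the oversaturation where ICC profiles alone (which unmanaged apps ignore) cannot.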


----------



## Kommando Kodiak

Never mind, it was resolved.


----------



## NewType88

Fraizer said:


> any body try to change the loud fan ? i have watercooling with MO-RA3 rad with 18fans and i dont hear them compare to the fan of the monitor...
> 
> https://www.cnews.cz/4k-144hz-gsync-hdr-asus-rog-pg278uq-rozborka


I don't think those types of fans are meant for silence, or can be found in quiet versions. I just hope they come up with a better solution for the 32-inch miniLED.


----------



## Fraizer

To replace it with a quieter one.

I want to open up the screen but I don't see where exactly... I have disassembled many laptops, where I know where to unscrew and where to put pressure on the plastic so as not to cause damage, but here.....


----------



## acmilangr

Fraizer said:


> to replace it with another more silent
> 
> i want to unmount the screen but i dont see where exactly... i unmount many laptop but i know where unscrew and where to put pression on plastic to not make any damage. but here.....


It is really very easy to open this monitor.
After removing the bracket there are no screws, just the plastic.


----------



## kx11

sblantipodi said:


> Is there someone that switched from a sRGB gamut display to a pg27uq that can tell me how to manage the over saturation of the wide gamut?





I used to have a super hard time playing games for more than an hour when I used "wide gamut"; I'd be so tired, and my eyes and head were strained as hell. I switched to sRGB and the colors got quieter and more real, if you ask me, and I'm much more relaxed now while playing games, etc.


----------



## Fraizer

acmilangr said:


> It is really very easy to open this monitor.
> After removing the bracket there is no screws. Just the plastic


Can you mark on a picture where I'm supposed to push?

When you say it's easy, is that because you did it?


----------



## acmilangr

Fraizer said:


> acmilangr said:
> 
> 
> 
> It is really very easy to open this monitor.
> After removing the bracket there is no screws. Just the plastic
> 
> 
> 
> Can you mark on a picture where I'm supposed to push?
> 
> When you say it's easy, is that because you did it?

Yes, I did. I will try to send a photo later.

You need something like this to press between the front and back:
http://shine4ever.gr/s4e/wp-content/uploads/2012/05/guitar-pick-guitar-2.jpg


----------



## Fraizer

Yes, I have different types, even for phones, from iFixit ^^


----------



## Kommando Kodiak

Ran into a very unusual incident just now with this monitor. I left my PC running while I went downstairs to watch the World Series and eat pizza for, say, 2 hours or so. I came back to the screen shut off but the PC running; upon waking the monitor, the graphics were borked -- scrunched together with a vertical pattern to the lettering, NOT ~~graphical striated colors~~ or things like that. It's just related to the text scrunching; it might be one of the plane filter switchers not activating or frozen in position. Anyhow, that's not the weird bit!

So I figured I'd be a smart guy by switching resolutions, forcing the monitor to play around with its scaler and resolve the issue without doing a restart. Well, when I switched to a lower resolution, instead of getting a refreshed picture it went black. While this is odd, I'll just wait the 30 seconds for the monitor to revert.

It doesn't revert.

OK, so I restart by hitting the power button on the motherboard; it restarts, screen still black. I change over to HDMI -- black. I power cycle the monitor -- black. So finally I try to test the HDMI via my Xbox One S, and as soon as it powers on -- the screen functions.

Bizarre


----------



## CallsignVega

This monitor doesn't have a scaler. Honestly, it sounds like your GPU or Windows caused this issue.


----------



## bee144

I turned off HDR in Windows 10 (1809) and now the colors are oversaturated. I've got 32-bit color / 8-bit output / RGB color / full dynamic range selected.

Any idea why this is happening?


----------



## kx11

bee144 said:


> I turned off HDR in Windows 10 (1809) and now the colors are saturated. I’ve got 32 bit color/ 8 bit output/ RGB color/ Full dynamic range selected.
> 
> Any idea why this is happening?





In the monitor OSD go to: System Setup > Display SDR Input > check sRGB.

It should be good now.


----------



## CaliLife17

Just got my monitor (replacing an X27), and it has a screen defect. There's a dark spot on the screen no matter what color slide I put up. Not sure if it's dust or something, but it doesn't look like a pixel issue (the X27 had pixel issues).

EDIT: Found a couple of stuck/dead pixels as well. Going to exchange it and hope for the best on the new one.

You can see it next to the pointer (upper left of the pointer).


----------



## kmetek

Replace it; for that price the monitor should be perfect.


----------



## TheShape1977

bee144 said:


> I turned off HDR in Windows 10 (1809) and now the colors are saturated. I’ve got 32 bit color/ 8 bit output/ RGB color/ Full dynamic range selected.
> 
> Any idea why this is happening?



You have to select 'Use default colour settings' and apply in the NVIDIA CP, then select your NVIDIA colour settings again and apply, and voilà, your colours will be back to normal. This works every time for me when it happens.


----------



## CaliLife17

kmetek said:


> replace it, for such price monitor should be perfect-


Yup, that's the plan. I already went through two X27 monitors, but it seems QC is still an issue. Hopefully the next one will be good.


----------



## animeowns

Has anyone sent their monitor in to Asus for the color-crush fix? Colors get crushed when running the display at 4K 144Hz; they're supposed to release a firmware update to fix the colors at 144Hz refresh.


https://www.asus.com/us/support/FAQ/1036750


----------



## Rob w

I've just picked up the PA329Q. Same here: 27" is small for 4K. If it had been 32" I would have gone for it too.


----------



## Kommando Kodiak

So I had the glitch again. Here's what it looks like, taken with a phone. It actually looks clearer through the phone than in person, so to see the full effect open the image and zoom in.


----------



## bmgjet

That's a Windows DPI bug.
It happened on my old screen as well, until I disabled the screen going to sleep in power management.


----------



## toncij

Rob w said:


> I’ve just picked up the PA329q , same here 27” is small for 4K,if it had been 32” I would have gone for it also.


No HDR and no more than 60Hz on that model? A wholly different market...


----------



## smushroomed

I'm on my second PG27UQ and the fan still stays on after the monitor has been turned off for hours. Is there anything I can do to make sure it stays off?

I e-mailed Asus about my first monitor and they want me to send it in, which I don't want to do. I'm hoping it's an easily fixed firmware issue.

Is it the port I'm using? I haven't tried HDMI only.
Is it the cable I'm using? I've tried different DisplayPort cables.
Is it the USB ports on the monitor? I've unplugged them and left them unused.
Is it the graphics card? I have a 2080 Ti; I could try my Maximus XI Hero's DisplayPort.
Is it the manufacturing date? Both monitors were made in Aug 2018 (was there a box revision as well?)
Is it ANY setting in the OSD? HDMI sleep? sRGB gamut? Brightness? Nits? FALD?

For the price I paid for this monitor it should be near perfect.

How hard is it to open the monitor up and replace the fan? Is it a high-quality Sunon fan?


----------



## Fanu

smushroomed said:


> For the price I paid for this monitor it should be near perfect


I have a hard time thinking of a first-gen bleeding-edge product that had near zero flaws.
Issues were to be expected, especially with gaming monitors, which are some of the most flawed hardware products in recent memory (every one of them has some sort of issue).

IMO, return the monitor and wait for the 2nd or even 3rd gen if you want a good product.


----------



## saltedham

smushroomed said:


> I'm on my second pg27uq and the fan still stays on after it has been turned off for hours. Is there anything I can do to make sure it stays off?
> 
> I've e-mailed asus about my first monitor and they want met to send the monitor in, which I don't want to do. I'm hoping its an easily fixed firmware issue.
> 
> Is it the port I'm using? I haven't tried HDMI only.
> Is it the cable I'm using? I've tried different display cables
> Is it using the USB ports on the monitor? I've unplugged and not used them
> Is it the graphics card? I have a 2080ti, I could try using my max xi hero display port
> Is it the manufacturing date? Both monitors have been made in Aug 2018 (was there a box revision as well?)
> Is it ANY settings in the OSD? HDMI sleep? SRGB gamut?, Brightness? Nits? FALD?
> 
> For the price I paid for this monitor it should be near perfect
> 
> How hard is it to open the monitor up and replace the fan? Is it a high quality sunon fan?


Not sure if it requires all three, but when I have DisplayPort/HDMI deep sleep and eco mode on, the fan will turn off maybe 20 minutes after the monitor is turned off. Helps me sleep; the monitor is only a few feet from my head and I can easily hear it at night.


----------



## Glerox

smushroomed said:


> I'm on my second pg27uq and the fan still stays on after it has been turned off for hours. Is there anything I can do to make sure it stays off?
> 
> I've e-mailed asus about my first monitor and they want met to send the monitor in, which I don't want to do. I'm hoping its an easily fixed firmware issue.
> 
> Is it the port I'm using? I haven't tried HDMI only.
> Is it the cable I'm using? I've tried different display cables
> Is it using the USB ports on the monitor? I've unplugged and not used them
> Is it the graphics card? I have a 2080ti, I could try using my max xi hero display port
> Is it the manufacturing date? Both monitors have been made in Aug 2018 (was there a box revision as well?)
> Is it ANY settings in the OSD? HDMI sleep? SRGB gamut?, Brightness? Nits? FALD?
> 
> For the price I paid for this monitor it should be near perfect
> 
> How hard is it to open the monitor up and replace the fan? Is it a high quality sunon fan?


You have to enable DP and HDMI deepsleep and the fan will stop after 5 to 15 minutes.


----------



## fleggy

Same here - DP deepsleep ON, ECO mode OFF and the fan stops after several minutes.


----------



## toncij

So is it a failure? Or the monitor is actually good for the money?


----------



## fleggy

What do you mean "a failure"?


----------



## kot0005

when is this firmware update coming ? does anyone know ?


----------



## Morkai

kot0005 said:


> when is this firmware update coming ? does anyone know ?


Asus said later this year.. so probably summer if lucky?


----------



## CallsignVega

Actually, it's NVIDIA firmware on the G-SYNC module. I expect the only way we'll get it fixed is to send in our monitors. This is probably so far down NVIDIA's priority list that it has already fallen by the wayside.


----------



## acmilangr

I have a problem with HDR. Most games have washed-out colors: Origins, Odyssey, Battlefield V. Red looks almost orange, and everything is washed out. I can easily see the difference when I choose SDR; everything looks correct.

And no, I don't run RTSS. I uninstalled it.

Any help, please?


----------



## CallsignVega

acmilangr said:


> I have problem on HDR. Most games have washed out colors. Origins, odessay, battlefield V. Red is something like orange, and everything is washed out. I can easy see the difference when i choose SDR, everything are correct.
> 
> And no, i don't run the RTSS. I Uninstalled it.
> 
> 
> Any help please?


This is a well-known problem. Windows 10 HDR is still broken. The only "workaround" I've found is to increase the NVIDIA Control Panel color saturation to around 65%. It may not be perfect (it may clip some bright colors), but it looks a whole lot better than washed-out HDR. Give it a try and report back.


----------



## fleggy

acmilangr said:


> I have problem on HDR. Most games have washed out colors. Origins, odessay, battlefield V. Red is something like orange, and everything is washed out. I can easy see the difference when i choose SDR, everything are correct.
> 
> And no, i don't run the RTSS. I Uninstalled it.
> 
> 
> Any help please?


Did you install the correct ICM file? I have to reinstall it and set calibration profiles after each driver update. BTW, no problems with HDR games (Odyssey, Origins, FC5, Strange Brigade).


----------



## CallsignVega

What ICM file? You aren't supposed to have to load an .icm for this monitor. That also wouldn't explain why the monitor looks proper in SDR but washed out in HDR. My C8 also has the same washed-out HDR, and that's with a fresh Windows and NVIDIA driver install.


----------



## fleggy

I reinstall the monitor using the .INF file (you can find it on the CD-ROM) after each driver update to replace the generic PnP monitor entry in Device Manager. Sometimes the color profile isn't set properly, and without the correct color profile my HDR colors are wrong. Just check which ICM file is set in your default color profile.


----------



## toncij

Has anyone tried to mod those monitors to water-cool them?


----------



## kot0005

fleggy said:


> I reinstall the monitor using .INF file (you can find it on CD-ROM) after each driver update to change generic PNP monitor in Device Manager. Sometimes the color profile is not set properly and without correct color profile my HDR colors are wrong. Just check what ICM file is set in your default color profile.


ICM?? You mean an ICC profile?


----------



## fleggy

Yes, I mean the ICC profile "ASUS ROG PG27U Color Profile,D6500" stored in the file "ROG PG27U.ICM".


----------



## Fraizer

acmilangr said:


> It is really very easy to open this monitor.
> After removing the bracket there is no screws. Just the plastic


 @acmilangr

any news ?


----------



## churgy95

PG27UQ or X27? It's currently discounted, and the X27 is €200 cheaper. Which one should I go with?


----------



## kx11

the cheaper one


----------



## churgy95

Why so?


----------



## animeowns

CallsignVega said:


> Actually it's NVIDIA firmware on the G-Sync module. I expect the only way we will get it fixed is to send in our monitors. This is probably so far down on NVIDIA's priority list it has already fall by the wayside.


If I ever get refunded for my pending broken-TV issue, I'd happily send in my PG27UQ to be the guinea pig for testing the firmware update. I'm going to buy the C8 65 and an extended 3-year warranty with it. By the way, Vega, did you ever get the Wasabi Mango 4K 120Hz panel?


----------



## sblantipodi

OLED IS THE FUTURE 
https://www.rapidtvnews.com/2018112...-screen-burn-in-prevention.html#axzz5Y56wsaTt


----------



## skingun

sblantipodi said:


> OLED IS THE FUTURE
> https://www.rapidtvnews.com/2018112...-screen-burn-in-prevention.html#axzz5Y56wsaTt


I've been waiting since the mid-2000s for OLED technology to advance enough for desktop use. I gave up waiting and bought the PG27UQ.

At this point I think microLED is the way forward, but I expect it to be a long time before that technology appears in our computer monitors.

Fast refresh rates are a must for me.


----------



## kx11

This display got a driver:

https://www.asus.com/Monitors/ROG-SWIFT-PG27UQ/HelpDesk_Download/

I have no idea how to use it though.


----------



## kot0005

kx11 said:


> this display got a driver
> 
> 
> 
> 
> 
> https://www.asus.com/Monitors/ROG-SWIFT-PG27UQ/HelpDesk_Download/
> 
> 
> 
> 
> i have no idea how to use it though


Update it manually from Device Manager, under your monitor. It just looks like a USB driver update.


----------



## fleggy

kot0005 said:


> update it manually from device manager and then ur monitor. This just looks like a usb driver update.


Hmmm, I don't think so. IMHO the HLKX file is probably just output from the Windows HLK (Hardware Lab Kit). The other files are the same as the ones on the CD.


----------



## animeowns

kx11 said:


> this display got a driver
> 
> 
> 
> 
> 
> https://www.asus.com/Monitors/ROG-SWIFT-PG27UQ/HelpDesk_Download/
> 
> 
> 
> 
> i have no idea how to use it though


I installed it, but it doesn't fix the black crush when running the display at 144Hz. The firmware update is supposed to fix the colors; this looks like just an ICC profile along with a driver so Device Manager can recognize the monitor.

https://www.asus.com/support/FAQ/1036750/


----------



## SmoothD

toncij said:


> Has anyone tried to mod those monitors to water-cool them?


Does anyone actually mod the fan on these? Is it possible, or is there any other way to get rid of the fan noise? I got the monitor a week ago and I can't stand the sound the fan makes. Any suggestions or solutions? I'd love to keep it, but the fan is killing my ears. I live in Colombia; I have the monitor but can't return or exchange it now because this was the only unit they had.

Thanks for any help or guidance.


----------



## iamjanco

kot0005 said:


> ICM ?? U mean ICC profile ?


They're not necessarily one and the same: *ICC vs. ICM*; but they both do pretty much the same job.

Oops, didn't notice the date on the post I was responding to; my apologies.


----------



## Fraizer

SmoothD said:


> Does anyone actually mod the fan on these? is it posible? or any other way to get rid of the fan sound?. I got the monitor one week ago and i cant stand the sound the fan makes, any suggestions? Any solutions? i would love to keep it but the fan is killing my ears. I live in Colombia, got the monitor but cant return it now or exchange because this one was the only unit they had.
> 
> thanks for any help or guidance


Same problem here...

I asked how to open it to access the fan, so I can find a replacement with more airflow and less noise, and probably swap the poor aluminium heatsink for a copper one...


----------



## skingun

Fraizer said:


> same problem here...
> 
> i asked how to unmount it to acess to the fan and to found another with more airflow and less noise. and to change probably the bad aluminium rad with an copper one...



Keep us updated.


----------



## Fraizer

skingun said:


> Keep us updated.




A member in this topic promised to show me how to open the screen properly... if I don't get an answer I won't try to open it.


----------



## SmoothD

skingun said:


> Keep us updated.





Fraizer said:


> A member in this topic promised to show me how to open the screen properly... if I don't get an answer I won't try to open it.


waiting for any update, if you can provide the info here on the forum or PM me, that would be awesome.


----------



## Fraizer

I am still (since 10-25-2018) waiting for news/photos from @acmilangr on how to open the monitor ^^ I won't try to open it without knowing the areas to press... I don't want to mark or break the plastic.



acmilangr said:


> Yes I did. I will try to send photo later
> 
> You need this to press between front and back
> http://shine4ever.gr/s4e/wp-content/uploads/2012/05/guitar-pick-guitar-2.jpg



With your help choosing the right fans, I can then order 10 or so different laptop fans from Amazon France or UK and pick the best one for noise / cooling performance.

This fan will surely get noisier and the monitor hotter as dust sticks to the fan and the aluminium heatsink, meaning even those whose fan isn't so noisy at the moment will eventually look for a solution...

This expensive monitor already runs very hot when new... with dust accumulation I hope it won't turn bad for the electronics...


----------



## pat182

Hey guys, got the PG27UQ and it's amazing, but does anyone know what the sensor on the top frame of the screen is for?


----------



## bmgjet

pat182 said:


> hey guys, got the pg27uq and its amazing, but anyone knows what the sensor on the top frame of the screen is for ?!?!?


 auto brightness


----------



## pat182

Ahh, I see. Now, in games like AC Odyssey and others you have to choose the max nits of your display. Should it be set to 1000 nits or more? Setting 1400 does increase brightness, but I'm not sure it's OK to exceed the panel's max spec.


----------



## fleggy

Hi, I prefer 1000. Lower values make the sun look ugly, and higher values oversaturate bright areas so you lose detail in them.
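For context on why that in-game value matters: HDR10 encodes luminance with the SMPTE ST 2084 PQ curve, and the "max nits" setting tells the game's tone mapper where your display's ceiling sits on that curve. A minimal sketch of the PQ inverse EOTF, using the standard ST 2084 constants (the helper name is mine):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal in [0, 1].
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Map display luminance in cd/m^2 to a normalized PQ signal value."""
    y = max(nits, 0.0) / 10000.0   # PQ is defined up to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# The PG27UQ's 1000-nit peak sits at roughly 75% of the PQ signal range,
# so content mastered toward 10,000 nits leaves the top of the curve unused.
print(round(pq_encode(1000), 3))  # ~0.752
```

Telling the game 1400 when the panel peaks at 1000 just shifts where its tone mapper clips, which is consistent with the washed-bright-areas behavior described above.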


----------



## SmoothD

Fraizer said:


> i am till now (since 10-25-2018) waiting news/photos from @acmilangr to unmount the monitor ^^ i will not try to unmount without the areas where to push... i dont want to let mark or broke the plastic
> 
> 
> 
> 
> after with your help to choose the right fans i can order from amazon france or uk like 10 differents fans for laptop to select the best one in noise / cooling perf
> 
> 
> For sure this fan will be more noisy and the monitor more hot with the dust who will stick on the fan and the aluminium radiator... mean even those for who there fan version is for the moment not so noisy will try to found an solution...
> 
> this expensive monitor as new is already very hot... with the dust accumulation hope it will not turn bad for the electronics...


Sad to hear that; it's been almost 2 months waiting for the photos and no sign of the member. Hope he shows up someday. Is this the guy who removed the anti-glare coating from the screen?

I've been wondering: even if you don't leave a mark after opening the monitor, won't Asus be able to tell the user opened the screen some other way, like screw marks or something?

Thanks for your feedback. I'm considering selling the monitor right now; hopefully there will be a solution for this inconvenience.


----------



## Fraizer

No, acmilangr is posting in this same topic, but he doesn't answer that; maybe he doesn't actually know exactly where to press.

I don't mind about the warranty; I just don't want to mark or break the plastic  and they won't check the screws etc. 

No, if I'm not mistaken it's not him who changed the filter.


----------



## jesyjames

Well, this is disappointing. It seems my FALD is misbehaving. It's completely unusable now on Fast; I have to turn it off. It's not adjusting anymore. See the attached shots. Any ideas? I've tried the reset in the menu and power cycling.


----------



## fleggy

I have the same problem with broken FALD. I got it once during the Windows 10 October 1809 update, and MGS V: The Phantom Pain also triggers this issue. AFAIK it should be fixed in the latest firmware. Do you have the old one or the new one? The new FW has an option to turn off the annoying notification shown when HDR is activated.


----------



## pat182

Wow, that sucks. Has anyone had a problem like mine?

Desktop in HDR mode: no problem, love the brightness boost.

If I switch the desktop to SDR mode, the colors are super weird, like the wrong saturation/contrast.

EDIT: I think G-SYNC in borderless solved the problem?! idk


----------



## bmgjet

I'm happy; I've had 0 issues with my screen.
No fan noise.
No funny colours or FALD problems.

Just 700 fewer points in Time Spy when the screen is plugged in, but that's sort of expected when driving a 4K high-refresh-rate screen.


----------



## jesyjames

fleggy said:


> I have the same problem with broken FALD. I got it once during update to W10 October 1809. And MGS V: Phantom Pain also triggers this issue. AFAIK it should be fixed in the latest firmware. Do you have the old one or the new one? The new FW has the option to turn off the annoying HDR notification when HDR is activated.


Seems I fixed mine by pulling the power cable. Odd, but glad it's working again.


----------



## SmoothD

Fraizer said:


> no acmilangr is posting on this same topic but look he dont answer to that or maybe finaly he dont know where to do this exactly.
> 
> i dont mind of the warranty is for me i dont want to put mark or broke the plastic  but they willl not check the screws etc
> 
> no if i am not wrong is not him who change the filter


I've been searching for fans and found these options (replacement parts matching the Sunon's specs in the PC Perspective review):

https://es.aliexpress.com/item/Free...sk-800-G1-DM-K2U75PA-cpu-fan/32770602499.html

https://www.ebay.com/itm/NEW-HP-PRO...4-747932-001-768424-001-CPU-Fan-/153169783002

https://es.aliexpress.com/item/New-...o-blower-fan-one-machine-MSI/32809239157.html

Do you think any of these would make a difference, being replacement models?


----------



## Fraizer

Hi,

I don't know; we'd need to test them to find out whether they make less noise. That's why I was thinking of ordering 10 or so from a European Amazon, testing them, keeping the best one and returning the rest.

But right now I'm stuck on how to open the screen's case without breaking the plastic or scratching anything.

Here are three links to lists of fans; if you find close specifications you can select about 10 (pick ones with the Amazon Prime option). I'm not sure it needs the exact Sunon specification, because from the way they mounted it, it looks more like home DIY than a professional company's assembly...

https://www.amazon.fr/s/s/ref=sr_nr...ortable&ie=UTF8&qid=1543520161&rnid=437877031

https://www.amazon.fr/s/s/ref=sr_nr...ortable&ie=UTF8&qid=1543520146&rnid=437877031

https://www.amazon.fr/s/ref=sr_nr_n...ortable&ie=UTF8&qid=1543520128&rnid=340859031

After that we need to find a copper plate (it will need cutting) to replace the current very cheap aluminium one. I don't know if Amazon Europe has that...


----------



## Fanu

Such a high-premium device and buyers have to resort to manually replacing fans? And still no promised firmware update?

Hopefully they massively improve the second gen, because this first gen looks like a failure.


----------



## fleggy

I wouldn't call it a failure. I don't know of a better monitor for gaming. BTW, even my 10-year-old HP LP2480zx has a fan.


----------



## Fanu

I meant a failure in terms of support.

Such a stupidly expensive product (for its intended purpose, gaming) receiving zero support this long after launch is shameful, really.

It just goes to show how little Acer/Asus/AU Optronics/NVIDIA give a damn.


----------



## Seyumi

Well, I think the pricing/issues are a bit less extreme than when those 32" 4K 60Hz PC monitors first hit stores. Those were $3,500 and had issues such as using MST (essentially two inputs displayed on one screen, which could fall out of sync). $1,700-$2,000 for a next-generation cutting-edge PC monitor doesn't seem so bad compared to that.

I trusted my brain instead of my heart and decided to hold off on this first generation of G-SYNC HDR monitors (even though I usually have the best PC gaming monitor), and it looks like my instincts paid off. Patiently awaiting an improved 32" 4K model or the 34"-35" ultrawide 1440p version, hopefully sometime in 2019.


----------



## Fraizer

Fanu said:


> such a high premium device and buyers have to resort to manually replacing fans ? and still no promised firmware update ?
> 
> hopefully they massively improve second gen, cause this first gen looks like a failure


Totally agree with you. It's just a big shame, especially when you see how they attached this cheap fan and taped on an air channel...


----------



## eux

I just picked one of these up and so far the monitor is amazing, but I'm having one big issue with it that is driving me crazy. Still contemplating whether I need to RMA this unit or not. If anyone could help me out I'd greatly appreciate it.

No matter what settings I try, I have dimming around the whole border of the monitor. The effect is quite exaggerated in the attached image on a white background, but it is always there no matter what I'm doing. Updated GPU drivers/W10, using DP 1.4 on a compatible GPU. I'm guessing this is just a backlight bleed issue? Really have no clue what to do. Thanks in advance.


----------



## rvectors

eux said:


> I just picked one of these up and so far the monitor is amazing but I'm having one big issue with it that is driving me crazy. Still contemplating if I need to RMA this unit or not. If anyone could help me out I'd greatly appreciate it.
> 
> No matter what settings I try I have dimming around the whole border of my monitor. The effect is quite exaggerated in the attached image on white background but it is always there no matter what I'm doing. Updated GPU drivers/W10 using DP1.4 on a compatible GPU. I'm guessing this is just a backlight bleed issue? Really have no clue what to do. Thanks in advance.


What's it like on a pure black background? I haven't read the whole thread, so this might be a known issue, but that's the only way to check for BLB.

I'm viewing on a tiny screen, but the dimmed border looks perfectly uniform. I'd think that makes it a design issue or a manufacturing fault.


----------



## eux

Thank you for the reply. Here's a pic I took of the BLB the other night with FALD off and low brightness.


----------



## acmilangr

I am still experiencing problems with HDR. Colors in most games, like Assassin's Creed Odyssey, are washed out. I've also tried HDR movies and the colors are washed out there too; there are no deep colors.

Try downloading Transformers: The Last Knight in both versions (SDR and HDR) and compare them. You'll see that SDR has better colors. It seems like there is no wide color gamut when HDR is enabled.
The only way to make them as vibrant as SDR is to change these options in the NVIDIA panel:

Brightness: 55
Contrast: 55
Vibrance: 65

But I don't like doing it that way. I want the colors to be native, the way they should be by default.

I think Windows 10 is the cause of all this.


----------



## acmilangr

Fraizer said:


> no acmilangr is posting on this same topic but look he dont answer to that or maybe finaly he dont know where to do this exactly.
> 
> i dont mind of the warranty is for me i dont want to put mark or broke the plastic but they willl not check the screws etc
> 
> no if i am not wrong is not him who change the filter


Hey, I'm really sorry; I haven't been on this forum for many days.

Just press with a guitar pick around the monitor at the places I marked. After that the front bezel will come off.

https://i.imgur.com/o40Dcm8.jpg


----------



## bmgjet

Is your refresh rate set too high? 96Hz is the max for 10-bit HDR.
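That limit lines up with rough DisplayPort 1.4 arithmetic. A back-of-the-envelope sketch (blanking overhead ignored, so real achievable refresh rates land a bit below these raw numbers; HBR3 figures from the DP spec):

```python
# Rough DisplayPort 1.4 bandwidth check for the PG27UQ (blanking ignored).
HBR3_PAYLOAD_GBPS = 32.4 * 8 / 10   # 4 lanes x 8.1 Gbps, minus 8b/10b overhead = 25.92

def data_rate_gbps(width, height, hz, bits_per_pixel):
    """Raw active-pixel data rate in Gbps."""
    return width * height * hz * bits_per_pixel / 1e9

# 10-bit RGB (30 bpp) at 144 Hz blows past the link budget...
print(data_rate_gbps(3840, 2160, 144, 30))  # ~35.8 Gbps > 25.92
# ...which is why 144 Hz HDR falls back to 4:2:2 chroma subsampling (20 bpp),
print(data_rate_gbps(3840, 2160, 144, 20))  # ~23.9 Gbps, fits
# while full 10-bit RGB tops out in the mid-90s Hz range on this link.
print(data_rate_gbps(3840, 2160, 98, 30))   # ~24.4 Gbps, fits
```

So running 10-bit HDR above that mid-90s ceiling forces subsampling, which can look like the color problems described above.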


----------



## acmilangr

bmgjet said:


> Is your refresh rate set too high? 96hz is the max for HDR 10bit.


I tried 96Hz as well.


----------



## kx11

Tried ECO mode; this monitor went from performing like a $2k monitor to a $50 monitor: stuttering all over my games, and G-SYNC doesn't work.


----------



## saltedham

kx11 said:


> tried ECO mode , this monitor from performing like a 2k $ monitor to a 50$ monitor , stuttering all over the games and GSYNC doesn't work


I noticed that when the monitor shut down with eco mode on and the fan stopped, turning it back on would sometimes disable G-SYNC and I had to restart the PC to get it back. I leave eco mode off now.

I'm more bothered that the service menu will sometimes activate on its own. It feels like being in a horror movie when you turn your monitor on and red/blue/green screens start flashing.


----------



## Fraizer

acmilangr said:


> Hey I am really sorry. I had many days to come on this forum
> 
> Just press with a guitar pick around the monitor on the place I marked. After that the front bessel will be out.
> 
> https://i.imgur.com/o40Dcm8.jpg


no worry i am in the same case ^^

yes i know where are the 2 parts ^^ but i requested from you the areas where is pluged to the other part of the plastic to make pression only on the right areas to reduce the possibility to scratch or broke the plastic 
and if they are screw to remove before.

without that i will not take the risk ^^


----------



## istudy92

OHHH, I found a PG27UQ community... why hasn't an "Official PG27UQ" thread been made?? You're all LAZY. (lol... yes, I know, why am I not making it? WHO KNOWS)


----------



## tinykitten

I assume the promised firmware update tool is a no-go? I don't exactly feel like shipping a €2,500 monitor to Asus. It took me long enough to stop the fan hitting a cable (or whatever caused the clicking noises), and god knows what state the monitor might come back in. I don't feel like risking it returning worse than before.


----------



## kx11

saltedham said:


> will sometimes activate it. feels like being in a horror movie when you turn your monitor on and red/blue/green screens flashing.





That hasn't happened to me, not yet at least.


----------



## betam4x

All these messages saying the monitor needs to be larger are full of it. I prefer high PPI over size. Ideally I'd like both: 24" 4K, 27" 5K, 32" 8K, or something along those lines. To those who claim not to be able to tell the difference: have you actually compared a 24" 1080p monitor to a 24" 4K monitor or TV? You can quickly tell the difference in gaming, even with both games frame-locked at the same detail level. You can also tell the difference with any vector-based graphics or text sharpness. If we had it your way, all our phones would still be 480p or less.

A 24" 4K monitor is around 183 PPI.
A 24" 1080p monitor is 92 PPI.
The LG V30 is 538 PPI.
The iPhone XS Max is 458 PPI.
The ancient Galaxy S3 is 306 PPI.

I'm not for smaller monitors, but I am for higher resolutions as monitors grow larger. My 25" 1440p monitors are around 117 PPI, and while that's a definite improvement over 1080p, it's still much lower than I'd prefer. I'd rather have dual 27" 5K than dual 25" 1440p displays. I can see the difference.
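The PPI figures above follow from basic geometry: the diagonal pixel count divided by the diagonal size in inches. A quick sketch that reproduces them:

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(3840, 2160, 24)))  # 184 -- the ~183 quoted for 24" 4K
print(round(ppi(1920, 1080, 24)))  # 92  -- 24" 1080p
print(round(ppi(3840, 2160, 27)))  # 163 -- this monitor
print(round(ppi(2560, 1440, 25)))  # 117 -- 25" 1440p
```

By this math the 27" PG27UQ lands at about 163 PPI, well above any 1440p desktop panel, which is the core of the sharpness argument.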


----------



## fleggy

acmilangr said:


> I am still experiencing problems on HDR. Colors on most games like Assasins creed odyssey are washed out. Also I have tied HDR movies and the colors are also washed out. There are no
> deep Colors.
> 
> Try to download transformers last knight on both versions (SDR and HDR) and compare them. You will. See that SDR have better colors. It seems like there ia no wide color gamut when HDR is enabled.
> The only way to make them vibrant like SDR is to change these options on nvidia panel
> 
> Brightness :55
> Contrast: 55
> Vibrant :65
> 
> But this is a way I don't like to do. I want the colors to be native the way must be by default.
> 
> I think that all this is couse Windows 10.



HDR works very well in AC: Origins/Odyssey for me. Here are my settings:

- NVCP: Display > Change Resolution
3840x2160 / 120Hz
Use default color settings (32-bit, RGB, 8 bpc, full range)

- In-game settings
Window mode: fullscreen
HDR ON, luminance max 1000, min 80

Don't forget to turn HDR ON in the W10 Display settings before you start the game.
And one last note: install the display driver including color profiles, and check that your default ICC profile is correctly set to "ASUS ROG PG27U Color Profile,D6500".


----------



## acmilangr

fleggy said:


> acmilangr said:
> 
> 
> 
> I am still experiencing problems with HDR. Colors in most games, like Assassin's Creed Odyssey, are washed out. I have also tried HDR movies and the colors are washed out there too. There are no deep colors.
> 
> Try downloading Transformers: The Last Knight in both versions (SDR and HDR) and compare them. You will see that SDR has better colors. It seems like there is no wide color gamut when HDR is enabled.
> The only way to make the colors as vibrant as in SDR is to change these options in the NVIDIA panel:
> 
> Brightness: 55
> Contrast: 55
> Vibrance: 65
> 
> But this is not a fix I like. I want the colors to be native, the way they should be by default.
> 
> I think all this is because of Windows 10.
> 
> 
> 
> 
> HDR works very well in AC:Origins/Odyssey for me. Here are my settings:
> 
> - NVCP: Display/Change Resolution
> 3840x2160 / 120Hz
> Use default color settings (32-bit, RGB, 8 bpc, full range)
> 
> - in-game settings
> Window mode: fullscreen
> HDR ON, luminance max 1000, min 80
> 
> Do not forget to turn HDR ON in the W10 Display settings before you start the game.
> One last note: install the display driver including the color profiles and check that your default ICC profile is correctly set to "ASUS ROG PG27U Color Profile,D6500"
Click to expand...

No, it is not correct. If you try the game without HDR and compare, you will see that the HDR colors are washed out. Just compare, for example, a red in HDR with the same red with HDR off.
In HDR it looks like orange. The same thing happens in HDR movies. It is as if everything is limited to the sRGB color space.

Just try the settings I posted before and you will see how HDR should look:

Brightness: 55
Contrast: 55
Vibrance: 65


----------



## fleggy

Well, all my HDR games work correctly, so it must be the artist's intention or a bug in AC:O 
Frankly speaking, I like the HDR look of AC:O.


----------



## acmilangr

fleggy said:


> Well, all my HDR games work correctly, so it must be the artist's intention or a bug in AC:O 🙂
> Frankly speaking, I like the HDR look of AC:O.


No, they don't 
😛
You think it works fine, but it doesn't.

Just try my settings and compare the colors.

HDR needs a wide color gamut, and by default Windows 10 doesn't provide one.


----------



## Glerox

Do yourselves a favor and watch this video on your PG27UQ:

https://www.youtube.com/watch?v=N1-Jmq7BLFE&t=76s

Happy new year!


----------



## fleggy

acmilangr said:


> ...
> HDR needs a wide color gamut, and by default Windows 10 doesn't provide one.



So you are saying that HDR does not work correctly in ALL games? And nobody else has noticed that by now? Reviewers, testers, common users... are they all blind?


----------



## acmilangr

fleggy said:


> acmilangr said:
> 
> 
> 
> ...
> HDR needs a wide color gamut, and by default Windows 10 doesn't provide one.
> 
> 
> 
> 
> So you are saying that HDR does not work correctly in ALL games? And nobody else has noticed that by now? Reviewers, testers, common users... are they all blind?
Click to expand...

Except for about 2 games, yes, I think they didn't notice. I have found many forums/articles about this problem. There are members in this thread who have the same problem, even on OLED TVs. The problem is not with the monitors/TVs but with Windows 10.

I will repeat this: HDR is not only higher brightness and contrast, it is about wide color gamut. And because it doesn't work properly, the colors look like 100% sRGB. There is no deep red, blue, or green, even in HDR films. 

Sorry for my bad English


----------



## acmilangr

CallsignVega said:


> acmilangr said:
> 
> 
> 
> I have a problem with HDR. Most games have washed-out colors: Origins, Odyssey, Battlefield V. Red looks something like orange, and everything is washed out. I can easily see the difference when I choose SDR; everything looks correct.
> 
> And no, I don't run RTSS. I uninstalled it.
> 
> 
> Any help please?
> 
> 
> 
> This is a well known problem. Windows 10 HDR is still broken. The only "work around" I've found is to increase NVIDIA control panel color saturation to around 65%. It may not be perfect (may clip some bright colors), but it looks a whole lot better than washed out HDR. Give it a try and report back.
Click to expand...

Here you can see another member who has the same problem.
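
For what it's worth, the saturation workaround quoted above boils down to scaling each color away from its luma. A minimal sketch of the idea (my own illustration, not NVIDIA's actual Digital Vibrance algorithm):

```python
# Vibrance-style saturation boost: push channels away from the pixel's luma.
def saturate(rgb: tuple, factor: float) -> tuple:
    """Scale each channel's distance from luma by `factor` (values in 0..1)."""
    r, g, b = rgb
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec.709 luma weights
    clamp = lambda v: max(0.0, min(1.0, v))
    return tuple(clamp(luma + factor * (c - luma)) for c in (r, g, b))

washed_red = (0.8, 0.35, 0.25)     # an "orange-ish", desaturated red
print(saturate(washed_red, 1.3))   # more red, less green/blue
```

A boost like this pulls washed-out colors back toward saturated ones, but as CallsignVega notes it can clip the brightest colors; it's a band-aid, not a gamut fix.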


----------



## fleggy

acmilangr said:


> Except for about 2 games, yes, I think they didn't notice. I have found many forums/articles about this problem. There are members in this thread who have the same problem, even on OLED TVs. The problem is not with the monitors/TVs but with Windows 10.
> 
> I will repeat this: HDR is not only higher brightness and contrast, it is about wide color gamut. And because it doesn't work properly, the colors look like 100% sRGB. There is no deep red, blue, or green, even in HDR films.
> 
> Sorry for my bad English


I still think that something is wrong on your side. Have you watched some 4K HDR videos? I have and all colors (including red) are brilliant (e.g. LG: Chess HDR from https://4kmedia.org/tag/hdr/).

Windows 10 64-bit 1809
2080ti connected via DP, driver 417.35
Desktop: Full RGB, 8-bit per color, HDR ON
video player: VLC 3.0.5


----------



## hmcindie

fleggy said:


> I still think that something is wrong on your side. Have you watched some 4K HDR videos? I have and all colors (including red) are brilliant (e.g. LG: Chess HDR from https://4kmedia.org/tag/hdr/).


I think he is just used to super-saturated colors. So the orange he is complaining about in AC is actually how it's supposed to look, and he wants it to look red.


----------



## hmcindie

edit: whoops, double message.


----------



## acmilangr

fleggy said:


> acmilangr said:
> 
> 
> 
> Except for about 2 games, yes, I think they didn't notice. I have found many forums/articles about this problem. There are members in this thread who have the same problem, even on OLED TVs. The problem is not with the monitors/TVs but with Windows 10.
> 
> I will repeat this: HDR is not only higher brightness and contrast, it is about wide color gamut. And because it doesn't work properly, the colors look like 100% sRGB. There is no deep red, blue, or green, even in HDR films.
> 
> Sorry for my bad English
> 
> 
> 
> I still think something is wrong on your side. Have you watched some 4K HDR videos? I have, and all colors (including red) are brilliant (e.g. LG: Chess HDR from https://4kmedia.org/tag/hdr/).
> 
> Windows 10 64-bit 1809
> 2080 Ti connected via DP, driver 417.35
> Desktop: full RGB, 8-bit per color, HDR ON
> Video player: VLC 3.0.5
Click to expand...

Those demo videos work fine.


----------



## acmilangr

hmcindie said:


> fleggy said:
> 
> 
> 
> I still think that something is wrong on your side. Have you watched some 4K HDR videos? I have and all colors (including red) are brilliant (e.g. LG: Chess HDR from https://4kmedia.org/tag/hdr/).
> 
> 
> 
> I think he is just used to super saturated colors. So the orange he is complaining about in AC is actually how it's supposed to look and he wants it to look red.
Click to expand...

No, you are wrong. Just try the game without HDR and look at some red colors, then try it with HDR and you will see them as orange.


----------



## fleggy

acmilangr said:


> Those demo videos work fine.


OK, then the "orange bug" is not a Windows bug (as you wrote) but an AC:O feature.
A short quotation from the AC:O patch 1.1.0 changelog (https://assassinscreed.ubisoft.com/...s-creed-odyssey–1-1-0-and-1-1-1-patch-notes):
- [PC] Improved HDR functionality on PC.

Could you post a screenshot here with the name of the location? I'll check it on my side.


----------



## pat182

For me, I have to watch HDR videos on YouTube in Edge, because Chrome's HDR is broken.


----------



## Ferreal

I’m guessing that firmware update is vaporware now. I’ll just wait for a revised 4K 144Hz model.

It took Asus two years to release the PG279Q to replace the PG278Q.


----------



## pat182

Is it just me, or does the monitor seem less bright lately? Was there a patch or something that broke HDR?


----------



## bmgjet

pat182 said:


> Is it just me, or does the monitor seem less bright lately? Was there a patch or something that broke HDR?


There have been a few Windows patches trying to fix HDR stuff.
Video playback in VLC has been fixed. SDR content with HDR turned on in Windows used to be blindingly bright. Now it's about as bright as it's meant to be in SDR.


----------



## acmilangr

I discovered that if you set the monitor to 144Hz, the fans get really loud.


----------



## axiumone

So, mine bit the dust after a couple of months. It looks like half the display is interlaced and keeps flickering. It's not the cables or the gpu.


----------



## pat182

bmgjet said:


> There have been a few Windows patches trying to fix HDR stuff.
> Video playback in VLC has been fixed. SDR content with HDR turned on in Windows used to be blindingly bright. Now it's about as bright as it's meant to be in SDR.


Yeah, I was kind of liking the high brightness, tbh; it's very sunny where I live.


----------



## SmoothD

Fraizer said:


> No worries, I am in the same situation ^^
> 
> Yes, I know where the 2 parts are ^^ but I asked you for the spots where one part clips into the other, so I can apply pressure only in the right areas and reduce the chance of scratching or breaking the plastic, and whether there are screws to remove first.
> 
> Without that I will not take the risk ^^


Have you proceeded with opening the monitor? Did you replace/order the fans from Amazon? I would appreciate any update.


----------



## animeowns

axiumone said:


> So, mine bit the dust after a couple of months. It looks like half the display is interlaced and keeps flickering. It's not the cables or the gpu.


aw, that sucks. I'm gonna sell mine while it still works fine and just buy an OLED TV while I wait for the Alienware OLED monitor.


----------



## SmoothD

acmilangr said:


> Hey, I am really sorry. It has been many days since I last visited this forum.
> 
> Just press with a guitar pick around the monitor at the places I marked. After that the front bezel will come off.
> 
> https://i.imgur.com/o40Dcm8.jpg


Do you have other posts with detailed instructions? Is it just that corner that needs the pick push, or does the same spot in the other corner have the same release mechanism?

Thanks for the pic. I just wonder whether that pressure point releases the back of the monitor to access the fan, or the front bezel, after which one can continue to dismantle the monitor.


----------



## Fraizer

OK, I was waiting to see if CES would bring a 30/32" replacement for this monitor so I could replace my 27UQ, and it looks like no, only the 32" at 120Hz...

So I will focus on solving the problem of this crazy fan noise...

If some of you are interested in working on this problem, please let me know...

We need people to:

- open up the back of their monitor and report where exactly to push
- take precise dimensions of the fan and how it fits on the PCB or on the heatsink
- take the dimensions of the aluminium heatsink and how it fits
- map how the airflow goes (I saw it is a cheap plastic channel...)

Replacement parts:
- Fan: find somebody with some knowledge of laptop fans (the Sunon in this monitor is like a laptop fan).
- Copper heatsink: any link on Amazon


----------



## sprayingmango

Glerox said:


> Do yourselves a favor and watch this video on your PG27UQ:
> 
> https://www.youtube.com/watch?v=N1-Jmq7BLFE&t=76s
> 
> Happy new year!


Beautiful video, thanks for this!


----------



## Glerox

Fraizer said:


> ok i was waiting to see if at the CES it will be an replacement in 30 / 32"of this monitor to replace my 27UQ and it look No only the 32 at 120hz...


Where did you see a 32-inch 4K 120Hz monitor? Model name please?


----------



## Fraizer

Glerox said:


> Where did you see a 32-inch 4K 120Hz monitor? Model name please?


My mistake, it was 2560x1440; I mixed it up with the XG438Q... :/


Looks like nobody is interested in making the monitor silent ^^


----------



## tinykitten

I reached out to ASUS support asking about the promised firmware update tool, stating that I have no interest in sending back my monitor to update the firmware. Their reply was to request an RMA (they'd send a replacement unit and I'd send back my current monitor - a "defect LCD" according to ASUS, their words - once I received the replacement), completely dodging anything regarding the firmware update tool. Not sure if I want to risk getting a replacement I haven't seen in person, which might have a bunch of dead pixels and such like my first PG27UQ.


----------



## Glerox

tinykitten said:


> I reached out to ASUS support asking about the promised firmware update tool, stating that I have no interest to send back my monitor to update the firmware. Their reply was to request a RMA (they'd send a replacement unit and I'd send back my current monitor - "defect LCD" according to ASUS, their words - once I received the replacement), completely dodging anything regarding that firmware update tool. Not sure if I want to risk getting a replacement I haven't seen in person, having a bunch of dead pixels and such like my first PG27UQ.


same here... I have a good unit I want to keep it... man what a bummer to wait for that firmware update...


----------



## tinykitten

Glerox said:


> same here... I have a good unit I want to keep it... man what a bummer to wait for that firmware update...


Well I doubt it's getting released at all tbh.


----------



## deadchip12

Does anyone notice raised blacks in Bethesda games? In Doom, the pause menu background is near-black instead of black. In Wolfenstein 2, the letterbox bars in cutscenes are only near-black instead of black. I'm pretty sure there's nothing wrong with my settings, because other games don't have this problem. And I think these issues are only present on PC; I checked the Doom demo on PS4 and the pause menu background is properly black.


----------



## SmoothD

Fraizer said:


> ...
> Looks like nobody is interested in making the monitor silent ^^


I am interested but don't have a clear plan of action: how to proceed? Is it even possible to open the monitor without leaving marks? Which cooling solution really makes the fan silent?


----------



## Fraizer

SmoothD said:


> I am interested but don't have a clear plan of action: how to proceed? Is it even possible to open the monitor without leaving marks? Which cooling solution really makes the fan silent?


hi

https://www.overclock.net/forum/27800026-post2866.html


----------



## acmilangr

SmoothD said:


> acmilangr said:
> 
> 
> 
> Hey, I am really sorry. It has been many days since I last visited this forum.
> 
> Just press with a guitar pick around the monitor at the places I marked. After that the front bezel will come off.
> 
> https://i.imgur.com/o40Dcm8.jpg
> 
> 
> 
> Do you have other posts with detailed instructions? Is it just that corner that needs the pick push, or does the same spot in the other corner have the same release mechanism?
> 
> Thanks for the pic. I just wonder whether that pressure point releases the back of the monitor to access the fan, or the front bezel, after which one can continue to dismantle the monitor.
Click to expand...

You must press it all around the monitor.


----------



## SmoothD

acmilangr said:


> You must press it all around the monitor.


Does the procedure leave any marks if you use the pick? Did you find any stickers on the monitor (some monitors have them)? Thanks


----------



## acmilangr

SmoothD said:


> acmilangr said:
> 
> 
> 
> You must press it all around the monitor.
> 
> 
> 
> Does the procedure leave any marks if you use the pick? Did you find any stickers on the monitor (some monitors have them)? Thanks
Click to expand...

No, it doesn't leave any marks. 


I have a problem with my monitor: the fans never stop, even in Windows. 
Is that normal?


----------



## Kaltenbrunner

About how much were 32" 1440p IPS 144Hz monitors at launch, and when was that? I still can't justify the money for the top models now. Maybe in another two years I will. But for now I'll get a 32" 1440p VA 144Hz or a 27" 1440p IPS 144Hz.


----------



## Scotty99

Heads up: this is on sale today for $1,499.99 at the Egg.


----------



## SmoothD

acmilangr said:


> No, it doesn't leave any marks.
> 
> I have a problem with my monitor: the fans never stop, even in Windows.
> Is that normal?


It is, as far as I know. No matter the input or refresh rate, the fan on my unit is always running.


----------



## Glerox

I'm sick of not being able to play at 144Hz in SDR... that freakin' firmware update never comes... I think I'll just send it to Asus and let you know!


----------



## badjz

Glerox said:


> I'm sick of not being able to play at 144Hz in SDR... that freakin' firmware update never comes... I think I'll just send it to Asus and let you know!


I purchased mine back in July. I requested a DOA with Asus in December as a result of the black crush. They reluctantly approved it & I got a brand new one with the updated firmware. Try your luck


----------



## Glerox

badjz said:


> I purchased mine back in July. I requested a DOA with Asus in December as a result of the black crush. They reluctantly approved it & I got a brand new one with the updated firmware. Try your luck


I did send a request today. We'll see, thanks.


----------



## kot0005

Anyone with Metro Exodus able to test if HDR works with DLSS? People are reporting that it doesn't.


----------



## kot0005

Okay, I can confirm: HDR is a no-go with DLSS on.


----------



## istudy92

kot0005 said:


> Okay, I can confirm: HDR is a no-go with DLSS on.


I never saw any "HDR" settings. I find it silly that no HDR support is advertised.


----------



## acmilangr

istudy92 said:


> kot0005 said:
> 
> 
> 
> Okay, I can confirm: HDR is a no-go with DLSS on.
> 
> 
> 
> I never saw any "HDR" settings. I find it silly that no HDR support is advertised.
Click to expand...

Turn it on first in the Windows settings, then run the game. You will then see the option in the game. 

But this is yet another game, like Assassin's Creed, with washed-out colors when HDR is enabled.....


----------



## acmilangr

kot0005 said:


> Okay, I can confirm: HDR is a no-go with DLSS on.


Forget DLSS. Besides not working with HDR, it gives bad results. 

It is way better to choose 1440p resolution than 2160p with DLSS.


----------



## toncij

So, had anyone tried to mod the fan? Water block? Better fan?


----------



## kot0005

istudy92 said:


> I never saw any "HDR" settings. I find it silly that no HDR support is advertised.


It's a console port, so you need to turn it on in Windows first.


----------



## kot0005

acmilangr said:


> Forget DLSS. Besides not working with HDR, it gives bad results.
> 
> It is way better to choose 1440p resolution than 2160p with DLSS.


Yes, I don't use DLSS. I am using 0.8 resolution scale instead. It's the last option in the menus; they call it shading power.

HDR works fine for me without DLSS. Can you post screenshots of how your HDR is washed out?
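
For reference, here's what a 0.8 resolution scale works out to at 4K (my own arithmetic, not from the game's docs):

```python
# A 0.8 resolution scale at 4K: the engine renders at the scaled size
# and upscales to the native 3840x2160 output.
def scaled_res(width: int, height: int, scale: float) -> tuple:
    """Render resolution for a given per-axis scale factor."""
    return round(width * scale), round(height * scale)

w, h = scaled_res(3840, 2160, 0.8)
print(w, h)                                                        # 3072 1728
print(f"{(w * h) / (3840 * 2160):.0%} of the native pixel count")  # 64%
```

So 0.8 scale means rendering 3072x1728, i.e. only about two thirds of the native pixel work, which is where the performance headroom comes from.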


----------



## acmilangr

kot0005 said:


> acmilangr said:
> 
> 
> 
> Forget DLSS. Besides not working with HDR, it gives bad results.
> 
> It is way better to choose 1440p resolution than 2160p with DLSS.
> 
> 
> 
> Yes, I don't use DLSS. I am using 0.8 resolution scale instead. It's the last option in the menus; they call it shading power.
> 
> HDR works fine for me without DLSS. Can you post screenshots of how your HDR is washed out?
Click to expand...

Screenshots can't be taken in HDR. 
I will try tomorrow to post some photos from my phone.
Just look at the subtitles in the game: without HDR they are orange; with HDR they are washed out like everything else.


----------



## deadchip12

The monitor has been out for a while. Could some owners share their overall experience so far? Are you impressed with the HDR performance? Are there any aspects that annoy you? I made the mistake of skipping this monitor and purchasing an OLED TV for gaming/movies, but in the last few months I have realized that gaming on a TV is uncomfortable for me, mostly due to my eyesight (I'm near-sighted and wear glasses, and the big size of the TV and the long distance from me to it cause eyestrain and headaches).


----------



## tinykitten

deadchip12 said:


> The monitor has been out for a while. Could some owners share their overall experience so far? Are you impressed with the HDR performance? Are there any aspects that annoy you? I made the mistake of skipping this monitor and purchasing an OLED TV for gaming/movies, but in the last few months I have realized that gaming on a TV is uncomfortable for me, mostly due to my eyesight (I'm near-sighted and wear glasses, and the big size of the TV and the long distance from me to it cause eyestrain and headaches).


In terms of image quality and HDR (assuming proper HDR content such as RE7/RE2 or the Jacob + Katie Schwarz videos, for example) the experience is great and best in class (along with the X27) for the time being, I believe. Aside from what the monitor is used for, my experience has been less than average, I would say. Fan noises are there but not overly annoying in my case; however, when I received my second monitor a cable would hit the fan and make annoying clicking noises. I managed to solve that issue somehow, but it left a not-so-good impression considering I returned my first monitor due to a bunch of dead pixels. Being an early adopter, I'm left with the old firmware for the time being, as I don't want to ship the monitor to Asus to replace the firmware, or RMA and exchange the monitor, due to the aforementioned issues I personally had. My monitor is fine right now aside from the firmware, so I don't want to risk going full circle and getting an exchanged monitor with more dead pixels or things like that. I reached out to Asus asking about their promised firmware update tool and the conversation I had with their support was straight up a joke. 



So all in all, image quality is amazing; everything else is meh for me. I don't think I care enough anymore to say that I would buy it again provided I could go back in time or something; no games on the horizon interest me at all. And if it's content aside from games, I might as well watch that on my TV. This is obviously a personal experience; I think if you can get a monitor without any dead pixels or other issues you'll like it, considering how much of a step up the image quality is compared to everything else currently available on the monitor market.


----------



## deadchip12

tinykitten said:


> In terms of image quality and HDR (assuming proper HDR content such as RE7/RE2 or Jacob + Katie Schwarz videos for example) the experience is great and best in class (along with the X27) for the time being I believe. Aside from what the monitor is used for my experience has been less than average I would say. Fan noises are there but not overly annoying in my case however when I received my second monitor a cable would hit the fan and make annoying clicking noises. I managed to solve that issue somehow but it left a not so good impression considering I returned my first monitor due to a bunch of dead pixel. Being an early adopter I'm left with the old firmware for the time being as I don't want to ship the monitor to Asus to replace the firmware or RMA and exchange the monitor due to the aforementioned issues I personally had. My monitor is fine right now aside from the firmware so I don't want to risk going full circle and get a new exchanged monitor with more dead pixel or things like that. I reached out to Asus asking about their promised firmware update tool and the conversation I had with their support was straight up a joke.
> 
> 
> 
> So all in all image quality is amazing, everything else is meh for me. I don't think I care enough anymore to say that I would buy it again provided I could go back in time or something, no games on the horizon that interest me at all. And if it's content aside from games I might as well watch that on my TV. This is obviously a personal experience, I think if you can get a monitor without any dead pixel or other issues you'll like it considering how much of a step up the image quality is compared to everything else currently available on the monitor market.


Thank you for your reply. I have a few more questions that I hope you can answer:

-Does the blooming bother you at all? Will watching in a fairly lit room help mitigate most of the blooming? That's the aspect I am most worried about, especially coming from OLED, which is blooming-free.
-Is HDR content bright enough? On my OLED, small specular highlights are impressively bright, but I feel that bigger highlights, e.g. explosions or a sunny sky, are less dazzling due to aggressive ABL.
-I think a user here said the monitor becomes less bright the more you use it? Maybe some problem with the FALD unit? Have you experienced anything similar?
-I agree RE7/RE2's overall HDR implementation is still among the most impressive, except for the black level. I tried the games on my OLED and cannot get them to display pitch black. I think this is pointed out in several videos online. Do you notice the same thing on the monitor?


----------



## tinykitten

deadchip12 said:


> Thank you for your reply. I have a few more questions that I hope you can answer:
> 
> -Does the blooming bother you at all? Will watching in a fairly lit room help mitigate most of the blooming? That's the aspect I am most worried about, especially coming from OLED, which is blooming-free.
> -Is HDR content bright enough? On my OLED, small specular highlights are impressively bright, but I feel that bigger highlights, e.g. explosions or a sunny sky, are less dazzling due to aggressive ABL.
> -I think a user here said the monitor becomes less bright the more you use it? Maybe some problem with the FALD unit? Have you experienced anything similar?
> -I agree RE7/RE2's overall HDR implementation is still among the most impressive, except for the black level. I tried the games on my OLED and cannot get them to display pitch black. I think this is pointed out in several videos online. Do you notice the same thing on the monitor?


-Blooming is there, but it doesn't bother me to the point where I get distracted or annoyed by it. I used to be quite picky with monitor-related things (BLB on the PG279Q/348Q etc.). This could vary case by case due to different ambient lighting and such. I didn't notice it much playing RE7/RE2 in a completely dark room, though I didn't hunt for blooming or pay much attention to it either. 
-I feel HDR brightness is fine, though I wouldn't call myself an expert. I like to use the Jacob + Katie Schwarz videos on YT for HDR reference, or the chess demo, and to me all those videos look fantastic in HDR in terms of brightness/contrast and such.
-I haven't had any problems with the monitor becoming less bright, or generally degrading over time. 
-As for the RE games, I believe this is actually an art direction choice. I could be wrong; without spoiling anything, I feel that certain areas in both games display pitch black just fine while other areas have more of a grey/brown tone to them.


----------



## acmilangr

HDR is great on this monitor. Just check the TFT Central review. It reaches a maximum of 1200 nits. Mine has the latest firmware. It is really bright even in SDR (about 550 nits according to TFT Central). 

This monitor is great for me. The only issue is blooming, but it doesn't bother me. 

OLEDs are great, but they don't have 120Hz and G-Sync. This means the only way to be smooth is to use vsync and keep a minimum of 60fps; if it drops below 60 (for example 59), you actually see 30fps, and that is really bad for me. If you turn vsync off, you get tearing. This is a big issue for me. 


Also, I can't play on such a big display (55 inch). I would prefer 32 inch, but 27 is also great at the distance I play. The PPI is really great.
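
The 59fps-dropping-to-30fps behavior is what double-buffered vsync does on a fixed-refresh display (and exactly what G-Sync avoids). A minimal model of it, my own sketch (ignores triple buffering and variable refresh):

```python
# Double-buffered vsync: a finished frame can only be shown at the next
# vblank, so each frame occupies a whole number of refresh intervals.
import math

def effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Displayed frame rate under double-buffered vsync."""
    frame_time = 1.0 / render_fps
    vblank = 1.0 / refresh_hz
    intervals = math.ceil(frame_time / vblank - 1e-9)  # epsilon for exact fits
    return refresh_hz / intervals

print(effective_fps(60, 60))  # 60.0 -> smooth
print(effective_fps(59, 60))  # 30.0 -> snaps to half the refresh rate
```

Render just one fps under the refresh rate and every frame waits two vblanks, halving the displayed rate; variable refresh sidesteps this by moving the vblank to the frame instead.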


----------



## Glerox

deadchip12 said:


> The monitor has been out for a while. Could some owners provide their overall experience so far? Are you impressed with the hdr performance? Are there any aspect that annoy you? I made a mistake of skipping this monitor and purchasing an oled tv for gaming/movies but in the last few months I realize gaming on a tv has proven to be uncomfortable for me mostly due to my eyesight (I'm near-sighted and wearing glasses and the big size of the tv and the long distance from me to the tv cause me eyestrain and headache)


I've tried 12 monitors in the last two years, and this is hands down the best gaming monitor out there if you have at least a 1080 Ti, unless you do professional gaming.
I was waiting for the firmware update that was supposed to be user-upgradeable, but it's not coming out...
It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck at 120Hz), so I said FU$%? IT and sent it back to Asus for a firmware update.
We'll see how it goes... I will do a video review update when I receive it with the new firmware, so stay tuned!

On the negative side, blooming is a bit annoying in dark games, but we're at least two years away from high-refresh-rate mini-LED monitors.
The fan is annoying too, but that's the price to pay for a premium experience...


----------



## NewType88

Glerox said:


> deadchip12 said:
> 
> 
> 
> The monitor has been out for a while. Could some owners provide their overall experience so far? Are you impressed with the hdr performance? Are there any aspect that annoy you? I made a mistake of skipping this monitor and purchasing an oled tv for gaming/movies but in the last few months I realize gaming on a tv has proven to be uncomfortable for me mostly due to my eyesight (I'm near-sighted and wearing glasses and the big size of the tv and the long distance from me to the tv cause me eyestrain and headache)
> 
> 
> 
> I've tried 12 monitors in the last two years and this is hands down the best gaming monitor out there if you have at least a 1080 TI or unless you do professional gaming.
> I was waiting for the firmware update that what supposed to be user upgradeable but it's not coming out...
> It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck to 120Hz) so I said FU$%? IT, I sent it back to Asus for a firmware update.
> We'll see how it goes... I will do a video review update when I receive it with the new firmware so stay tuned!
> 
> On the negative side, blooming is a bit annoying in dark games but we're at least two years before high frame rate mini LEDs monitors.
> Also the fan is annoying too but that's the price to pay for a premium experience...
Click to expand...

2 years before mini-LED? I thought a 32” mini-LED monitor was coming out this year?


----------



## Glerox

NewType88 said:


> Glerox said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> The monitor has been out for a while. Could some owners provide their overall experience so far? Are you impressed with the hdr performance? Are there any aspect that annoy you? I made a mistake of skipping this monitor and purchasing an oled tv for gaming/movies but in the last few months I realize gaming on a tv has proven to be uncomfortable for me mostly due to my eyesight (I'm near-sighted and wearing glasses and the big size of the tv and the long distance from me to the tv cause me eyestrain and headache)
> 
> 
> 
> I've tried 12 monitors in the last two years and this is hands down the best gaming monitor out there if you have at least a 1080 TI or unless you do professional gaming.
> I was waiting for the firmware update that what supposed to be user upgradeable but it's not coming out...
> It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck to 120Hz) so I said FU$%? IT, I sent it back to Asus for a firmware update.
> We'll see how it goes... I will do a video review update when I receive it with the new firmware so stay tuned!
> 
> On the negative side, blooming is a bit annoying in dark games but we're at least two years before high frame rate mini LEDs monitors.
> Also the fan is annoying too but that's the price to pay for a premium experience...
> 
> Click to expand...
> 
> .
> 
> 2 years before mini led ? I thought a 32” mini led monitor was coming out this year ?
Click to expand...

I said high-refresh-rate mini-LED 😉
I mean a gaming mini-LED monitor. Not for at least two years, IMO.


----------



## NewType88

Glerox said:


> NewType88 said:
> 
> 
> 
> 
> 
> Glerox said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> The monitor has been out for a while. Could some owners provide their overall experience so far? Are you impressed with the hdr performance? Are there any aspect that annoy you? I made a mistake of skipping this monitor and purchasing an oled tv for gaming/movies but in the last few months I realize gaming on a tv has proven to be uncomfortable for me mostly due to my eyesight (I'm near-sighted and wearing glasses and the big size of the tv and the long distance from me to the tv cause me eyestrain and headache)
> 
> 
> 
> I've tried 12 monitors in the last two years and this is hands down the best gaming monitor out there if you have at least a 1080 TI or unless you do professional gaming.
> I was waiting for the firmware update that what supposed to be user upgradeable but it's not coming out...
> It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck to 120Hz) so I said FU$%? IT, I sent it back to Asus for a firmware update.
> We'll see how it goes... I will do a video review update when I receive it with the new firmware so stay tuned!
> 
> On the negative side, blooming is a bit annoying in dark games but we're at least two years before high frame rate mini LEDs monitors.
> Also the fan is annoying too but that's the price to pay for a premium experience...
> 
> Click to expand...
> 
> .
> 
> 2 years before mini led ? I thought a 32” mini led monitor was coming out this year ?
> 
> Click to expand...
> 
> I said high refresh rate mini LEDs 😉
> I mean a gaming mini LEDs monitor. Not before 2 years IMO.
Click to expand...

Yes, high refresh rate. According to TFT Central's high refresh rate IPS release outlook, a 144Hz 32" mini LED monitor is supposed to come out this year. But you will probably be right in the end, given how much stuff gets delayed.


----------



## Glerox

Can you post the link, please?


----------



## skupples

Too small. 4K should be 32"+.

Love the size of my Asus 10-bit 34" 4K.


----------



## NewType88

@Glerox. Here ya go. http://www.tftcentral.co.uk/articles/high_refresh_rate.htm#ips


----------



## Zenairis

skupples said:


> too small. 4k should be 32+
> 
> love the size of my asus 10bit 34 4k.


I have to disagree, at least until aliasing is nearly unnoticeable, which would take close to 8K at 27”. Even that is still not enough to make the pixels too small for the human eye to see. I'm running an X27 4K 27” and I still want a higher resolution on a 27” screen. I'm not exactly a fan of ultrawides, though.


----------



## skupples

True, but how close do you really sit to your screen? 

I just had to move, and am currently less than 18 inches from my screen, and it's waaaay too much. Adding another foot (sitting on the edge of the bed) makes it perfect.

Aliasing isn't really a cut-and-dried line you can measure.


Also, that whole 8K statement reminds me quite a bit of what was said about VR headsets, more so than screens.
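The size-versus-distance debate in this thread can be made concrete with a quick pixels-per-inch and viewing-distance calculation. A rough sketch (the 60-pixels-per-degree figure is a common rule of thumb for 20/20 acuity, not a hard limit):

```python
import math

def ppi(diag_in, w_px=3840, h_px=2160):
    """Pixels per inch of a 16:9 panel given its diagonal in inches."""
    return math.hypot(w_px, h_px) / diag_in

def acuity_distance_in(diag_in, ppd=60):
    """Viewing distance (inches) beyond which one pixel subtends less
    than 1/ppd degrees -- 60 px/deg is a common 20/20-vision rule of thumb."""
    pixel_in = 1.0 / ppi(diag_in)
    return pixel_in / math.tan(math.radians(1.0 / ppd))

for d in (27, 32, 40):
    print(f'{d}" 4K: {ppi(d):.0f} PPI, pixels blend beyond ~{acuity_distance_in(d):.0f} in')
```

By this estimate, 27" 4K (about 163 PPI) already hides individual pixels at roughly 21 inches, while 40" 4K needs around 31 inches, which lines up with the 2-3 foot desk distances people describe above.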


----------



## Zenairis

skupples said:


> true, but how close do you really sit to your screen?
> 
> I just had to move, and am currently less than 18 inches from my screen, n its waaaay too much. Adding another foot (sitting on the edge of the bed) makes it perfect.
> 
> aliasing isn't really a cut and dry flat line you can measure.
> 
> 
> also, that whole 8K statements reminds me quite a bit of what was said about VR sets, more so than screens.


I sit about 2-3 feet from my monitor, which is pretty close, but it's the sweet spot for this size.


----------



## kot0005

Zenairis said:


> I have to disagree until aliasing is almost not noticeable at all which is near 8k at 27”. That’s still not enough to make the pixels too small for the human eye to see. I’m running a X27 4K 27” and I still want a higher res on a 27” screen. I’m not exactly a fan of ultra wides though.


I dunno about 8K on 27 inches, but 4K at 27 inch is definitely a huge upgrade over 1440p. You don't need a 32 inch screen, though it will be bigger.

Don't believe anyone who says 27 inch 4Ks suck. They don't.


----------



## Zenairis

kot0005 said:


> Zenairis said:
> 
> 
> 
> I have to disagree until aliasing is almost not noticeable at all which is near 8k at 27”. That’s still not enough to make the pixels too small for the human eye to see. I’m running a X27 4K 27” and I still want a higher res on a 27” screen. I’m not exactly a fan of ultra wides though.
> 
> 
> 
> I dunno about 8k on 27inches but 4k on 27 ich is def a huge upgrade over 1440p. You dont need a 32 inch scree, it will be bigger tho.
> 
> Dont believe anyone who says 27inch 4k's suck. they dont.
Click to expand...

I'd plus one this post. As I previously mentioned, the X27 is an outstanding monitor, and given how much trouble they're having producing all of the high end gaming monitors, it might be a long time until a worthy successor comes out. AUO had originally discussed launching a 4K/144 32” IPS with a mini LED backlight of over 1,000 zones; however, they have not even managed to get the PG35VQ or the X35 on shelves yet.


----------



## toncij

Zenairis said:


> I’d plus one this post. As I had previously mentioned the X27 is an outstanding monitor and given how much trouble they’re having producing all of the high end gaming monitors it might be a long time until a worthy successor comes out. AUO had originally discussed launching a 4K/144 32” IPS micro LED with over 1000 backlights, however they have not even managed to get the PG35VQ or the X35 on shelves yet.


Both Asus and Acer fail to actually deliver what they promise and advertise, so I wouldn't hold my breath. If a monitor is announced for 2019, expect it in 2020 at the earliest, and more probably 2021.


----------



## skupples

I don't think anyone worth listening to said 4K at 27" sucks, just that some of us prefer a slightly larger screen. 

Yeah, I need to be at least 2-3 feet from my ProArt Asus 4K, but that's more due to viewing angles when sitting too close.


----------



## kot0005

skupples said:


> i don' think anyone worth listening to said 4k 27 sucks, just that some of us prefer a slightly larger screen.
> 
> Yeah, I need to be at least 2-3 feet from my ProArt Asus 4K, but that's more due to viewing angles when being too close.


I have seen a lot of people call 4K on 27 inch a gimmick, lol.


----------



## deadchip12

So I'm testing the monitor before deciding whether to purchase it or not. I can't seem to get the reference white nits option set right. At 52, HDR content looks pretty dim and bright highlights don't stand out much. Even dimmer than my OLED C8 at home, if I remember correctly. The brightness test video doesn't work either, as 400 nits and 1,000 nits look the same. If I turn it up well over 80, then it looks as bright as my OLED with dynamic tone mapping on.
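For context on the reference-white and nits discussion: HDR10 encodes luminance with the SMPTE ST 2084 (PQ) curve, which is heavily nonlinear, so a small change in code value near the top of the range spans a huge range of nits. A minimal sketch of the encode/decode math, with the constants taken straight from the spec:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in nits -> normalized PQ code value in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(code):
    """Normalized PQ code value in [0, 1] -> absolute luminance in nits."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)
```

100 nits sits at roughly code 0.51 and 1,000 nits at roughly 0.75, so a display that clips or tone-maps the top of the curve can make a 400-nit and a 1,000-nit test pattern look nearly identical, which may explain the test-video behavior described above.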


----------



## Glerox

It's been a week since I sent my Asus monitor in for the firmware update. Hope it won't be too long...


----------



## kot0005

Glerox said:


> It's been a week since i sent my Asus monitor for firmware update. Hope it wont be too long...


Plz update us. One week to flash firmware on a $2k monitor is a joke.


----------



## Glerox

kot0005 said:


> plz update us. 1 week to flask firmware on a $2k monitor is a joke.


Yes, will do. It is indeed a joke. I'm hoping to receive it this week, damn.
I've been playing Apex on an old 1080p 60Hz monitor for the last 2 weeks LOL.


----------



## acmilangr

Has anyone fixed the washed-out colors in HDR games? Devil May Cry 5 also has much better colors in SDR. Just look at the red logo on the start screen: in SDR it's red; if you enable HDR, it's more like orange....


----------



## deadchip12

acmilangr said:


> Does anyone have fixed the washed out colors on HDR games? Devil may cry 5 also have much better colors on SDR. Just look the red logo on start screen. On SDR is red, if you enable HDR it is like orange....


I don't play DMC. What other HDR games do you have problems with? I can test on both the monitor and my OLED TV to see if I get the same color problem.


----------



## Morkai

acmilangr said:


> Does anyone have fixed the washed out colors on HDR games? Devil may cry 5 also have much better colors on SDR. Just look the red logo on start screen. On SDR is red, if you enable HDR it is like orange....


I have never had that issue (release firmware); all HDR content has always looked perfect to me color-wise.
(Netflix HDR, Destiny 2, Battlefield V, Anthem, some demos, probably other games I forget; haven't tried DMC5.)


----------



## acmilangr

Morkai said:


> acmilangr said:
> 
> 
> 
> Does anyone have fixed the washed out colors on HDR games? Devil may cry 5 also have much better colors on SDR. Just look the red logo on start screen. On SDR is red, if you enable HDR it is like orange....
> 
> 
> 
> I have never had that issue, (release-firmware), all hdr content has always looked perfect for me colorwise.
> (netflix hdr, destiny2, battlefield v, anthem, some demos, probably other games i forget - haven't tried dmc5)
Click to expand...

I think the problem is in all games. For example, in ACO and Battlefield V you will see that the colors are washed out with HDR: red looks like orange, and none of the colors are deep.


----------



## Morkai

acmilangr said:


> I think the problem is an all games. For example ACO, Battlefield V. you will see that the colors are washed out with HDR. red is like orange and all colors are not deep.


Never happened to me; probably some Windows/settings/driver problem on your side?


----------



## deadchip12

My week-old PG27UQ suddenly developed this problem. Anyone know what this is? Looks like a local dimming zone is broken.


----------



## Glerox

deadchip12 said:


> My 1-week PG27UQ suddenly develops this problem. Anyone knows what this is? Looks like a local dimming zone is broken.


Wow... unlucky. Yup, it seems like an RMA case...


----------



## animeowns

Zenairis said:


> I have to disagree until aliasing is almost not noticeable at all which is near 8k at 27”. That’s still not enough to make the pixels too small for the human eye to see. I’m running a X27 4K 27” and I still want a higher res on a 27” screen. I’m not exactly a fan of ultra wides though.


You can always buy a 5K monitor at 27 inches. You can actually get one for about $785+ right now, 16:9, and it only needs one DP cable to get the full 5K:

https://www.ebay.com/itm/Planar-IX2...120115&hash=item590b7ab76b:g:ez8AAOSwt~1cSJp8


----------



## animeowns

acmilangr said:


> Turn it on first from windows settings, then run the game. You will see then the option on the game.
> 
> But this is another one game like assasins creed with washed out colors when HDR is enabled.....


Assassin's Creed isn't washed out for me with HDR enabled, at least Origins; I haven't tested Odyssey yet.


----------



## deadchip12

Glerox said:


> deadchip12 said:
> 
> 
> 
> My 1-week PG27UQ suddenly develops this problem. Anyone knows what this is? Looks like a local dimming zone is broken.
> 
> 
> 
> Wow... not lucky. Yup it seems like an RMA case...
Click to expand...

Asus gave me a new one yesterday. This one has the updated firmware (I don't care much about 144Hz anyway, so it doesn't do much for me), but it has slightly worse backlight bleed, at least 4 dead pixels (not noticeable), and one pretty obvious blue stuck pixel 😞


----------



## Cyber Locc

So I don't think it's any fault of the monitor, but I am having issues with Netflix when 4K/HDR is on, and was hoping someone here may have seen something about this. 

When I turn HDR and 4K on, for the Windows 10 Netflix app or for Netflix on Edge, the screen freaks out. I tried to screenshot it (Netflix won't let me), but I made a drawing: basically the top and bottom black bars both end up on the bottom, with some of the movie sometimes in between them, usually a stuck copy of a scene. It happens after watching HDR for a few minutes, or instantly if I try to skip around at all. 

I don't have issues with HDR anywhere else: local movies, YouTube, and the HDR test video are all fine; just Netflix is doing this. Oh, I have a 2080 Ti, btw.

Edit: Doing some testing, it doesn't happen on all Netflix videos either. It happens on Netflix originals like Bright and The Punisher; however, Next Gen and Sex Education do not suffer the same issues.


----------



## acmilangr

animeowns said:


> assassins creed isn't washed out for me with hdr enabled at least origins I haven't tested odyssey yet.


Try Forza Motorsport 7, DMC5, Assassin's Creed Odyssey, Anthem, Metro Exodus. All of them have washed-out colors.

DMC5 is the best example: just check the red color on the start screen of the game, then turn HDR off (just hit Alt-Tab). You will see that in SDR it's real red, while in HDR it was orange.
The same goes for all the games I mentioned. I have tried everything: 4:2:2, 4:4:4, 60Hz, 98Hz.


----------



## Talon2016

Glerox said:


> I've tried 12 monitors in the last two years and this is hands down the best gaming monitor out there if you have at least a 1080 TI or unless you do professional gaming.
> I was waiting for the firmware update that what supposed to be user upgradeable but it's not coming out...
> It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck to 120Hz) so I said FU$%? IT, I sent it back to Asus for a firmware update.
> We'll see how it goes... I will do a video review update when I receive it with the new firmware so stay tuned!
> 
> On the negative side, blooming is a bit annoying in dark games but we're at least two years before high frame rate mini LEDs monitors.
> Also the fan is annoying too but that's the price to pay for a premium experience...


What is the firmware update supposed to address?


----------



## acmilangr

Talon2016 said:


> Glerox said:
> 
> 
> 
> I've tried 12 monitors in the last two years and this is hands down the best gaming monitor out there if you have at least a 1080 TI or unless you do professional gaming.
> I was waiting for the firmware update that what supposed to be user upgradeable but it's not coming out...
> It pissed me off that I couldn't play at 144Hz in Apex Legends (stuck to 120Hz) so I said FU$%? IT, I sent it back to Asus for a firmware update.
> We'll see how it goes... I will do a video review update when I receive it with the new firmware so stay tuned!
> 
> On the negative side, blooming is a bit annoying in dark games but we're at least two years before high frame rate mini LEDs monitors.
> Also the fan is annoying too but that's the price to pay for a premium experience...
> 
> 
> 
> What is the firmware update supposed to address?
Click to expand...

The black crush issue at 144Hz, and about 600 nits in SDR.


----------



## Cyber Locc

acmilangr said:


> I think the problem is an all games. For example ACO, Battlefield V. you will see that the colors are washed out with HDR. red is like orange and all colors are not deep.


What settings are you using for SDR? Are you sure the colors are washed out and you're not over-saturating the SDR colors? If you're not, then something is wrong, because the HDR colors are spot on.

The colors of HDR on the 27UQ are only slightly over-saturated, pretty much reference; compared to my professionally calibrated projector, they are pretty much perfect. What you are calling incorrect colors are most likely actually correct colors, and what you're used to outside of HDR is not correct color. 

Try putting SDR on "Racing" mode; that is supposed to be the best stock calibration, and to my eyes it's pretty close to perfect as well.


----------



## acmilangr

Cyber Locc said:


> acmilangr said:
> 
> 
> 
> I think the problem is an all games. For example ACO, Battlefield V. you will see that the colors are washed out with HDR. red is like orange and all colors are not deep.
> 
> 
> 
> What settings are you using for SDR? Are you sure the colors are washed out, and your not over saturating the SDR colors? If your not, than something is wrong, because the HDR colors are perfectly on point.
> 
> The colors of HDR on the 27UQ are only slightly over saturated, pretty much reference, as compared to my Professionally calibrated PJ, they are pretty much perfect. What you are saying is not correct colors, are most likely actually correct colors, and your experience with them out of HDR are not correct color.
> 
> Try putting the SDR on "Racing" mode, that is supposed to be the best stock calibration, and to my eyes, is pretty close to perfect as well.
Click to expand...

Yes, I am comparing with the Racing preset. 
Just try Devil May Cry 5. Look at the red logo on the start screen: in SDR it's deep red; in HDR it's more like orange. 

HDR is supposed to have a wide color gamut, but it seems like it's sRGB....
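A quick way to see what "deep red versus orange" is about: the DCI-P3 red primary simply does not exist inside sRGB. A rough numpy sketch, using the standard D65 linear RGB-to-XYZ matrices (values rounded to 4 decimals, so treat the output as approximate):

```python
import numpy as np

# Linear RGB -> XYZ matrices (D65 white point), rounded to 4 decimals
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                        [0.2290, 0.6917, 0.0793],
                        [0.0000, 0.0451, 1.0439]])

red = np.array([1.0, 0.0, 0.0])

# Pure P3 red, expressed in linear sRGB coordinates
p3_red_in_srgb = np.linalg.inv(SRGB_TO_XYZ) @ P3_TO_XYZ @ red
print(p3_red_in_srgb)  # R > 1 and G, B < 0: outside the sRGB gamut

# Pure sRGB red, expressed in linear P3 coordinates
srgb_red_in_p3 = np.linalg.inv(P3_TO_XYZ) @ SRGB_TO_XYZ @ red
print(srgb_red_in_p3)  # roughly (0.82, 0.03, 0.02): well inside P3
```

Gamut mismatches in either direction, expanding sRGB values as if they were P3 or clipping P3 values into sRGB, shift saturated reds, which is consistent with the red-versus-orange difference described in this thread.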


----------



## Cyber Locc

acmilangr said:


> ο
> Yes. I am comparing with racing preset.
> Just try devil may cry 5. Look at the red logo on start screen. In SDR is deep red. In HDR is like orange.
> 
> HDR supposed to have wide color gamut but it seems like be SRGB....


I don't have that game, but will probably get it soon and will test.

Can you take pictures? Obviously screenshots will not work, but a photo would give us a reference.

Never mind. 

https://www.reddit.com/r/PS4/comments/ao4tgh/psa_turn_off_hdr_for_devil_may_cry_5/ is the first result, and there are hundreds more. The game's HDR is bad; there is nothing wrong with your monitor.

I really like the statement AnandTech uses in their review: 

"At the end of the day, not all HDR is made equal, which goes for the game-world and scene construction in addition to HDR support. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation." 

https://www.anandtech.com/show/13060/asus-pg27uq-gsync-hdr-review/9


No matter how nice the monitor, it can't make up for a badly coded game. 


Also, no need to take pics; I already found some. This guy is touting that the HDR is great, yet I can see the red-to-orange effect in his shots. 

SDR: https://i.imgur.com/xe5zwDp.jpg
HDR: https://i.imgur.com/6l3M57L.jpg

However, like I said, all the reviews have pegged the monitor's HDR color representation as essentially perfect. To verify that, they use software plus a colorimeter: the software specifies what color should be shown, and the meter reports what the display actually produces. You can get this equipment yourself and check/adjust your monitor to see if something is wrong with yours. 
That said, I don't think anything is wrong. HDR on Windows and in games is new, it's buggy, and the game devs are lazy. They are not coding HDR properly; it's an afterthought for a tiny percentage of the market. This will improve in time, but for now it is what it is.

Next time it may be worth a Google search before assuming it's your monitor. That game has terrible reviews of its HDR performance on all platforms and all displays. It's a garbage game for HDR.


----------



## acmilangr

I didn't blame the monitor; I just mentioned the HDR issue.
I think everyone has this issue, but most haven't noticed it. The same happens in Battlefield V, Assassin's Creed Odyssey, FM7, Metro Exodus, and others. 
I love the PG27UQ.


----------



## Glerox

So tomorrow it will be exactly 3 weeks since Asus RECEIVED my monitor for a firmware update. I've sent many emails; they only tell me to wait.
Am I wrong to consider this absolutely poor customer service for a firmware update on a $2,000 monitor (especially since the downloadable utility tool was never released)?
I want to file a complaint.


----------



## kot0005

Glerox said:


> So tomorrow it will be exactly 3 weeks since Asus RECEIVED my monitor for a firmware update. I've sent many emails, they only tell me to wait.
> Am I wrong to consider this absolutely poor customer service for a firmware update on a 2000$ monitor (also because the downloadable utility tool was never released)?
> I want to file a complaint.


WOW, dude. That's absolutely trash service. Not to mention the user-flashable firmware is 3 months behind the promised delivery date. I filed a complaint for mine because I kept the monitor thinking the firmware update would come; I would have returned it and waited for the refreshed batch if I had known...


----------



## Glerox

kot0005 said:


> WOW dude. That's absolutely trash service. Not to mention the user flashable firmware is 3 months behind the delivery date. I filed a complaint for mine because I kept the monitor thinking that the firmware updatye would come, I would have returned it and waited for the refreshed batch if I knew..


Yeah... Ok thanks for the support! It's a shame because it's not like we have a ton of choices lol.


----------



## tinykitten

Glerox said:


> So tomorrow it will be exactly 3 weeks since Asus RECEIVED my monitor for a firmware update. I've sent many emails, they only tell me to wait.
> Am I wrong to consider this absolutely poor customer service for a firmware update on a 2000$ monitor (also because the downloadable utility tool was never released)?
> I want to file a complaint.


Ouch. Three weeks is really pushing it. It's such a ****show for early adopters. Asus support has no idea what's going on, Asus JJ dodges Twitter questions regarding the utility tool, and I'm sure there's much more going on. Fast forward: the tool is never coming out, and people have to wait more than three weeks for a firmware update. Unless this gets noticed by a third party with a large audience, such as LTT, Asus won't care one bit. And I'd assume it's highly unlikely that an influential third party would report on this anyway.


----------



## Glerox

tinykitten said:


> Glerox said:
> 
> 
> 
> So tomorrow it will be exactly 3 weeks since Asus RECEIVED my monitor for a firmware update. I've sent many emails, they only tell me to wait.
> Am I wrong to consider this absolutely poor customer service for a firmware update on a 2000$ monitor (also because the downloadable utility tool was never released)?
> I want to file a complaint.
> 
> 
> 
> Ouch. Three weeks is really pushing it. It's such a ****show for early adopters. Asus support has no idea about what's going on, Asus JJ dodges twitter regarding the utility tool and I'm sure there's much more going on. Fast forward, the tool is never coming out and people have to wait more than three weeks for a firmware update. Unless this gets noticed by a third party with a large audience such as LTT Asus wont care one bit. And I'd assume it's highly unlikely that an influential third party would report about this anyway.
Click to expand...

I know, it's terrible... I will do a video on this once I receive it. I have a small PC channel; it's better than nothing, I guess.
I filed a complaint today; we'll see if it changes anything.


----------



## deadchip12

Having owned the monitor for 2 weeks now, I have to say the blooming has been bothering me a lot, and it's very noticeable even in a brightly lit room with all kinds of lighting: bias light, overhead light, you name it. The relatively high number of dimming zones doesn't help much. The low native contrast of the IPS panel and the backlight bleed really hurt the HDR experience. I guess I have to live with it. 

The monitor can be very bright, though, and this may be its biggest advantage over OLED or other VA TVs. I did some measurements, and its peak brightness can reach nearly double that of my OLED C7. Explosions in Uncharted 4 or the sun in HZD are searingly bright; it's brilliant. And there's no burn-in, and 4K at 27 inches up close is so sharp it blows a 55 inch TV out of the water.

So I've tried a 55" OLED, I've tried a 27" IPS FALD, and I'm still not satisfied. The one thing I haven't tried is a 49" VA FALD X900F. Maybe it will be the best of both worlds? But my bank account is empty now.


----------



## saltedham

The fan is what really annoys me on this monitor. I have to keep my case fans cranked up while just browsing so I don't hear the constant spinning up and down. It would have been nice if Asus had made the back fatter to make room for users to swap the fan for something quieter.


----------



## Fraizer

Did anyone buy this monitor in 2019? What firmware version do you have, compared to those of us who bought the monitor in June, or me in July with the updated firmware version?

Thank you for your feedback.


----------



## Fraizer

deadchip12 said:


> Asus gave me a new one yesterday. This one has the updated firmware (I don’t care much about 144hz anyway so it doesn’t do much for me), but a little worse backlight bleed and at least 4 dead pixels (not noticeable) and one pretty obvious blue stuck pixel 😞


Can you please tell us which firmware version you have, to see if it is an update newer than the July version?

Thank you, man.


----------



## deadchip12

Fraizer said:


> deadchip12 said:
> 
> 
> 
> Asus gave me a new one yesterday. This one has the updated firmware (I don’t care much about 144hz anyway so it doesn’t do much for me), but a little worse backlight bleed and at least 4 dead pixels (not noticeable) and one pretty obvious blue stuck pixel 😞
> 
> 
> 
> can you please tell us which firmware version you have ? to see if it is an update after the jully update version
> 
> thank you man
Click to expand...

Hmm, how do I check the firmware version? I just know it has extra options in the OSD to fix the 144Hz black crush, like Asus promised.


----------



## Cyber Locc

deadchip12 said:


> Hmm how to check the firmware version? I just know it has extra options in the osd to fix the 144hz black crush like what asus promise


I don't think we can check the firmware; I looked before. My unit definitely has the firmware update, as the serial shows it's the new model. 

What's the black crush fix setting?

Also, to help a lot with the blooming: if you turn brightness and contrast down, it almost goes away. I only notice it personally on a black screen with the mouse, and with reference white at 60, contrast at 40, and brightness at 40 in SDR, I can barely tell even on dark screens with the mouse. In movies, I never notice it at all. I think it's IPS glow; someone in another thread said the dimming algorithm is bad, and if that's the case it may be fixed in an update. It's also, as has been pointed out in this thread, a per-panel issue; some are worse than others. 




I don't know if this is a new model change as well, but I don't hear this fan, at all, ever. The only way I hear it is if I put my head behind the monitor, up against it, with complete silence in the room. My case fans, which are EK Vardars at 1,400 RPM, are 10x louder than this fan; I can hear those easily, and they are not loud or obnoxious, just audible. And merely pausing the movie my daughter was watching on our PS4, which is 8 feet from me, that thing is louder than my case fans. 

Like I said, I am not sure if the people with issues got duds or if there was a fan update, but for people buying in 2019 at least, I don't think the fan is an issue at all.


----------



## AngryLobster

Pretty sure the fan(s) have been physically updated, as has the fan profile. I went to check out someone else's recently bought PG27UQ, and theirs was way quieter than my bro's at both idle and load.


----------



## animeowns

Glerox said:


> So tomorrow it will be exactly 3 weeks since Asus RECEIVED my monitor for a firmware update. I've sent many emails, they only tell me to wait.
> Am I wrong to consider this absolutely poor customer service for a firmware update on a 2000$ monitor (also because the downloadable utility tool was never released)?
> I want to file a complaint.


Any news? Has Asus updated your firmware yet? I emailed them; they said the firmware update can only be done after sending my monitor in, and it would take about 7-10 days to complete.
I don't have any issues with my display, which I bought back in September of last year; the fan is silent (mine was manufactured in June, though). But I might sell my display just to get an ultrawide. I've never tried one, and I'm interested in seeing that 49 inch 5120x1440 @ 120Hz HDR 1000-nit VA.


----------



## Glerox

animeowns said:


> any news on asus has updated your firmware yet I emailed them they said the firmware update can only be done after sending my monitor in. And it would take about 7-10 days to complete it
> I don't have any issues with my display I bought back in september of last year the fan is silent mine was manufactured in june though but I might sale my display just to get an ultrawide never tried one and I'm interested in seeing that 49 inch 5120x1440 @ 120hz hdr 1000 nits va


No, I've emailed and started to call because I still haven't received it. It's going to be almost a month. I got to Asus's top support level on Friday, and apparently it should be done Monday. Please note that this is Asus Canada, and it sucks badly (at one point they thought I had sent in a notebook ***?!), but you may get better service from Asus USA or other countries. It should normally take 7-10 days, as you said.
I'm really mad about it, but there is nothing I can do apart from not buying Asus products anymore...


----------



## hmcindie

AngryLobster said:


> Pretty sure the fan(s) has been physically updated as has the fan profile. I went to check out someone else's recently bought PG27UQ and theirs was way quieter than my bros at both idle and load.


Maybe he is not using 144Hz and HDR; those heat it up more.


----------



## acmilangr

I want all the owners to run a test, please.
- Preferably set the brightness to maximum (I have the latest firmware with 600 nits in SDR, but I tried at 50% and it doesn't change the issue I'm seeing, so the old firmware is OK too).
- From the OSD, select Racing mode. 

- From Windows settings, turn HDR off. 

Now open this HDR video in a browser: https://youtu.be/74SZXCQb44s
(I also have it on my HDD; there is a site that gives it away for free.)
Go to 1:00 and check how deep the red color on that glass is. It seems fine, as 100% DCI-P3 gives a really wide color.

Now close the browser and turn HDR on in Windows settings. 
Open the video again and go to 1:00 again; make sure you choose HDR in the YouTube settings option. 

Notice that the glass is not as deep a red as it was with HDR off. It is more like orange. 
Why? What happened to the wide color gamut? Is this normal?


----------



## Cyber Locc

acmilangr said:


> I want all the owners to make a test please.
> -Better Set the brightness to maximum (I have the latest firmware with the 600nits SDR but I tried on 50% and it doesn't feel different on the issue I am seeing so if you have the old firmware will be OK)
> -from OSD select racing mode.
> 
> -from windows settings choose HDR to off.
> 
> Now open this HDR video on browser https://youtu.be/74SZXCQb44s
> (I have also on my HDD, there is a site that gives it for free)
> Now go to 1:00 and check how deep is the red color on that glass. It seems fine as DCI-P3 100% gives really wide color.
> 
> Now close the browser iand open HDR on windows settings.
> Open the video again and go again to 1:00,make sure you chose HDR from YouTube setting option.
> 
> Check that this glass has not deep red as was with HDR Off. It is like orange .
> Why? What happens the wide color gamut? Is this normal?


Well, kind of a bad example, don't you think? Molten glass, molten anything, is orange, not red; technically it's a reddish orange, but it's orange, not red.

There is a drastic difference between 50% and 100% to me; it's much more glowing hot orange and less orangish red. You can take a screenshot of the glass in SDR or HDR and compare it with the photo. The HDR is brighter, a lot brighter, and as someone who has seen melted glass in real life, that is accurate; the SDR image is not. 

Melted glass glows with heat; it becomes a black body, like the sun. It is a source of light, and that does not come through in SDR, but it does in HDR. There would not be shadowing in the melted glass like you see in the SDR image, because the molten glass is a light source, and shadowing on a light source is not possible.

The colors have not changed; the properties of the image have changed. Like I said, look at the other aspects: the lighting around the glass, the prism in the bottom glass, the light on the metal rail. I can tell you, having been around molten glass, the HDR is a lot closer than the SDR. You will never see shadowing on a light source, and molten glass is a light source.


----------



## acmilangr

Cyber Locc said:


> acmilangr said:
> 
> 
> 
> I want all the owners to make a test please.
> -Better Set the brightness to maximum (I have the latest firmware with the 600nits SDR but I tried on 50% and it doesn't feel different on the issue I am seeing so if you have the old firmware will be OK)
> -from OSD select racing mode.
> 
> -from windows settings choose HDR to off.
> 
> Now open this HDR video on browser https://youtu.be/74SZXCQb44s
> (I have also on my HDD, there is a site that gives it for free)
> Now go to 1:00 and check how deep is the red color on that glass. It seems fine as DCI-P3 100% gives really wide color.
> 
> Now close the browser iand open HDR on windows settings.
> Open the video again and go again to 1:00,make sure you chose HDR from YouTube setting option.
> 
> Check that this glass has not deep red as was with HDR Off. It is like orange .
> Why? What happens the wide color gamut? Is this normal?
> 
> 
> 
> Well kind of a bad example dont you think? Molten glass, molten anything is Orange not red, technically its a reddish orange, but its orange not red.
Click to expand...

The point is to compare SDR with HDR. Is it normal for SDR to have deeper colors than HDR? The same happens in other scenes and with other colors.

In SDR, if you change the wide color gamut setting to sRGB (in the OSD), it becomes orange.

Edit: I just saw that you edited your post.
If this is more accurate, then fine. But I am worried that the source is trying to show a real red and the monitor gets it wrong and outputs orange.
I hope you understand what I mean.


----------



## Cyber Locc

acmilangr said:


> The point is to compare the SDR with HDR. Is it normal SDR have more deep colors than HDR? the same happens in other scenes also and in other colors.
> 
> ON SDR if you change the wide color gamut to SRGB (on osd) it will become orange.



Yes, it's normal, because the SDR colors you are seeing are not accurate to real life; they are oversaturated and dimmed compared to what they should be. As I stated, you picked a perfect source: do you see how the glass is showing a reflection, and how there is shadowing? That's not possible in real life. Molten glass is a black-body radiator; it is radiating heat and light off its surface. In the SDR image, the camera has darkened the lighting and brought out reflections that exist underneath the immense light, but that is not what you would actually see in real life.

As to your question, easy answer,










See how the fish does the same thing: it looks brighter on the HDR side, more of an orange, as you say. That is the point of HDR; the HDR side is closer to reality.


----------



## acmilangr

Cyber Locc said:


> acmilangr said:
> 
> 
> 
> The point is to compare the SDR with HDR. Is it normal SDR have more deep colors than HDR? the same happens in other scenes also and in other colors.
> 
> ON SDR if you change the wide color gamut to SRGB (on osd) it will become orange.
> 
> 
> 
> 
> 
> Yes its normal, because the SDR colors you are seeing are not accurate to Real life, they are oversatured and dimmed to what they should be. As I stated, you used a perfect source, do you see how the glass is showing a reflection? Or how there is shadowing? Thats not possible in Real life, Molten Glass is a black body radiator it is radiating heat and light off its surface, the SDR image, the camera has darkened the lighting, and brought out reflections that are there underneath immense lighting, that is not actually the real life vision you will see however.
> 
> As to your question, easy answer,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See how the fish does the same thing, It looks brighter on the HDR side, more of an Orange as you say, That is the point of HDR, as the HDR side is closer to reality.
Click to expand...

Thanks for the answer.


----------



## Glerox

acmilangr said:


> Cyber Locc said:
> 
> 
> 
> 
> 
> acmilangr said:
> 
> 
> 
> I want all the owners to make a test please.
> -Better Set the brightness to maximum (I have the latest firmware with the 600nits SDR but I tried on 50% and it doesn't feel different on the issue I am seeing so if you have the old firmware will be OK)
> -from OSD select racing mode.
> 
> -from windows settings choose HDR to off.
> 
> Now open this HDR video on browser https://youtu.be/74SZXCQb44s
> (I have also on my HDD, there is a site that gives it for free)
> Now go to 1:00 and check how deep is the red color on that glass. It seems fine as DCI-P3 100% gives really wide color.
> 
> Now close the browser iand open HDR on windows settings.
> Open the video again and go again to 1:00,make sure you chose HDR from YouTube setting option.
> 
> Check that this glass has not deep red as was with HDR Off. It is like orange .
> Why? What happens the wide color gamut? Is this normal?
> 
> 
> 
> Well kind of a bad example dont you think? Molten glass, molten anything is Orange not red, technically its a reddish orange, but its orange not red.
> 
> Click to expand...
> 
> The point is to compare the SDR with HDR. Is it normal SDR have more deep colors than HDR? the same happens in other scenes also and in other colors.
> 
> ON SDR if you change the wide color gamut to SRGB (on osd) it will become orange.
> 
> Edit ; now I saw that you edit your post.
> If this is more accurate then fine. But I am worried if the source try to give real red color and the monitor do that wrong and outputs orange.
> I hope you understand what I Mean
Click to expand...

You answered your own question. In SDR, if you want accurate colors, you have to set it to sRGB. If you choose wide color gamut, it will show inaccurately saturated colors (red in this case), because in SDR the source content is always mapped to sRGB.
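A minimal sketch of what that mismatch does to a pure red, using the standard published sRGB and Display P3 RGB-to-XYZ matrices (D65 white point; the helper name `to_xy` is just for this example):

```python
# Why sRGB content looks oversaturated when a wide-gamut monitor treats the
# incoming RGB values as native-gamut values: the same (1, 0, 0) triple lands
# on a different, more saturated chromaticity.

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
P3_TO_XYZ = [
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
]

def to_xy(matrix, rgb):
    """Convert linear RGB to CIE xy chromaticity via the given RGB->XYZ matrix."""
    X, Y, Z = (sum(m * c for m, c in zip(row, rgb)) for row in matrix)
    s = X + Y + Z
    return (round(X / s, 3), round(Y / s, 3))

red = (1.0, 0.0, 0.0)
print(to_xy(SRGB_TO_XYZ, red))  # sRGB red primary: (0.640, 0.330)
print(to_xy(P3_TO_XYZ, red))    # same RGB shown as P3: (0.680, 0.320), more saturated
```

The monitor's sRGB mode remaps content so the red stays at the sRGB primary; wide-gamut mode stretches it out to the P3 primary, which is the oversaturation being described.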


----------



## AngryLobster

I bought a PG27UQ to compare to my X27 and am pretty disappointed in its color uniformity. The entire left side is warm, the right side green, and it's super obvious browsing the web.

Otherwise it's way superior to my X27 in terms of backlight bleed and IPS glow when both have FALD off for testing. In fact, I think it's the best IPS panel I've ever seen in that regard.

EDIT: There is definitely some DSE/vertical banding on this Asus. It looks like it lines up with the FALD grid, but it's not visible on my Acer. When panning uniform colors it's pretty obvious (like the sky in AC: Origins).

Also, this monitor has an Oct 2018 production date, and its fan, in comparison, is unbearable.


----------



## deadchip12

Maybe I should have gone for a VA panel instead of this IPS monitor. The blooming is crazy. Here is a video to demonstrate: https://twitter.com/4everBeAKid/status/1110482879404863488


----------



## AngryLobster

That is the shortcoming of IPS. Like I've said before, IPS and HDR do not belong in the same sentence.

I don't mind the blooming because I've owned FALD TVs for four years now, and it comes with the territory. It's just more severe on these monitors.


----------



## kot0005

acmilangr said:


> I want all the owners to make a test please.
> -Better Set the brightness to maximum (I have the latest firmware with the 600nits SDR but I tried on 50% and it doesn't feel different on the issue I am seeing so if you have the old firmware will be OK)
> -from OSD select racing mode.
> 
> -from windows settings choose HDR to off.
> 
> Now open this HDR video on browser https://youtu.be/74SZXCQb44s
> (I have also on my HDD, there is a site that gives it for free)
> Now go to 1:00 and check how deep is the red color on that glass. It seems fine as DCI-P3 100% gives really wide color.
> 
> Now close the browser iand open HDR on windows settings.
> Open the video again and go again to 1:00,make sure you chose HDR from YouTube setting option.
> 
> Check that this glass has not deep red as was with HDR Off. It is like orange .
> Why? What happens the wide color gamut? Is this normal?


Still red for me ;/ I have the first firmware.


----------



## acmilangr

kot0005 said:


> acmilangr said:
> 
> 
> 
> I want all the owners to make a test please.
> -Better Set the brightness to maximum (I have the latest firmware with the 600nits SDR but I tried on 50% and it doesn't feel different on the issue I am seeing so if you have the old firmware will be OK)
> -from OSD select racing mode.
> 
> -from windows settings choose HDR to off.
> 
> Now open this HDR video on browser https://youtu.be/74SZXCQb44s
> (I have also on my HDD, there is a site that gives it for free)
> Now go to 1:00 and check how deep is the red color on that glass. It seems fine as DCI-P3 100% gives really wide color.
> 
> Now close the browser iand open HDR on windows settings.
> Open the video again and go again to 1:00,make sure you chose HDR from YouTube setting option.
> 
> Check that this glass has not deep red as was with HDR Off. It is like orange .
> Why? What happens the wide color gamut? Is this normal?
> 
> 
> 
> Still red for me ;/ I have the first firmware.
Click to expand...

Really weird. I was talking with someone else who has an OLED TV, and he told me it was orange in HDR.


----------



## Cyber Locc

AngryLobster said:


> I bought a PG27UQ to compare to my x27 and am pretty disappointed in it's color uniformity. The entire left side is warm, right side green and it's super obvious browsing the web.
> 
> Otherwise it's way superior to my x27 in terms of backlight bleed and IPS glow when both have FALD off for testing. In fact I think it's the best IPS panel I've ever seen in that regard.
> 
> EDIT: There is definitely some DSE/vertical bands on this Asus. Looks like it lines up with the FALD grid but it's not visible on my Acer. When panning uniform colors it's pretty obvious (like the sky in AC:Origins).
> 
> Also the monitor has a Oct 2018 production date and the fan in comparison is unbearable.



Yeah, I would send that one back and get it replaced. Mine does not have that color uniformity issue; AnandTech actually touted the color uniformity as perfect: https://www.anandtech.com/show/13060/asus-pg27uq-gsync-hdr-review/10 Of course, then you take the chance of more glow.

I think we are slowly finding out that the same applies to these as to the past Swifts and Predators: the panel lottery is still very real. It sucks that QC is still not ironed out.


The fan thing really sucks too. I am glad I held off on buying one until the new year; my unit has a January date, and its fan is silent.


----------



## AngryLobster

Yeah, it's going back. It's really only white uniformity that's the issue (warm left, greenish right), and it's completely unnoticeable in content/games, but that's a huge flaw for something this expensive.

Tons of IPS glow/bleed but nice uniform whites.

Moderate IPS glow/bleed but vertical bands.

Little IPS glow/bleed but terrible white uniformity.

Add in the fan-noise RNG on top of any of those three scenarios, and it makes finding a "good" one tough.


----------



## Cyber Locc

AngryLobster said:


> Yeah it's going back. It's really only white uniformity that's the issue (warm left, greenish right) and it's completely unnoticeable in content/games but that's a huge flaw for something this expensive.
> 
> Tons of IPS glow/bleed but nice uniform whites.
> 
> Moderate IPS glow/bleed but vertical bands.
> 
> Little IPS glow/bleed but terrible white uniformity.
> 
> Add in the fan noise RNG into either of those 3 scenarios and it makes finding a "good" one tough.


Well, I feel extremely lucky now, especially since I bought the last one Newegg had when they were on sale for $1,650, I think it was. I thought, shoot, this thing had better be damn perfect lol. I was even more worried because the box was a hot mess; FedEx was not nice to it. From what I have seen, Newegg still hasn't gotten them back in stock; they have an open-box one for $1,700.


----------



## Glerox

Anyone having problems with BF V? I got my PG27UQ back, and the game is just unplayable: 20 FPS in the menu and 50 FPS in game. Really weird; I haven't found the issue yet.


----------



## Glerox

Glerox said:


> Anyone has problems with BF V? I got my PG27UQ back and the game is just unplayable. Getting 20 FPS in the menu and 50 FPS in the game. Really weird I haven't found the issue yet.


Never mind, the game's render resolution had changed itself to 200%... so basically I was running 8K on a single card lol.
Btw, the firmware update is worth it!

-No more black crush. I can now play Apex in [email protected]
-No more message pop-up when switching from SDR to HDR and vice versa
-Higher maximum brightness in SDR


----------



## Cyber Locc

Glerox said:


> Glerox said:
> 
> 
> 
> Anyone has problems with BF V? I got my PG27UQ back and the game is just unplayable. Getting 20 FPS in the menu and 50 FPS in the game. Really weird I haven't found the issue yet.
> 
> 
> 
> Nevermind the game render resolution changed itself to 200%... so basically I was running 8K on a single card lol.
> Btw, the firmware update is worth it!
> 
> -No more black crush. I can now play Apex in [email protected]
> -No more message pop-up when switching from SDR to HDR and vice-versa
> -higher maximum brightness in SDR
Click to expand...

What?? I have the new firmware, but I still get a message when switching from SDR to HDR.


----------



## acmilangr

There is an option in the OSD to show the HDR notification or not.


----------



## kot0005

Glerox said:


> Nevermind the game render resolution changed itself to 200%... so basically I was running 8K on a single card lol.
> Btw, the firmware update is worth it!
> 
> -No more black crush. I can now play Apex in [email protected]
> -No more message pop-up when switching from SDR to HDR and vice-versa
> -higher maximum brightness in SDR


any damage or scratches to the monitor ?


----------



## acmilangr

kot0005 said:


> Glerox said:
> 
> 
> 
> Nevermind the game render resolution changed itself to 200%... so basically I was running 8K on a single card lol.
> Btw, the firmware update is worth it!
> 
> -No more black crush. I can now play Apex in [email protected]
> -No more message pop-up when switching from SDR to HDR and vice-versa
> -higher maximum brightness in SDR
> 
> 
> 
> any damage or scratches to the monitor ?
Click to expand...

Scratches? Why?


----------



## animeowns

Glerox said:


> Anyone has problems with BF V? I got my PG27UQ back and the game is just unplayable. Getting 20 FPS in the menu and 50 FPS in the game. Really weird I haven't found the issue yet.


So did the firmware update fix the black crush? Was it really worth sending your monitor in? Asking because they gave me the option to send mine in, but I don't want a good panel damaged by Asus.


----------



## animeowns

acmilangr said:


> Scratches? Why?


Shipping. You have to send the panel to Asus for the repair.


----------



## acmilangr

animeowns said:


> Glerox said:
> 
> 
> 
> Anyone has problems with BF V? I got my PG27UQ back and the game is just unplayable. Getting 20 FPS in the menu and 50 FPS in the game. Really weird I haven't found the issue yet.
> 
> 
> 
> so did the firmware update fix the black crush was it really worth it sending your monitor in? asking because they gave me the option to send mine end but I don't want what is a good panel damaged by asus.
Click to expand...

It also gives about double the brightness in SDR.


----------



## Glerox

animeowns said:


> so did the firmware update fix the black crush was it really worth it sending your monitor in? asking because they gave me the option to send mine end but I don't want what is a good panel damaged by asus.





animeowns said:


> shipping you have to send the panel to asus for repair.





acmilangr said:


> It also gives about double brightness on SDR.


Yes, it's worth it, but damn, it was a long and painful month waiting for it...
No scratches, no damage on the panel, AFAIK.

I did a video on the firmware update and my nine-month review of the panel if you'd like:





YouTube is a nice hobby


----------



## animeowns

Glerox said:


> Yes it's worth it but damn, it was a long and painful month to wait for it...
> no scratches, no damages on the panel AFAIK
> 
> I did a video on the firmware update and my 9 months review of the panel if you'd like :
> https://www.youtube.com/watch?v=pqDGGD3Mj74&feature=youtu.be
> 
> Youtube is a nice hobby


Thanks for the review. After my C9 gets here, I will be sending in my PG27UQ for the firmware update.


----------



## Glerox

animeowns said:


> thanks for the review after my C9 gets here I will be sending in my pg27uq for the firmware update.


Maybe Asus USA will be faster. The normal turnaround time should be 7 to 10 days.


----------



## ggp759

I had the Acer X27 and had to send it back due to excessive blooming in HDR. Can any of the owners please play this clip on their monitor






in HDR, of course, and post some pics if it's not too much trouble? I want to try the Asus one after the Acer; I don't know if it's any different.


I played the above video on my Acer with 80 nits peak white, FALD on, and FALD response set to gaming (I think on the Asus it's called fast).

For reference, here are the pics of the Acer with that clip. And yes, I know; I adjusted the exposure of the camera to reflect what I actually see. There's no camera exaggeration. Thanks a lot, guys. (It's around the 0:36 mark in the video.)


----------



## Seyumi

@ggp759

That does look pretty bad if that's how it looks in person. I'm still glad I decided to hold off for the 55" Alienware OLED instead of taking the $2,000 plunge on these first-generation (tiny 27") 4K 144Hz FALD displays.


----------



## kot0005

animeowns said:


> shipping you have to send the panel to asus for repair.


Not just shipping; it's the technicians that do most of the bad stuff.


----------



## deadchip12

ggp759 said:


> I had the Acer X27 and had to send it back due to excessive blooming in HDR. Can any of the owners please play on their monitor this clip
> 
> 
> 
> 
> 
> 
> in HDR of course and post some pics if its not too much trouble. I want to try the Asus one after the Acer I dont know if its any different.
> 
> 
> I played the above video on my Acer with 80nits peak white, fald on and fald response to gaming ( i think on the Asus is fast).
> 
> For reference here are the pics of the Acer with that clip. And yes i know i adjusted the exposure of the camera to reflect what i actually see. There's no camera exaggeration. Thanks a lot guys. (Its around the 0:36 mark for the video)


It looks like this on my PG27UQ (please see the attached pics). It's not as super obvious as yours, but still very noticeable. If you think this is bad, check out this video: https://youtu.be/E3Bf3mq1Or8. My monitor looks like a complete mess running it. Feels bad spending $3,000 on this, man. Should have bought a VA TV.


----------



## ggp759

deadchip12 said:


> It looks like this on my pg27uq (pls see the pics attached). It’s not as super obvious as yours, but still very noticeable. If you think this is bad, check out this video: https://youtu.be/E3Bf3mq1Or8. My monitor looks like a complete mess running it. Feels bad spending $3000 on this man. Should have bought a va tv.


Thank you so much for this. This is miles better than mine. Yes, I know what you mean with the video that you posted; I watched something similar (night in Vegas in HDR), and it was an absolute disaster. Thinking of getting the Asus now... Is the sound from the fan that terrible?


----------



## deadchip12

ggp759 said:


> deadchip12 said:
> 
> 
> 
> It looks like this on my pg27uq (pls see the pics attached). It’s not as super obvious as yours, but still very noticeable. If you think this is bad, check out this video: https://youtu.be/E3Bf3mq1Or8. My monitor looks like a complete mess running it. Feels bad spending $3000 on this man. Should have bought a va tv.
> 
> 
> 
> Thank you so much for this. This is miles better than mine. Yes i know what you mean with the video that you posted. I watched something similar ( night in vegas with hdr) and it was an absolute disaster. Thinking of getting the Asus now... Is the sounnd from the fan that terrrible?
Click to expand...

I don’t think the fan makes any noise at all lol, but I wear headphones and my room is not exactly quiet, so I’m not sure (though I did put my ear near the back of the monitor and heard nothing).

The backlight bleed is pretty bad though (see pics), and I have exchanged it once. Both the old and the new monitor have backlight bleed and dead/stuck pixels. I’m asking Asus to give me a third one, and if it is still like this, I will ask for my money back.


----------



## Lockjaw333

deadchip12 said:


> I don’t think the fan makes any noise at all lol but i wear headphone and my room is not exactly quiet so I’m not sure (though I did put my ears near the back of the monitor and heard nothing).
> 
> The backlight bleed is pretty bad though (see pics), and I have exchanged once. Both the old and new monitor have backlight bleed and dead/stuck pixels. I’m asking Asus to give me a 3rd one and if it is still like this then I will ask them to return my money


That picture looks like mostly IPS glow. Bleed would manifest as patches, usually of a yellowish color, that don't change with head movement. For example, if you view the monitor showing a black screen from an extreme angle, they would still be visible, and they are usually at the edges (typically from the bezel being fastened too tightly).

Does turning on FALD eliminate it?

Also is it possible to take a picture of your dead pixel(s)?


----------



## deadchip12

Lockjaw333 said:


> deadchip12 said:
> 
> 
> 
> I don’t think the fan makes any noise at all lol but i wear headphone and my room is not exactly quiet so I’m not sure (though I did put my ears near the back of the monitor and heard nothing).
> 
> The backlight bleed is pretty bad though (see pics), and I have exchanged once. Both the old and new monitor have backlight bleed and dead/stuck pixels. I’m asking Asus to give me a 3rd one and if it is still like this then I will ask them to return my money
> 
> 
> 
> That picture looks like mostly IPS glow. Bleed would manifest as like patches of usually a yellowish color that dont change with head movement. For example if you view the monitor on a black screen from an extreme angle they would still be visible- and they are usually at the edges (typically from the bezel being fastened too tightly.
> 
> Does turning on FALD eliminate it?
> 
> Also is it possible to take a picture of your dead pixel(s)?
Click to expand...

Attached is a picture with FALD turned on. Most of the glow/BLB is gone, but the monitor does not look completely black. The thing is, in real content, when bright objects are near the edges or corners of the monitor, FALD cannot help, and the haloing is intensified by that glow and looks horrible.

I can't take a pic of the dead pixels because they are too small and pretty much unnoticeable, but I attached a pic of the stuck pixel, which is very obvious.


----------



## acmilangr

They could (and should) implement a clever algorithm to reduce the blooming, like many Samsung TVs do.

When there is a small non-black area in a black field (like a star), the backlight LEDs in that area could avoid going to full brightness.
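As a rough illustration of that idea (not any vendor's actual algorithm; the thresholds here are invented for the example):

```python
# Cap a zone's backlight when only a small fraction of its pixels are bright,
# trading highlight brightness for less visible blooming around small objects.

def zone_backlight(pixel_levels, bright_thresh=0.8, area_thresh=0.05, cap=0.4):
    """pixel_levels: luminance values (0..1) of all pixels in one FALD zone."""
    peak = max(pixel_levels)
    bright_fraction = sum(p >= bright_thresh for p in pixel_levels) / len(pixel_levels)
    if peak >= bright_thresh and bright_fraction < area_thresh:
        # A tiny highlight (e.g. a star) in a mostly dark zone: limit the LED
        # so the surrounding black doesn't glow, at the cost of highlight punch.
        return min(peak, cap)
    return peak

# A single bright "star" among dark pixels gets a capped backlight:
zone = [0.02] * 99 + [1.0]
print(zone_backlight(zone))          # 0.4
# A mostly bright zone drives the LED at full peak:
print(zone_backlight([1.0] * 100))   # 1.0
```

This is exactly the trade-off mentioned later in the thread: aggressive versions of this crush small highlights, gentler ones let the bloom through.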


----------



## Cyber Locc

acmilangr said:


> They could (and must) make a clever algorithm to reduce the blooming like many Samsung TV.
> 
> When in a black space there is a small area that is not black (like a star) the backlight leds on that area could not bright too much


Most of the Samsung TVs don't have some superior algorithm; they just don't get nearly as bright, and they have VA panels. We have a Sammy in the bedroom that doesn't bloom that badly; it's also 600 nits peak brightness, so about half of this monitor lol.

Blooming is unavoidable with FALD: the brighter the highlights, the worse it will be, and IPS makes it a tad worse as well.

The only way you are not going to have to deal with blooming is buying OLED.

To help manage it, turn the brightness down; that's about all you can do. If it still bothers you, try a TV, but I think you will find it has similar issues, plus its own issues on top.


----------



## deadchip12

Cyber Locc said:


> acmilangr said:
> 
> 
> 
> They could (and must) make a clever algorithm to reduce the blooming like many Samsung TV.
> 
> When in a black space there is a small area that is not black (like a star) the backlight leds on that area could not bright too much
> 
> 
> 
> Most the Samsung TVs don't have some superior algorithm, they just don't get nearly as bright, and they have VA panels. We have a Sammy in the bedroom, that doesn't bloom that bad, it's also 600nits peak brightness, so like half the monitor lol.
> 
> Blooming is unavoidable with FALD, the brighter the lights, the worse it will be, IPS makes it a tad worse as well.
> 
> The only way you are not going to have to deal with blooming is buying LED.
> 
> To help manage it, turn the brightness down, that's about all you can do. If it still bothers you, try a TV, but I think you will find it has similar issues, and it's own issues to add.
Click to expand...

Yeah, some Samsung TVs have a very aggressive dimming algorithm that completely dims small bright highlights and crushes details. However, I considered a Sony VA TV instead, like the X900F. That one has only 40 zones but 5x the contrast ratio compared to this monitor, and Sony's dimming algorithm receives praise, so blooming may be less noticeable. I can't return the monitor though, and no one in my area will want to buy such an expensive monitor if I ever resell it. So I guess I'm stuck with this for now.


----------



## Cyber Locc

deadchip12 said:


> Yeah some samsung tvs have very agressive dimming algorithm that completely dims small bright highlights and crushes details. However, I considered a sony va tvs instead, like the x900f. That one has only 40 zones but 5x the contrast ratio compared to this monitor and sony’s dimming algorithm seems to receive praises so blooming maybe less noticeable. I can’t return the monitor though, and no one in my area will want to buy such an expensive monitor if I ever resell it. So I guess I’m stuck with this for now



Yeah, I don't know why people think this: https://youtu.be/VJwub9mnhuw?t=168 It's going to bloom even worse, tbh; maybe not as bright a bloom, but a larger blooming area. It's the nature of FALD; you are not going to escape it. Look at that video with the fixed backlight: the blooming from the square takes up half the screen lol.

To make things even worse, look at that bloom trail. You guys really need to go to a TV store and look at these FALDs; they bloom just as badly as or worse than our monitors. If you don't want blooming, buy OLED. That is simply your only option; no algorithm is going to remove it.


----------



## ToTheSun!

1000+ zone miniLED VA might be pretty good with blooming, to be fair.


----------



## Cyber Locc

ToTheSun! said:


> 1000+ zone miniLED VA might be pretty good with blooming, to be fair.


It likely would, but does it exist? No, because it's extremely cost-prohibitive, and they can't scale it to a reasonable price yet.

You also have to specify where those 1,000 zones go, i.e. on what size panel. I attached a pic to visualize: those are both 40 LEDs.

Stating "This TV/monitor has 1,000 LEDs!" is irrelevant information unless you compare it to a panel of the same size and its LED count.


A 384-zone, 27-inch VA monitor may improve blooming a little, but not by that much.

A 1,000-zone 65-inch TV's local dimming will not be superior to a 384-zone 27-inch screen's. The entire back of the screen must be lit.


There are about 310 one-inch squares in a 27-inch monitor; a 65-inch TV has about 1,808, which means a 65-inch TV would need roughly 2,260 LEDs just to match the monitor from a blooming-radius standpoint, assuming both were IPS. Just as with pixels, where the total count is irrelevant and pixels per inch is what matters, the total LED count is irrelevant; what matters is LEDs per inch.


So, to set a realistic expectation: hover your mouse over a black screen on this monitor, and you get a bloom of lighter blacks around it of an inch, maybe an inch and a half if it sits over one LED. Do the same on that TV and watch a five-inch circle appear around the cursor.
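The back-of-the-envelope math above can be sketched in a few lines of Python (a rough illustration assuming 16:9 panels; `zones_per_sq_in` is just a helper name for this example, not anything from the monitor specs):

```python
# Zone density, not zone count, is what determines blooming radius.

def panel_area_sq_in(diagonal_in, aspect=(16, 9)):
    """Screen area in square inches from the diagonal, via Pythagoras."""
    w, h = aspect
    diag_units = (w * w + h * h) ** 0.5
    width = diagonal_in * w / diag_units
    height = diagonal_in * h / diag_units
    return width * height

def zones_per_sq_in(zones, diagonal_in):
    return zones / panel_area_sq_in(diagonal_in)

print(round(panel_area_sq_in(27)))          # 312 sq in (the post estimates ~310)
print(round(panel_area_sq_in(65)))          # 1805 sq in (the post says ~1808)
print(round(zones_per_sq_in(384, 27), 2))   # PG27UQ: 1.23 zones per sq in
print(round(zones_per_sq_in(1000, 65), 2))  # 65" 1,000-zone TV: 0.55, under half
```

So 384 zones on 27 inches works out to roughly 1.23 zones per square inch, while 1,000 zones on 65 inches is only about 0.55, which is the point being made: zone count only means something relative to panel area.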


----------



## ToTheSun!

Cyber Locc said:


> It likely would, does it exist? No, because its extremely cost prohibitive, and they cant scale it to a reasonable price yet.
> 
> You also have to specify where the 1000, like what size monitor. I attached a pic to visualize. Those are both, 40 leds.
> 
> Stating "This TV/Monitor has 1000 LEDS!" is irrelevant information outside of comparing it to a panel with the same size, and how many LEDs it has.
> 
> 
> A 384 Zone, 27 inch VA monitor, may improve blooming a little bit, but not by that much.
> 
> a 1000 zone 65 inch TVs Local dimming will not be superior to a 384 zone 27 inch screen. The entire back of the screen must be lit.
> 
> 
> There are 310 about 1 inch squares in a 27 inch monitor, a 65inch TV has 1808, which means a 65 inch TV would need 2260 LEDs to even match the monitor on a blooming radius standpoint, assuming both were IPS. You need to realize that just like with Pixels, the Amount of Pixels you have is irrelevant, its how many Pixels per Inch, that matters. Same here, how many LEDs the screen has is irrelevant, how many LEDs does it have per Inch.
> 
> 
> So to put it in a realistic expectation, when you hover your mouse on a black screen on the monitor, you have a bloom around it of lighter blacks, by an inch, maybe inch and a half if over 1 led. Do that on that TV, and watch their be a 5 inch circle around the mouse cursor.


Well, based on what I've just read, you probably think PPI is a good metric.

Zones are zones, regardless of display size. Big display, big distance. Small display, small distance. Trigonometry scales size perception for both the display and the LED zone sizes. That's not a real point to discuss.

Moving on, in regard to availability, AUO has already announced a 1000+ zone miniLED display for the near future. I mentioned this in response to your claim that one MUST go for OLED if one wants a virtually bloom-free presentation. In theory, only emissive displays can achieve that, but such a 1000+ zone VA display with good native contrast might get close enough in practice.


----------



## Cyber Locc

ToTheSun! said:


> Well, based on what I've just read, you probably think PPI is a good metric.
> 
> Zones are zones, regardless of display size. Big display, big distance. Small display, small distance. Trigonometry scales size perception for both the display and the LED zone sizes. That's not a real point to discuss.
> 
> Moving on, in regard to availability, AUO already announced 1000+ zone miniLED display for the near future. I mentioned this in response to your claim that one MUST go for OLED if one wants virtually bloom-free presentation. In theory, only emissive displays can achieve that, but such a 1000+ zone VA display with good native contrast might get close enough in practice.


PPI is the only metric worth a dang, until you get into diminished returns due to distance from the screen. The same applies here, if you are not okay with a 1inch bloom, why would you be okay with a 6 inch bloom. 

Of course, distance from the screen plays a role, but are we sitting 10 ft from our displays? No, we are not.

The issue with your big-display, big-distance analogy is that it assumes a PC user sits farther away as the display gets larger. They don't; people with 65" monitors are still 3 ft from the screen.


We have a "Faux" 4k Projector, and you are hard pressed to tell the difference between 4k and 1080p, when your sitting 10ft from a 130inch screen, however when you are sitting 5ft away you can tell. Which goes in practice to your reply, however when dealing with a TV as a monitor, and your the same 3 feet away your theorizing is lost, as the distance is not scaling with the size. Your theory basis on the premise that end users, know or care how far they should sit from a given display size, they dont and they wont rearrange their house for it to be right. 


That said, 1000 zones at the same size and distance will be superior at reducing the blooming effect; in 3-4 years, when those come out, we can see. Today, which is what my reply to him was based on, these monitors are the least-blooming desk monitors we can get. And a TV outside of OLED is not going to remove the issue of blooming, nor will the miniLEDs; they will just reduce it some. Blooming will be a thing until every pixel is self-lit, i.e. OLED. Any number of zones will only reduce the effect; it will never remove it.
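The zones-per-area argument from the earlier post can be put into numbers. A quick sketch (the PG27UQ's 384-zone FALD count comes from its spec sheet, not this thread, and 16:9 geometry is assumed):

```python
import math

def screen_area_sq_in(diag_in, aspect=16 / 9):
    """Viewable area of a 16:9 panel computed from its diagonal."""
    width = diag_in * aspect / math.hypot(aspect, 1)
    return width * (width / aspect)

def zones_for_parity(zones_small, diag_small, diag_big):
    """Zones a larger panel needs to match the smaller panel's zones per square inch."""
    density = zones_small / screen_area_sq_in(diag_small)  # zones per sq in
    return math.ceil(density * screen_area_sq_in(diag_big))

print(round(screen_area_sq_in(27)))    # ≈ 311 sq in
print(round(screen_area_sq_in(65)))    # ≈ 1805 sq in
print(zones_for_parity(384, 27, 65))   # ≈ 2226 zones for equal zone density
```

That lands in the same ballpark as the ~2260 figure quoted earlier; the exact number depends on rounding and the assumed zone count.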


----------



## ToTheSun!

Cyber Locc said:


> PPI is the only metric worth a dang [...] The issue with your big-display, big-distance analogy is that it assumes a PC user sits farther away as the display gets larger. They don't; people with 65" monitors are still 3 ft from the screen.


Well, someone has to give chiropractors a job.



Cyber Locc said:


> in 3-4 years


One of the displays with 1000+ zones was shown at CES this year.


----------



## deadchip12

Cyber Locc said:


> https://youtu.be/VJwub9mnhuw?t=168
> 
> It's going to bloom even worse tbh, maybe not as bright of a bloom, but a larger blooming area.


“Not as bright” is the key point here. With some bias lighting, the bloom on the X900F may not even be noticeable (per users and pro reviews that I've read), while on this monitor it's still very noticeable whatever light I throw at it. The poor native contrast of IPS panels really hurts here. The number of zones is just one factor; the others are native contrast and the dimming algorithm.


----------



## Cyber Locc

deadchip12 said:


> Cyber Locc said:
> 
> 
> 
> https://youtu.be/VJwub9mnhuw?t=168
> 
> It's going to bloom even worse tbh, maybe not as bright of a bloom, but a larger blooming area.
> 
> 
> 
> “Not as bright” is the keyword here. With some bias lighting, that bloom in the x900f may not even be noticeable (from users and pro reviews that I read). While on this monitor, it’s still very noticeable whatever light I throw at it. Poor native contrast of ips panels really hurts this. Number of zones is just one factor, the others being native contrast and dimming algorithm.

Okay man, you are going to think what you want I see. Good luck, I hope you enjoy whatever you decide. 

Going to tell you one more time, to hopefully help, though: that TV, and most TVs, are going to bloom much worse. I don't care what reviewers supposedly say; you're reading the wrong ones.

I own last year's model of that TV, my step brother owns that one, and I like to look at TVs at Best Buy lol.

I'm telling you: go to Best Buy or Walmart or something, switch the input, and open the white menu over a black screen. The blooming is worse.


"It's not even noticeable" it's not noticeable on this monitor either, except for very dark screens, you are looking at content you chose to show it, that is worse case scenario, that TV will be even worse in said worse case. 

Don't listen to me though; see for yourself. Here is another vid, or go look at them:
https://youtu.be/dlAM9IUbk30

Barely noticeable? Yeah, if you're blind.



ToTheSun! said:


> Cyber Locc said:
> 
> 
> 
> PPI is the only metric worth a dang [...] The issue with your big-display, big-distance analogy is that it assumes a PC user sits farther away as the display gets larger. They don't; people with 65" monitors are still 3 ft from the screen.
> 
> 
> 
> Well, someone has to give chiropractors a job.
> 
> 
> 
> Cyber Locc said:
> 
> 
> 
> in 3-4 years
> 
> 
> One of the displays with 1000+ zones was shown at CES this year.

True lol. 


Yeah, that's cool it was shown at CES; so was this monitor, and now it's out, 3 years later. Showing something at CES doesn't mean it's coming out soon, or even ever.

I would ask what the release date is, but this monitor got a release date as well, and it kept getting pushed.

Well, okay: announced in 2016, shown at CES in January 2017, released August 2018. I wouldn't hold my breath about seeing those 1000 zones any time soon.


----------



## deadchip12

Cyber Locc said:


> deadchip12 said:
> 
> 
> 
> 
> 
> Cyber Locc said:
> 
> 
> 
> https://youtu.be/VJwub9mnhuw?t=168
> 
> It's going to bloom even worse tbh, maybe not as bright of a bloom, but a larger blooming area.
> 
> 
> 
> “Not as bright” is the keyword here. With some bias lighting, that bloom in the x900f may not even be noticeable (from users and pro reviews that I read). While on this monitor, it’s still very noticeable whatever light I throw at it. Poor native contrast of ips panels really hurts this. Number of zones is just one factor, the others being native contrast and dimming algorithm.
> 
> 
> Okay man, you are going to think what you want I see. Good luck, I hope you enjoy whatever you decide.
> 
> Going to tell you 1 more time to hopefully help though. That TV and most TVs are going to bloom much worse, I don't care what reviewers supposedly say, your reading the wrong ones.
> 
> I own the last year's model of that TV, my step brother owns that one, and I like to look at TVs at Best Buy lol.
> 
> I'm telling you, go to Best Buy or Walmart or something, switch the input, open the white menu on a black screen. The Blooming is worse.
> 
> 
> "It's not even noticeable" it's not noticeable on this monitor either, except for very dark screens, you are looking at content you chose to show it, that is worse case scenario, that TV will be even worse in said worse case.

I asked Rtings, and attached is what they told me. Also, they gave the X27 (and presumably the PG27UQ as well, since they are similar) a very low score for contrast and the local dimming feature compared to the X900F. Since you cite Rtings's videos, I guess you consider them a trustworthy source.

The blooming on the X900F we see in the video is filmed in a pitch-dark room and at an angle, which is a weakness of VA tech. In a pitch-dark room, this monitor does not even completely turn the zones off, so we end up with a grey full screen compared to the pretty much black full screen on the X900F seen in the video, so the comparison is off. I'm just trying to find the right answer here, man, not see what I want to see.


----------



## Cyber Locc

deadchip12 said:


> I asked Rtings and attached is what they told me. Also, they gave the x27 (and presumably the pg27uq as well since they are similar) very low score for contrast and local dimming feature compared to x900f. Since you cite rtings’s videos, I guess you think they are a trustworthy source.
> 
> The blooming on the x900f we see in the video is filmed in a pitch dark room and at an angle which is a weakness of va tech. In pitch dark room, this monitor does not even completely turns the zones off, so we end up with a grey full screen compared to a pretty much black full screen on the x900f as seen in the video, so comparison is off. I’m just trying to look for the right answer here man, not think what I want to see.



So I have nothing for or against Rtings; I neither like them nor dislike them. I will, however, say that this right here,



that's their test video, and running it on my monitor, it looks absolutely nothing like their video of the X27. And watching it on the TVs in the house (a Sony 900E and a Samsung 7100), the TVs are worse: not brighter blooms, just huge blooms. The TVs are bigger, so that is an argument you could make, but the blooms are proportionally larger.

As to their video versus my reality: the mid-size circle in their video blooms across like 1/4 of the screen, right? Well, it's an inch-wide ring tops for me, if even that much, and the biggest circle has almost no blooming for me. That's why I keep saying go look at it in person; their video of the X27 is not even in the same realm of existence as the blooming my Asus has, not by a mile. If it looked like that, I would have sent it back immediately.

I also don't have my monitor set to eye-bleed brightness, though; I have it set to 40, but that's on the Asus (with the new Asus firmware with the brightness increase), so yours may differ (40 in SDR; HDR I leave at the reference 80).

Edit: watched that review video again; you don't have to go past the first 5 seconds. That huge bloom barely blooms for me, yet it blooms across their whole screen. Something was seriously wrong with their testing.


----------



## deadchip12

Cyber Locc said:


> deadchip12 said:
> 
> 
> 
> I asked Rtings and attached is what they told me. Also, they gave the x27 (and presumably the pg27uq as well since they are similar) very low score for contrast and local dimming feature compared to x900f. Since you cite rtings’s videos, I guess you think they are a trustworthy source.
> 
> The blooming on the x900f we see in the video is filmed in a pitch dark room and at an angle which is a weakness of va tech. In pitch dark room, this monitor does not even completely turns the zones off, so we end up with a grey full screen compared to a pretty much black full screen on the x900f as seen in the video, so comparison is off. I’m just trying to look for the right answer here man, not think what I want to see.
> 
> 
> 
> 
> So I have nothing for or against RTratings, I neither like them nor dislike them. I will however say that this right here,
> 
> 
> 
> thats their test video, and running it on my monitor, looks absolutely nothing like it does in their video of the X27, and watching it on the TVs in the house (a Sony 900e and a Samsung 7100) the TVs are worse, not brighter blooms, just huge blooms. The TVs are bigger, so that is an argument you could make, but the blooms are proportionally larger.
> 
> As to their video and my reality, well the mid size circle in their videos cover, is blooming like 1/4 the screen right? Ya well its an inch around circle tops for me, if even that much, the biggest circle has almost no blooming for me. Thats why I keep saying go look at it in person, their video of the x27, is not even on the same realm of existence as the blooming my Asus has, not by a mile, if it looked like that I would have sent it back immediately.
> 
> I also dont have my monitor set to eye bleed brightness either though, I have it set to 40, but thats the ASUS (new Asus firmware with the brightness increase), yours may differ (40 in SDR, HDR I leave at reference 80)
> 
> Edit: watched that review video again, dont have to go past the first 5 secs. that huge bloom barely blooms for me, it blooms their whole screen. Something was seriously wrong with their testing.

Sorry I forgot to attach the screenshot of what they told me regarding x27 vs x900f blooming. Pls see it below.

It’s hard to observe the blooming at the showroom because it’s too bright there. I will take your word that the blooming is worse on the TV, since you said you owned an X900E. However, I did ask another user who has the X900F, and from the pics he showed me, the blooming is much less on his TV. Wish I could get my hands on one to test.


----------



## MistaSparkul

If you're going to run test patterns like that, then of course you will see blooming, and we could go cherry-pick a whole bunch of HDR videos that show obvious blooming too. But as someone who has used this monitor for gaming for almost a year now, the blooming has, for the most part, been a non-issue for me. It's barely noticeable in SDR mode, where my peak white is only 120 nits, so I'm not blasting 1000 nits against black. In HDR games it has only been a problem in super dark games like Resident Evil 7 and the RE2 Remake, so I resort to my B7 OLED for those; they are slower games anyway, so the 60Hz is perfectly fine. I just started Far Cry New Dawn, and just like Far Cry 5 and AC:O, I have not had any problems with crazy blooming. Instead of going around running test patterns to expose the monitor's weak point, try actually using it for a bit and see if the blooming is a real problem for you in actual use-case scenarios.


----------



## deadchip12

MistaSparkul said:


> If you're going to do test patterns like that then of course you will see blooming, and we can go cherry pick a whole bunch of HDR videos that will show obvious blooming too. But as someone who has used this monitor to do gaming on for almost a year now, the blooming has for the most part, been a non issue for me. It's barely noticeable in SDR mode where my peak white is only 120 nits so I'm not blasting 1000 nits against black, and in HDR games it has only been a problem in super dark games like Resident Evil 7 and 2 Remake so I resort to my B7 OLED for those games as they are slower anyways so the 60Hz is perfectly fine for it. I just started Far Cry New Dawn and just like Far Cry 5 and AC:O I have not had any problems with crazy blooming. Instead of going around and doing test patterns to expose the monitor's weak point, just try actually using it for a bit and see if the blooming is a real problem for you on actual use case scenarios.


I notice blooming in real content quite often; that's why I post about it. Yes, only in dark scenes, but it bothers me enough to make me wonder whether a VA panel with 5x the contrast ratio and fewer zones, at 1/3 of the price, is better. Attached are 2 pictures from Daredevil S02 to demonstrate. I also play Horizon Zero Dawn. The bright scenes are amazing, far brighter and more vibrant than my OLED. But when night comes, once you hit a machine, the sparks that come out of it light up the whole screen due to blooming, and it looks horrible. Also, any HUD near the corners of the screen invites IPS glow/BLB, and all the details there are lost. Pretty annoying.


----------



## MistaSparkul

Ah I see. I don't watch any shows on my monitor so I guess my use case scenario is different as it's limited to gaming only while my B7 OLED handles TV/Movies.


----------



## Cyber Locc

deadchip12 said:


> MistaSparkul said:
> 
> 
> 
> If you're going to do test patterns like that then of course you will see blooming, and we can go cherry pick a whole bunch of HDR videos that will show obvious blooming too. But as someone who has used this monitor to do gaming on for almost a year now, the blooming has for the most part, been a non issue for me. It's barely noticeable in SDR mode where my peak white is only 120 nits so I'm not blasting 1000 nits against black, and in HDR games it has only been a problem in super dark games like Resident Evil 7 and 2 Remake so I resort to my B7 OLED for those games as they are slower anyways so the 60Hz is perfectly fine for it. I just started Far Cry New Dawn and just like Far Cry 5 and AC:O I have not had any problems with crazy blooming. Instead of going around and doing test patterns to expose the monitor's weak point, just try actually using it for a bit and see if the blooming is a real problem for you on actual use case scenarios.
> 
> 
> 
> I notice blooming in real contents quite often, that’s why I post about it. Yes only in dark scenes, but it bothers me enough that makes me wonder whether a VA panel with 5x contrast ratio, less zones but 1/3 of the price is better. Attached are 2 pictures of Daredevil S02 to demonstrate. I also play Horizon Zero Dawn. The bright scenes are amazing, far brighter and vibrant than my oled. But when night comes, one you hit the machine, the sparks that come out of it light up the whole screen due to blooming and it looks horrible. Also, any hud near the corners of the screen invite ips glow/blb and all the details there are lost. Pretty annoying.

So I don't know what the second pic is, but in the first one, the lamp is blooming, but it would do that in real life; there would be light around it.


----------



## AngryLobster

Has anyone spotted scan lines on these? A friend I sold my monitor to had me come over, and sure enough there is faint whitish cross-hatching, not uniform across the screen, visible in Sekiro. It appears in 3-4 inch sections of the screen.

It comes and goes depending on the scene, but it's only really visible when panning the camera against the sky, *sometimes*.

FYI this is @ 120hz/HDR.

Is what I'm describing the Gsync scanlines people always complain about?


----------



## animeowns

ggp759 said:


> I had the Acer X27 and had to send it back due to excessive blooming in HDR. Can any of the owners please play on their monitor this clip
> 
> 
> in HDR of course and post some pics if its not too much trouble. I want to try the Asus one after the Acer I dont know if its any different.
> 
> 
> I played the above video on my Acer with 80nits peak white, fald on and fald response to gaming ( i think on the Asus is fast).
> 
> For reference here are the pics of the Acer with that clip. And yes i know i adjusted the exposure of the camera to reflect what i actually see. There's no camera exaggeration. Thanks a lot guys. (Its around the 0:36 mark for the video)



Wow, that looks horrible. I don't have that issue on my PG27UQ.


----------



## ggp759

animeowns said:


> wow that looks horrible. I don't have that issue on my pg27uq


Can you post pictures from that video? I posted the link a few posts back.


----------



## hmcindie

Cyber Locc said:


> There are about 310 one-inch squares in a 27-inch monitor; a 65-inch TV has 1808, which means a 65-inch TV would need 2260 LEDs just to match the monitor from a blooming-radius standpoint, assuming both were IPS. You need to realize that, just like with pixels, the number of pixels you have is irrelevant; it's how many pixels per inch that matters. Same here: how many LEDs the screen has is irrelevant, it's how many LEDs it has per inch.


According to you, a 1-inch monitor with a two-zone backlight is better than a 65-inch TV with more zones, because the 1-inch monitor has smaller zones. That's not how it works.


----------



## animeowns

ggp759 said:


> Can you post pictures from that video? I posted the link a few posts back.


Update: I sold my PG27UQ. While it has nice picture quality and all, OLED has spoiled me, and going from a 55 inch to a 27 is a hard pill to swallow. 27 is too small for 4K; I hate that I have to use scaling in Windows just to read text. I'll wait on Best Buy to get the LG C9 series in; they have a warranty that covers burn-in, so I will buy my TV from them.


----------



## bmgjet

AngryLobster said:


> Has anyone spotted scan lines on these? Friend I sold my monitor to had me come over and sure enough there is faint whitish cross hatching that isn't uniform across the screen visible in Sekiro. It will appear 3-4 inch sections of the screen.
> 
> It comes and goes depending on the scene but it's only really visible when panning the camera against the sky *sometimes*
> 
> FYI this is @ 120hz/HDR.
> 
> Is what I'm describing the Gsync scanlines people always complain about?


HDR maxes out the DP connection at 98 Hz, so anything from there on is considered an overclock for the cord and connectors.
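The 98 Hz figure can be sanity-checked with a back-of-the-envelope bandwidth estimate. A rough sketch that ignores blanking overhead and assumes DisplayPort 1.4's HBR3 rate (32.4 Gbit/s raw, about 25.92 Gbit/s effective after 8b/10b line coding):

```python
def payload_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Uncompressed RGB/4:4:4 video payload in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_channel * channels / 1e9

DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10  # HBR3 raw rate minus 8b/10b coding overhead

# 4K 10-bit 4:4:4 at various refresh rates vs the DP 1.4 budget.
for hz in (98, 120, 144):
    need = payload_gbps(3840, 2160, hz)
    print(hz, round(need, 1), "fits" if need <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4")
```

Real links also need blanking overhead, so the practical ceiling sits right around 98 Hz; above that, the PG27UQ reportedly drops to 4:2:2 chroma subsampling in 10-bit HDR.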


----------



## mattxx88

I'm in! Tomorrow I'll have my hands on mine.


----------



## CallsignVega

animeowns said:


> Update I sold my pg27uq while it has nice picture quality and all oled has spoiled me and going from a 55 inch to a 27 is a hard pill to swallow 27 is too small for 4k I hate where I have to use scaling in windows just to read text I'll wait on best buy to get the LG C9 series in they have a warranty that covers burn in so I will buy my tv from them


Yup, how many more years do we have to wait for a 32" high refresh 4K display...


----------



## Lockjaw333

Does anyone who owns the monitor think the HDR is too bright for everyday use? I've seen varying impressions on whether or not it's simply too bright for a monitor sitting a few feet from you. I'm about to order one of these, and that is my main concern right now. I'm someone who calibrates my monitors with a colorimeter to 140 nits peak in a normally lit room.


----------



## Cyber Locc

hmcindie said:


> Cyber Locc said:
> 
> 
> 
> There are about 310 one-inch squares in a 27-inch monitor; a 65-inch TV has 1808, which means a 65-inch TV would need 2260 LEDs just to match the monitor from a blooming-radius standpoint, assuming both were IPS. You need to realize that, just like with pixels, the number of pixels you have is irrelevant; it's how many pixels per inch that matters. Same here: how many LEDs the screen has is irrelevant, it's how many LEDs it has per inch.
> 
> 
> 
> According to you a 1inch monitor with a two-zone backlighting is better than a 65inch tv with more zones as the 1inch tv has smaller zones. That's not how it works.

That is exactly how it works lol...

More LEDs, less area lit per LED, blacker blacks.

When the LED has to light a large area, the blacks in that large area are brightened. 

I don't understand how you guys don't get this, it's honestly common sense. 


This is why OLED is wanted: each pixel is an LED. This means the lighting can be handled on a per-pixel basis, which is the best-case scenario.

However, the closest thing would be a very high number of zones paired with a non-bleeding LCD type such as VA. IPS glows, so the light carries farther than on a VA, which is why the blooming on our 27UQs is bigger than the 1 inch the LED lights; that's the IPS screen glow spreading the light a bit more.

However, even then the difference is still vast. Say one light is designed to light a 1-inch area, and another light is designed to light a 6-inch area.

Now let's say in the center of each area is a small, bright white circle.

In the case of the 1-inch LED area, we will see brightened blacks in a 1-inch area around the circle. In the case of the 6-inch area, we will see brightened blacks in a 6-inch area around the circle.

In what world is the latter superior?

OLED wins due to the sheer number of LEDs for the area; that's what is important. In the case of OLED, the number of LEDs versus the size of the screen doesn't matter, as each pixel is lit. In an LCD, however, the number of LEDs relative to the screen size is the most important metric as far as HDR is concerned.


----------



## Glerox

Lockjaw333 said:


> Does anyone who owns the monitor think the HDR is too bright for everyday use? I've seen varying impressions on whether or not its just simply too bright for a monitor sitting a few feet from you. About to order one of these and that is my main concern right now...I'm someone who calibrates my monitors with a colorimeter to 140 nits peak in a normally lit room.


I'm not sure I understand your question; you can adjust the brightness in the Windows settings when in HDR mode.


----------



## Lockjaw333

Glerox said:


> Im not sure I understand you question, you can adjust the brightness in Windows settings when in HDR mode.


I mean playing games in HDR with 1000 nit peak brightness. Does it ever get uncomfortable for you?

For example, I play a lot of BF V. I'm imagining playing a map like Narvik with the white snow glaring into my eyeballs.


----------



## animeowns

CallsignVega said:


> Yup, how many more years do we have to wait for a 32" high refresh 4K display...


Acer just announced the Predator CG437K P: 43-inch 4K 144Hz, HDR, 1000 nits. It's a VA panel, and I have never used VA, but the size is big enough that I could consider it.


----------



## deadchip12

animeowns said:


> CallsignVega said:
> 
> 
> 
> Yup, how many more years do we have to wait for a 32" high refresh 4K display...
> 
> 
> 
> acer just announced Predator CG437K P 43 inch 4k 144hz hdr 1000 nits but its a VA panel I have never used VA but the size is big enough to where I could consider it.

VA is supposedly better at HDR (higher native contrast, less blooming), but looking at the price, only $1200, it is probably just edge-lit, no FALD. And I'm not sure about the viewing-angle issue when viewing up close.


----------



## deadchip12

Lockjaw333 said:


> Glerox said:
> 
> 
> 
> Im not sure I understand you question, you can adjust the brightness in Windows settings when in HDR mode.
> 
> 
> 
> I mean playing games in HDR with 1000 nit peak brightness. Does it ever get uncomfortable for you?
> 
> For example I play a lot of BF V. I'm imagining playing on a map like Narvik with the white snow blaring my eyeballs.

1000 nits is amazing. But that’s just me.


----------



## Cyber Locc

animeowns said:


> acer just announced Predator CG437K P 43 inch 4k 144hz hdr 1000 nits but its a VA panel I have never used VA but the size is big enough to where I could consider it.



Deadchip already told ya, but I figured I would add that it is edge-lit. Acer has been adding a few non-FALD models to their 4K 144Hz lineup, which is good for people on more of a budget, but it's vastly inferior to the 27s with FALD.




Lockjaw333 said:


> I mean playing games in HDR with 1000 nit peak brightness. Does it ever get uncomfortable for you?
> 
> For example I play a lot of BF V. I'm imagining playing on a map like Narvik with the white snow blaring my eyeballs.


It can at times be bothersome; after a long day, in a dark room, it's rather bright. 

You can adjust the HDR brightness, which is the answer I think you were mostly looking for. I'm not sure about in Windows, but the monitor's menu has a setting for HDR max brightness; not only can you decrease it, you can also increase it.


----------



## animeowns

Cyber Locc said:


> Deadchip already told ya, but figured I would add that is edge lit. Acer has been adding a few non Fald models to their lineup of 4k144hz, which is good for people on more of a budget, but its vastly inferior to the 27s with fald.
> 
> 
> 
> 
> It can at times be bothersome, after a long day in a dark room its rather bright.
> 
> You can adjust the HDR brightness, is the answer I think you were mostly wanting however. Im not sure about in windows, but the monitors menu has a setting for HDR max Brightness, not only can you decrease it, you can also increase it.


That's not a problem. I just want a good-quality screen that is big enough for gaming at 4K and 120Hz, and at 27 inches it's too small to see 4K in all its glory. I might just grab that monitor and an LG C9 OLED TV. Even 32 inches at 4K is too small, but I don't like the way Acer is releasing a 4K 120Hz 32-inch with only HDR400 instead of 1000.


----------



## deadchip12

Ok, I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture looks a bit washed out? Like black areas are grey. I remember being in a cave, and the whole screen was grey instead of black. Also, in HDR mode there's a graphics glitch where sparks appear around characters. Not an issue in SDR.


----------



## acmilangr

deadchip12 said:


> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.


Yes. HDR on this game is crap.


----------



## MistaSparkul

acmilangr said:


> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.

Don't use MSI Afterburner or RTSS, as they mess with the game's HDR. Without them the game's HDR looks just fine.


----------



## acmilangr

MistaSparkul said:


> acmilangr said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.
> 
> 
> Don't use msi afterburner or RTSS as it messes with the game's HDR. Without it the game's HDR looks just fine.

No it doesn't. Just try it with HDR off and you will see more vibrant colors; red is red and not orange.


----------



## MistaSparkul

acmilangr said:


> MistaSparkul said:
> 
> 
> 
> 
> 
> acmilangr said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.
> 
> 
> Don't use msi afterburner or RTSS as it messes with the game's HDR. Without it the game's HDR looks just fine.
> 
> 
> No it doesn't. Just try it with HDR off and you will see more vibrant colors. Red is red and not orange

I did, lol. I'm not sure if it's an issue specific to the PG27UQ then, but I played the whole game on my B7 OLED and HDR worked fine.


----------



## deadchip12

MistaSparkul said:


> acmilangr said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.
> 
> 
> Don't use msi afterburner or RTSS as it messes with the game's HDR. Without it the game's HDR looks just fine.

I closed rtss but it didn’t make a difference. The game still looks very good with vibrant colors but I notice the black areas are not truly black, especially at night. And that weird spark glitch as well. Your screenshots are from Odyssey though, yes? I’m talking about Origins, but I heard another user had the same issue with the sparks glitch in Odyssey as well: https://steamcommunity.com/app/812140/discussions/0/2656452469272261292/


----------



## MistaSparkul

deadchip12 said:


> MistaSparkul said:
> 
> 
> 
> 
> 
> acmilangr said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.
> 
> 
> Don't use msi afterburner or RTSS as it messes with the game's HDR. Without it the game's HDR looks just fine.
> 
> Click to expand...
> 
> I closed rtss but it didn’t make a difference. The game still looks very good with vibrant colors but I notice the black areas are not truly black, especially at night. And that weird spark glitch as well. Your screenshots are from Odyssey though, yes? I’m talking about Origins, but I heard another user had the same issue with the sparks glitch in Odyssey as well: https://steamcommunity.com/app/812140/discussions/0/2656452469272261292/
Click to expand...

Ah, my bad, I mixed up Odyssey and Origins. I never played Origins, so it's very possible that it just has terrible HDR.


----------



## deadchip12

MistaSparkul said:


> deadchip12 said:
> 
> 
> 
> 
> 
> MistaSparkul said:
> 
> 
> 
> 
> 
> acmilangr said:
> 
> 
> 
> 
> 
> deadchip12 said:
> 
> 
> 
> Ok I just tried Assassin's Creed Origins on this monitor. Does anyone feel the game's picture feels a bit washed out? Like black areas are grey. I remember being in a cave and the whole screen is grey instead of black. Also, in hdr mode, there's this graphics glitch where sparks appear around characters. Not an issue in sdr.
> 
> 
> 
> Yes. HDR on this game is crap.
> 
> Click to expand...
> 
> Don't use msi afterburner or RTSS as it messes with the game's HDR. Without it the game's HDR looks just fine.
> 
> Click to expand...
> 
> I closed rtss but it didn’t make a difference. The game still looks very good with vibrant colors but I notice the black areas are not truly black, especially at night. And that weird spark glitch as well. Your screenshots are from Odyssey though, yes? I’m talking about Origins, but I heard another user had the same issue with the sparks glitch in Odyssey as well: https://steamcommunity.com/app/812140/discussions/0/2656452469272261292/
> 
> Click to expand...
> 
> Ah my bad I mixed up Odyssey and Origins. I never played Origins so it's very possible that it just has a terrible HDR.
Click to expand...

Tbh HDR still looks fine. The sun and the fire look very bright and stand out. It’s just that the black level seems lacking, so the contrast seems kinda off. In Odyssey, do you notice any scene that is supposed to be completely black but turns out grey?

And the sparkle glitch: if you look closely at a character’s body during night time, you may see some sparkling lines. This seems to be an HDR problem in Odyssey as well, as reported above.


----------



## MistaSparkul

I think it could be because it's a FALD IPS panel; it might have some trouble reaching low enough black depths and instead just looks grey, the way an IPS typically does. I played Odyssey on my OLED, which has no problems with black levels/dimming zones, so blacks appeared nice and inky. I could try firing up Odyssey on my X27 and see how that looks compared to the OLED.


----------



## deadchip12

MistaSparkul said:


> I think it could be the fact that because it's a FALD IPS panel, it might have some trouble reaching low enough black depths, and instead just looks grey like an IPS typically ends up looking like. I played Odyssey on my OLED which has no problems with black levels/dimming zones so blacks appeared nice and inky. I could try firing up Odyssey on my X27 and see how that looks compared to the OLED.


I don’t think this is an IPS vs OLED problem. Other HDR games I tried on the PG27UQ, like Horizon Zero Dawn, have way better black and the contrast appears very strong. I tried Origins on my OLED C7 just now and the overall black looks inkier due to OLED’s higher contrast, but still it was not as deep as I expected. Possibly it’s just how the game is.

If you check Odyssey again, please check for the spark glitch I mentioned. Look closely at the character during night time.


----------



## MistaSparkul

I went and re-tested on the X27 but couldn't find anything out of the ordinary.


----------



## deadchip12

MistaSparkul said:


> I went and re tested on the X27 but couldn't find anything out of the ordinary.


Is it night time? This sparkle issue only appears when it’s night: https://i.imgur.com/1u4e5Tn.mp4


----------



## AngryLobster

Most games I've played in HDR seem to artistically choose grey blacks. It's not a display thing, because it's visible even on OLED.

DMC5, RE2 Remake, Sekiro, etc. All these games produce greyish blacks.

The only game that got decently dark was Far Cry 5.


----------



## MistaSparkul

deadchip12 said:


> MistaSparkul said:
> 
> 
> 
> I went and re tested on the X27 but couldn't find anything out of the ordinary.
> 
> 
> 
> Is it night time? This sparkle issue only appears when it’s night: https://i.imgur.com/1u4e5Tn.mp4
Click to expand...

It's night time. Camera is just making it appear much brighter.


----------



## mattxx88

This weekend I took some time to test mine: bleed practically nonexistent, no dead pixels, I am very satisfied with it. I come from a 1440p VA panel (LG 32GK850G) and the differences (for the better) are evident.
HDR both in games and videos is awesome, as is the resolution increase (from 1440p).
I also found the default calibration fine; I tried the TFTCentral settings but the panel looks too dark to me.

Also, in HDR with luminance set to 80 I found it too bright, so I set it to 40, the sweet spot for me.

I keep HDR off in Windows (it auto-enables when launching a supported game) and brought the slider up to 70 when on.

Last thing: I couldn't get FF XV working with HDR, but now I see it requires HDR ON in Windows before starting the game.

The Division 2, Anthem and BF5 look awesome with HDR, I'm finally in love with it


----------



## tinykitten

I received this email today; figured I'd post it in case anyone still cares. I doubt anything will happen in the end, but it is what it is for now.



> Dear ASUS Customer,
> 
> There should be an update in the next three to four weeks regarding the update tool.
> Kind regards,
> Piet Gloede
> Asus Customer Service
> Asus Technical Support Site: http://support.asus.com


----------



## deadchip12

MistaSparkul said:


> deadchip12 said:
> 
> 
> 
> 
> 
> MistaSparkul said:
> 
> 
> 
> I went and re tested on the X27 but couldn't find anything out of the ordinary.
> 
> 
> 
> Is it night time? This sparkle issue only appears when it’s night: https://i.imgur.com/1u4e5Tn.mp4
> 
> Click to expand...
> 
> It's night time. Camera is just making it appear much brighter.
Click to expand...

Can confirm black is grey in Origins. Saw it on my OLED as well. This happens in HDR mode only. Brightness is at the default 50%; if I lower it, black becomes darker, but the whole picture is dim overall and the HDR image is messed up. The fact that this game still has a brightness slider in HDR mode signals something wrong already. Not sure about Odyssey.


----------



## deadchip12

mattxx88 said:


> this weekend i took some time to test mine, bleed quite inexistent, no dead pixels, i am very satisfied with it. I come from a 1440p VA panel (lg 32GK850G) and differences (in better) are evident
> HDR both on games and videos is awesome as well as the res increase (from 1440p)
> i also found the default calibratin fine, i tried tftcentral settings but i see the panel too dark
> 
> also in hdr with luminance setteted at 80 i found it too bright, and setted to 40, sweet spot for me
> 
> i keep hdr off from windows (it auto turns on when launching an supported game) and brought the slider up to 70 when on
> 
> last thing, i cannot get FF XV working with HDR but now i see it requires hdr ON from windows before starting the game
> 
> the division 2, Anthem and BF5 looks awesome with HDR, i'm finally in love with it


How was your VA panel? Is the black much better than on this IPS one? I kinda regret purchasing this monitor at such a high price instead of a VA TV, because the blooming on IPS is too much.

I’m surprised you set reference white to 40. On my monitor, reference white at 90 gives the highest max luminance (~1,200 nits). Lower it and the monitor will not reach its full HDR punchiness potential. This monitor really shines in bright scenes; it beats the OLED with ease. But in dark scenes with tiny highlights it looks like ass due to blooming.
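Side note on why the reference-white setting shifts peak output like this: HDR10 encodes absolute luminance via the SMPTE ST 2084 (PQ) curve, so each signal level is meant to map to a fixed nit value, and the display setting rescales that mapping. A minimal sketch of the inverse EOTF (nits to signal level), using the constants straight from ST 2084 (the function name `pq_encode` is just mine):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal level 0..1.
# Constants are taken directly from the ST 2084 specification.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10000 nits) to a PQ signal value in [0, 1]."""
    y = nits / 10000.0          # normalize to the 10,000-nit PQ ceiling
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# 1000 nits lands around signal level 0.75; this monitor's ~1200-nit peak is
# only slightly higher, since most of the PQ range is reserved for highlights.
for nits in (100, 1000, 1200):
    print(nits, "nits ->", round(pq_encode(nits), 3))
```

The takeaway is that in a PQ pipeline the signal already says how bright things should be, which is why a reference-white control that changes peak luminance behaves more like a global gain than a calibration knob.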


----------



## mattxx88

deadchip12 said:


> How was your VA panel? Is the black much better than this IPS one? I kinda regret purchasing this monitor for such high price instead of a VA tv because the blooming on ips is too much.
> 
> I’m surprised you set reference white to 40. On my monitor, reference white at 90 gives the highest max luminance (~1200 nits). Lowering it and the monitor will not reach its full hdr punchiness potential. This monitor really shines in bright scenes. It beats the oled with ease. But in dark scenes with tiny highlights it looks like ass due to blooming.


That LG was awesome (till I tried this Asus, anyway). The black is fine on VA, but not MUCH better than on this Asus.
You don't have to regret your purchase, trust me. The only (light) blooming I see is when watching videos; in-game I don't notice issues.

I set it to 40 because of something I noticed watching videos on YouTube. Try this one for example:

at 0:46, with luminance above 40/50 you cannot distinguish the white panels of the DJ's console.
There are other videos to compare with; I remember there was a snake in one video, and with high luminance you couldn't see the scales of its skin, while lowering it they became visible.

I'm using it @92Hz 4:4:4 10-bit; tonight maybe I'll try whether there's any difference with sRGB.


----------



## kot0005

I think my fan is busted; it sounds like a tractor lol. Even at idle it's literally annoying now. I have been playing a lot of Division 2 in HDR... maybe that killed it.

I don't even know why it's spinning like that; the monitor is really cool to the touch and it's winter here.


----------



## deadchip12

mattxx88 said:


> deadchip12 said:
> 
> 
> 
> How was your VA panel? Is the black much better than this IPS one? I kinda regret purchasing this monitor for such high price instead of a VA tv because the blooming on ips is too much.
> 
> I’m surprised you set reference white to 40. On my monitor, reference white at 90 gives the highest max luminance (~1200 nits). Lowering it and the monitor will not reach its full hdr punchiness potential. This monitor really shines in bright scenes. It beats the oled with ease. But in dark scenes with tiny highlights it looks like ass due to blooming.
> 
> 
> 
> That LG was awesome (till i tired this asus anyway) the black is fine on VA but not MUCH better than this Asus
> you have not to regret your purchase, trust me. the only (light) blooming i have is watching videos, ingame i don't notice issues
> 
> i setted it to 40 cause i noticed watching videos on youtube try this one for eg:
> 
> 
> 
> at 0:46 with hig luminance above 40/50 you cannot distinguish the white panles of the dj's console
> there's other videos to make compares, i remember there waere a snake on a video and with high luminance you cannot see the scales of the skin, lowering it they came visible
> 
> i'm using it @92hz 4:4:4 10bit, tonight maybe i'll try if there's any difference with srgb
Click to expand...

Make sure you watch that YouTube video in Microsoft Edge instead of Chrome. Chrome will make HDR highlights blown out; at least that’s the case for me. I used Chrome to fine-tune reference white and I also ended up at 40-50, but if I use Edge it needs to be at 80-90.

As for the blooming, I notice it everywhere there are tiny bright spots on a black background. Maybe because your reference white is lower, HDR content is dimmer, so the blooming is less severe.


----------



## deadchip12

kot0005 said:


> I think my fan is busted, It sounds like a tractor lol. Even when Idle its literally annoying now. I have been playing a lot of division 2 in HDR..may be it killed it.
> 
> I dont even know why its spinning like that, the monitor is really cool to touch and its winter here.


I literally hear nothing coming out of my monitor, even when I put my ear near it after playing HDR games for hours. Not sure why, after hearing so many complaints about the fan noise. My PS4 Pro next to the monitor sounds like a tractor though lol.


----------



## mattxx88

deadchip12 said:


> Make sure you watch that youtube video on microsoft edge instead of chrome. Chrome will make hdr highlights blown out. At least that’s the case for me. I used Chrome to fine tune reference white and I also end up with 40-50, but if I use Edge it needs to be at 80-90.
> 
> As for the blooming, I notice it everwhere whenever a tiny bright spots on a black background. Maybe because your reference white is lower so hdr contents are dimmer so blooming is less severe


You're right, man! With Edge I got no problems and raised luminance to 80 without issues; 90 is fine too.
Now I'll download the demos from 4kmedia.org and try them with CnX Media Player in Windows.

I also see light blooming on black backgrounds (e.g. the red balls bouncing in the above video), but it's not that annoying for me.

I also lowered the SDR slider in the Windows HD Color settings.

What do you think about the colour? Is it better to keep RGB or 4:4:4?


----------



## deadchip12

mattxx88 said:


> deadchip12 said:
> 
> 
> 
> Make sure you watch that youtube video on microsoft edge instead of chrome. Chrome will make hdr highlights blown out. At least that’s the case for me. I used Chrome to fine tune reference white and I also end up with 40-50, but if I use Edge it needs to be at 80-90.
> 
> As for the blooming, I notice it everwhere whenever a tiny bright spots on a black background. Maybe because your reference white is lower so hdr contents are dimmer so blooming is less severe
> 
> 
> 
> you're right man! with edge got no problems and raised luminance to 80 without issues, also 90 is fine
> now i download that demos from 4kmedia.org and try them with CnX mediaplayer in windows
> 
> i also see a light bloom with black backgrounds (eg the red balls bouncing in above video) but it's not that annoying for me
> 
> i also lowered the SDR bar from windows hd colour manager
> 
> what do you think about the colour? it's better to keep RGB or 4:4:4?
Click to expand...

I think they are the same. I simply don't touch the chroma settings, so it's always RGB.


----------



## deadchip12

mattxx88 said:


> deadchip12 said:
> 
> 
> 
> Make sure you watch that youtube video on microsoft edge instead of chrome. Chrome will make hdr highlights blown out. At least that’s the case for me. I used Chrome to fine tune reference white and I also end up with 40-50, but if I use Edge it needs to be at 80-90.
> 
> As for the blooming, I notice it everwhere whenever a tiny bright spots on a black background. Maybe because your reference white is lower so hdr contents are dimmer so blooming is less severe
> 
> 
> 
> you're right man! with edge got no problems and raised luminance to 80 without issues, also 90 is fine
> now i download that demos from 4kmedia.org and try them with CnX mediaplayer in windows
> 
> i also see a light bloom with black backgrounds (eg the red balls bouncing in above video) but it's not that annoying for me
> 
> i also lowered the SDR bar from windows hd colour manager
> 
> what do you think about the colour? it's better to keep RGB or 4:4:4?
Click to expand...

Oh and try the video below. The blooming will drive you crazy:

https://youtu.be/E3Bf3mq1Or8


----------



## saltedham

tinykitten said:


> I received this email today, figured I'd post this if anyone still cares. I doubt anything is happening in the end but it is what it is for now.


I'm still interested.


----------



## mattxx88

Do you think this monitor will be capable of 4K 144Hz HDR without chroma subsampling (maybe with a firmware update?) once GPUs can handle it?
Or is DP 1.4 physically limited and cannot be upgraded?


----------



## toncij

mattxx88 said:


> do you think this monitor will be capable of 4K 144Hz HDR without chroma sub-sampling (maybe with a firmware update?) when gpus can handle it?
> or DP 1.4 is physically limited and cannot be upgraded?


The display is locked to DP 1.4 and doesn't use compression (DSC), so no, it will never do it. This is a stopgap solution for now. Maybe in 2022+ we'll get HDMI 2.1 displays or a better DP. So far only TVs like the LG C9 OLED have HDMI 2.1, which is capable of what you ask for.
Keep in mind Asus monitors have a latency from announcement to actual release of about 24+ months, so don't hold your breath.


----------



## toncij

deadchip12 said:


> I literally hear nothing coming out of my monitor, even when I put my ear near it after playing hdr games for hrs. Not sure why after hearing so much complains about the fan’s noise. My ps4 pro next to the monitor sounds like a tractor though lol.


Anyone opened the monitor to see if the fan could be torn apart and replaced with something better?


----------



## axiumone

toncij said:


> Anyone opened the monitor to see if the fan could be torn apart and replaced with something better?


I've opened mine up before. I didn't remove the cover to see if the fans are serviceable, unfortunately. Also, there are actually two fans inside.


----------



## stefxyz

So today was the day I was fed up enough with fan noise to take things into my own hand and disassemble the thing:


----------



## stefxyz

And as one would expect, it has a cheap-as... cooling solution. Basically there are 2 radial fans blowing air top-down through the back of the monitor, through some aluminium heatsinks over the VRMs and CPU, and out the bottom. Super cheap stuff, basically.

So yes, there is cardboard in the bottom...


----------



## stefxyz

So what's the first thing I think when I figure out there is a PC in my monitor? Of course: how do I watercool this thing? To be continued...


----------



## Barefooter

You would think that for the price of this monitor they could spend a few extra bucks and put a quality fan inside


----------



## skingun

Looking forward to your results @stefxyz


----------



## Cyber Locc

Barefooter said:


> You would think for the price of this monitor that they could spend a few extra bucks and put a quality fan inside


I have not disassembled mine lol, but I can tell you that on mine (and all the new revisions?) the fan is silent.

I can barely hear it, even with my ear up against the back.

Also, sadly I have to sell mine 😞. I downsized from a desktop to a laptop, and I can't run this kind of horsepower anymore 😞.

So if anyone is looking for one of these bad boys: https://www.overclock.net/#/topics/1724986?page=1

It is the new revision, so the black-crush firmware fix is there, the fan is silent, BLB is minimal, and the blooming isn't too bad. I can take any pics you would like. PM me for more 🙂.


----------



## skingun

I have a new revision monitor and I can hear the fan but I am particularly sensitive to noise and the ambient dB level in my study is very low.


----------



## stefxyz

If it's the cooling solution I'm showing in these pics, then no matter what revision, there is no way you can't hear it in a quiet room. It's just that some people use stock case fans or run RPMs above 1000, and then of course that overshadows it. If you have a big watercooled setup, the fans of this monitor are the loudest part of your rig, except maybe some coil whine.


----------



## toncij

stefxyz said:


> If its the cooling solution I am showing on these pics no matter what rev there is no way you cant hear it in a quite room. Just some people use stock fans of casesor rpms of above 1000 then of course it overshadows it. If you have a huge watercooled setup the fans of this mon are the loudest part of you rig except some coil whine eventually.


I had asked a few times whether anyone had opened it to check if it could be waterblocked, so thank you; it seems a GPU block would fit nicely.


----------



## Exilon

The PG27UQ's internals look like they took some parts out of a laptop line, with the dual centrifugal fans.

Do both always run? Unlike on the X27, I can't see how the blower FPGA cooler would help cool the backlight array.


----------



## acmilangr

I would like to inform you that if the fans run constantly and are too loud, it means you have dust inside the cooling system.
Just blow compressed air into the vents and it will be fine.

I did that and it fixed my problem.


----------



## Martin778

I don't see a problem with Sunon fans tbh, that's like the de facto industry standard in notebooks.


----------



## acmilangr

Sunon make some of the best fans in the world. Stop complaining about their quality.


----------



## saltedham

acmilangr said:


> Sunnon are on of the best fans in the world. Stop complaining about their quallity


wish the ones in my monitor were quiet.


----------



## kot0005

Mini led version announced.


----------



## kot0005

https://www.nvidia.com/en-us/geforc...timate-mini-led-hdr-4k-144hz-gaming-monitors/


----------



## deadchip12

kot0005 said:


> https://www.nvidia.com/en-us/geforc...timate-mini-led-hdr-4k-144hz-gaming-monitors/


Expect >1 year of delay, with a price tag of at least $3,000.

Seriously, 50% more dimming zones but still an IPS panel, so blooming will still be horrendous.


----------



## ToTheSun!

deadchip12 said:


> Expect >1 year of delay with the price tag of ay least $3000.
> 
> Seriously. 50% more dimming zones but still ips panel. So blooming will still be horrendous.


50% less horrendous.


----------



## deadchip12

ToTheSun! said:


> deadchip12 said:
> 
> 
> 
> Expect >1 year of delay with the price tag of ay least $3000.
> 
> Seriously. 50% more dimming zones but still ips panel. So blooming will still be horrendous.
> 
> 
> 
> 50% less horrendous.
Click to expand...

33.3% to be more precise, but it will still look pretty bad, unless Asus can come up with a more advanced FALD algorithm like what Sony did with their FALD TVs. But I'm pretty sure that's not their priority.
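The zone arithmetic above checks out: going from 384 to 576 zones is 50% more zones, and (assuming the zones tile the same panel area evenly) each zone then covers a third less area. A quick sketch:

```python
# Zone-count arithmetic for the FALD backlight upgrade (384 -> 576 zones).
old_zones, new_zones = 384, 576

more_zones = (new_zones - old_zones) / old_zones   # 0.5 -> 50% more zones
area_shrink = 1 - old_zones / new_zones            # each zone covers 1/3 less area

print(f"{more_zones:.0%} more zones; each zone is {area_shrink:.1%} smaller")
# With total panel area fixed, halo size scales with per-zone area, so a tiny
# highlight lights up a third less backlight: better, but not blooming-free.
```

So "50% more zones" and "33.3% smaller halos" are the same change viewed from the two directions.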


----------



## CaliLife17

Will be interesting to see if they just replace the current PG27UQ in their lineup with this one, or if they reposition the PG27UQ as a lower-priced model and the new PG27UQX as the high-end model.

I think you will see this come out sooner after announcement than the PG27UQ did; they ran into a lot of issues with the FPGA, which caused delays for the PG27UQ. They are using the same FPGA for this one, so less tuning should be needed.

I haven't been too bothered by blooming on my current PG27UQ, but any improvements are always welcome. I did have to go through 2x Acer X27s and 3x PG27UQs until I found one without any dead/stuck pixels. Sadly I feel QC will probably be the same on these as well.

Even so, I will probably pick this up later this year when I do my upgrade to a Ryzen 3900X.


----------



## bobsled

I could potentially see myself upgrading to the updated Mini LED unit if prices are at parity with existing PG27UQ listings, as a $3,500 AU price just can't be justified from where I'm standing. The long-awaited 35" PG35VQ could also be a tempting upgrade from my U3011 (2560x1600 60Hz).



> ASUS are also readying their ROG Swift PG35VQ for launch - this 3440x1440 21:9 35” G-SYNC ULTIMATE 1000 nit HDR gaming monitor takes the tech of their ROG Swift PG27UQ into a new, wider form factor, adds a 1800R curve, minimizes three of the bezels to nigh-unnoticable levels, and enables users to crank up the refresh rate to 200Hz.


----------



## aweir

Sempre said:


> Finally. Wish it was 32" though.


But it would *HAVE TO* cost 10 times as much.


----------



## CallsignVega

Lame. Looks like they cheaped out and are reusing the same 27" panel as in the PG27UQ/X27, just with a more-zone backlight.


----------






## CaliLife17

CallsignVega said:


> Lame. Look's like they cheapo'd out and are reusing the same 27" panel as in the PG27UQ/X27. Just with a more-zone back-light.


Not only that, it doesn't seem like they are doing anything to increase the nit output for specular highlights either. I would have thought that with more zones and smaller LEDs you could have increased the brightness output.

There also doesn't seem to be any color gamut improvement. The only changes seem to be the fan speed option in the OSD and more dimming zones.


----------



## Morkai

CaliLife17 said:


> Not only that, it doesn't seem like they are doing anything to the brightness to increase the nits output for specular highlights as well either. I would of thought with the more zones and smaller LEDs you could of increased the nit brightness output.
> 
> Also doesn't seem to be any color gamut improvement. The only changes it seems is just the fan speed option in OSD, and more dimmable zones.


Do you even want more brightness? I feel the PG27UQ is already a bit too bright, and some highlights make me squint, using it in a dim/dark room
(maybe if used in daylight?). Also, no current HDR certification requires more, so there's probably no real incentive.

As far as color gamut goes, it's a gaming monitor that comes factory-calibrated and passes HDR1000 and 99% Adobe RGB, which qualifies it as a professional monitor in most regards.
They have zero incentive to improve on that near-perfection unless there's a higher HDR certification they want to aim for in the future.

I too feel it seems like a pretty pointless upgrade for owners of the previous models. It could have just been called revision 03.


----------



## kot0005

So this new monitor has 2,300 LEDs packed into 576 zones, instead of 384 LEDs in 384 zones. I wonder if they can control all 2,300 LEDs individually or just the 576 zones.


----------



## CaliLife17

Morkai said:


> Do you even want more brightness? I feel that the pg27uq is already a bit too bright and some highlights make me squint, using in a dim/dark room.
> (maybe if used in daylight?), also, no current HDR certification requires more so probably no real incentive?
> 
> As far as color gamut goes, I mean it's a gaming monitor that comes factory-calibrated and passes HDR1000 and 99% adobe rgb which qualifies it as a professional monitor in most regards.
> They have zero incentive to improve that near-perfection unless there's a higher HDR certification they want to aim for in the future.
> 
> I too feel seems like a pretty pointless upgrade for owners of the previous models. Could've just been called revision 03.


I will always take more brightness for specular highlights. The difference is very noticeable to me between my 65C9 OLED, which probably peaks around 850 nits, and a Q9F or Vizio PQX (family members have them), which can get over 2,000 nits. But I don't think everyone would care to have more brightness, and I am not sure how much more cooling it would need with the added light output.

In regards to color, I was just hoping we would see some improvement there if they are going to do a whole new model line. It was at 76% for BT.2020 and 92% for DCI-P3, which just gets it past the >90% DCI-P3 requirement for UHD Premium. sRGB coverage doesn't really mean much since so many panels can hit 99%, but with HDR and WCG becoming more and more prevalent, I would like to see improvements in that area.

As you said, I feel like it's a very, very minimal upgrade compared to the last one. It's the same panel, same FPGA, and same 8-bit + FRC = 10-bit, so I know I shouldn't expect much since it's pretty much the same monitor. It really only seems to be the fan profile and more FALD zones/LEDs. Having said all that, I am still interested in picking it up; I am just not fooling myself that this is a "BRAND NEW NEXT LEVEL UPGRADE" compared to the current PG27UQ/X27 models.


----------



## sitti

I just got my PG27UQ only 5 days ago. I'm actually shocked to see the PG27UQX announced. If you were me, would you return it?

So far I haven't spotted any problems with it. The only thing I don't like is its screen size; it feels too small for 4K. Today is the 5th day with my new monitor, and I still can't get myself to like its 27" size. I would much prefer 32", but currently there isn't one with similar HDR capability, at least until the ProArt PA32UCX is released.


----------



## bobsled

One other question: what HDMI revision are they putting in this new revision? It could address the insufficient bandwidth of HDMI 2.0 at high refresh rates/bit depths.


----------



## sitti

bobsled said:


> One other question; what HDMI revision are they throwing in this new revision? It could address the insufficient bandwidth issue of 2.0 at high refresh/depth.


Still the same HDMI 2.0, not 2.1


----------



## Ferreal

sitti said:


> I have just got my PG27UQ only 5 days ago. I'm actually shocked to see PG27UQX was announced. If you were me, would you return it?
> 
> So far I haven't spotted any problem with it. Only thing I don't like is it's screen size. Feel too small for 4K. Today is 5th days with my new monitor, and I still can't get myself to like its 27" size. I would much prefer 32" but currently there isn't one with similar HDR capability, at least until ProArt PA32UCX is released.


I'd return it. My x27 will be on ebay when this comes out.


----------



## SmoothD

skingun said:


> Looking forward to your results @stefxyz


Same here, waiting for feedback on how the solution went


----------



## kot0005

This Apple monitor looks insane... I wonder if it's a mini-LED array and how many zones it has. No word on adaptive sync.

https://www.apple.com/pro-display-xdr/


----------



## Morkai

kot0005 said:


> This Apple monitor looks insane...I wonder if its mini led array and how many they have. No word on adaptive sync.
> 
> https://www.apple.com/pro-display-xdr/


Most likely Apple, as usual, aimed for max resolution at 60Hz, so it probably can't compare to the PG27UQ/X27/similar as a gaming monitor, just like the Sony OLED pro monitors that cost piles of money and are probably great for video/photo editing but not gaming.


----------



## Fanu

kot0005 said:


> This Apple monitor looks insane...I wonder if its mini led array and how many they have. No word on adaptive sync.
> 
> https://www.apple.com/pro-display-xdr/


32"

6016x3384 resolution (16:9)

10-bit native

Support for numerous color gamuts, including DCI-P3 and sRGB (the two most common for most end users)

500 cd/m² for SDR, 1,000 cd/m² sustained for HDR, 1,600 cd/m² peak

1,000,000:1 rated contrast ratio with 576 local dimming zones

Comes in both glossy and matte options

Connects via a Thunderbolt 3 USB-C port, which also acts as a hub for the other 3 USB-C ports


99% sure it's 60Hz, and no FreeSync; this is a monitor for professional use (photo/video editing), considering the number of supported color gamuts.
Not meant for gaming or any casual use, due to its price and hardware requirements (6K is a lot more demanding than 5K, which is already brutally demanding).


----------



## CallsignVega

I never got this "60 Hz is fine for professional work" mentality. 60 Hz mouse movement/precision is TERRIBLE and seriously hampers workflow efficiency.


----------



## Fanu

CallsignVega said:


> I never got this "60 Hz is fine for professional work" mentality. 60 Hz mouse movement/precision is TERRIBLE and seriously hampers work-flow efficiency.


There are more important things to professionals than a monitor's refresh rate.

How is workflow efficiency seriously hampered at 60Hz while editing photos in Photoshop or scrubbing the timeline while video editing? That's a gross exaggeration.
Users of this monitor surely will not complain that dragging windows around isn't as smooth as on gaming monitors lol


----------



## animeowns

*pg27uq going for $900 online*

Wow, people are really trying to get rid of the PG27UQ now that Asus announced a new display. The lowest price I have ever seen for this display online: $900 on eBay.


----------



## AngryLobster

I think the new model's "variable fan speed" alone is worth dumping my PG27UQ for.


----------



## hmcindie

CallsignVega said:


> I never got this "60 Hz is fine for professional work" mentality. 60 Hz mouse movement/precision is TERRIBLE and seriously hampers work-flow efficiency.


I agree. Using the X27 for work is way better than using a 60Hz iMac "retina" monitor.


----------



## hmcindie

Fanu said:


> there are more important things to professionals than monitors refresh rate


Like having piss-poor uniformity like iMacs? I've never seen anything particularly "professional" about any "professional" monitor except those that cost over 20k.


----------



## tinykitten

I haven't tried this yet so please use it at your own risk for now. I'll report back once I'm home.
https://dlcdnets.asus.com/pub/ASUS/LCD Monitors/PG27UQ/PG27U_FW_Updater_V3.2.zip


----------



## kx11

tinykitten said:


> I haven't tried this yet so please use it at your own risk for now. I'll report back once I'm home.
> https://dlcdnets.asus.com/pub/ASUS/LCD Monitors/PG27UQ/PG27U_FW_Updater_V3.2.zip



i'll give it a run


----------



## kx11

yeah this thing doesn't work


it broke the OSD menu, nothing works now


edit: nvm, OSD works now, had to unplug the power cable from the monitor first


----------



## tinykitten

kx11 said:


> yeah this thing doesn't work
> 
> 
> it broke the OSD menu , nothing works now
> 
> 
> edit : nvm osd works now , had to unplug the power cable from the monitor 1st


I got the link from ASUS support, so I figured it should be fine. Not sure if there's an announcement about it anywhere, but whatever, I'm glad it's working at least.


----------



## Morkai

Turned off G-Sync, did a factory reset, still:
(the monitor still works when this happens, it just rolled back the firmware)

After rebooting, setting refresh to 60Hz, and closing every open program, it seems to have transferred OK and is now calculating the firmware hash.


----------



## Morkai

Took an hour or so; successful. 144Hz SDR works as expected and I can turn off the HDR popup message. Nice!


----------



## animeowns

AngryLobster said:


> I think the new models "variable fan speed" alone is worth dumping my PG27UQ.


I am happy I sold mine; now I can just sit back and wait for a hopefully-October release of this PG27UQX.


----------



## kx11

Morkai said:


> Turned off g-sync, did factory reset, still:
> (monitor still works when this happens, it just rolled back the firmware)
> 
> After rebooting, setting refresh to 60, and turning off any open program, it seems to have transfered ok and is now calculating firmware hash.





i did that, it worked 100%


more options in the OSD now, I'll have to check what's been fixed


----------



## Morkai

animeowns said:


> I am happy I sold mine now I can just sit back and wait for hopefully an October release of this Pg27UQX.


You can hope for an october release, yes.
But which year?!?!?!?!??!?!?!?!?!?!?!


----------



## Speezy

kx11 said:


> i did that , it worked 100%
> 
> 
> more options in the OSD now , I'll have to check more about what's been fixed


I just updated my monitor's firmware and now when I use HDR I can't change the reference white (nits). Can you check if it's the same for you?


----------



## tinykitten

Morkai said:


> You can hope for an october release, yes.
> But which year?!?!?!?!??!?!?!?!?!?!?!


Pretty much, I'd be surprised if a 2019 release happens.


----------



## AngryLobster

Man, I really hope this means they can put out a firmware update to change the fan profile. If the PG27UQX adopts the same 0 RPM mode as ASUS GPUs while using the same module, maybe we can get an update too.


----------



## Morkai

Speezy said:


> I just updated my monitors firmware and when i use hdr now i cant change the reference white(nits). Can you check is it the same for you?


Same. Though changing it only ever seemed to mess up the image.


----------



## kx11

Speezy said:


> I just updated my monitors firmware and when i use hdr now i cant change the reference white(nits). Can you check is it the same for you?



yeah, same here, it's set to 1000 nits now


----------



## Morkai

kx11 said:


> yeah same here , it's set to 1000nits now


Mine just has a black field for reference white. It looks just like the old default setting of 80 though.


----------



## mattxx88

tinykitten said:


> I haven't tried this yet so please use it at your own risk for now. I'll report back once I'm home.
> https://dlcdnets.asus.com/pub/ASUS/LCD Monitors/PG27UQ/PG27U_FW_Updater_V3.2.zip


Thanks for sharing.
Is there a guide to update it?
Mine works perfectly; should I keep it as it is?


----------



## tinykitten

mattxx88 said:


> Thanks for sharing
> is there a guide to update it?
> Mine works perfect, should i keep it as it is?


You can follow the instructions shown once you run the program (disable monitor overclock, Windows power settings, etc.), then just hit Update and that's all there is to it. Takes about 45-50 minutes. Personally I had no issues updating.
If you still have the old firmware, it's up to you how much you actually value using 144Hz with proper blacks. That's the main draw of updating the firmware, aside from minor misc. improvements.


----------



## Speezy

Morkai said:


> Same. Though changing it only ever seemed to mess up the image.


Yeah, true, but I guess it's a new bug in the firmware, because why would they remove it?


----------



## Speezy

kx11 said:


> yeah same here , it's set to 1000nits now


I'm talking about the reference white (nits) option; the peak brightness has always been set to 1000 and you can't change that, but now there is no way to change the overall HDR brightness.


----------



## bmgjet

Just finished the firmware update.
Took 1 hour 3 minutes.
Blacks look better at 144Hz. Can't tell any other difference with anything else.


----------



## animeowns

I wonder if we will be able to use this firmware update once the new monitor releases. They said the newer model will still have the same black crush when using 144Hz. I feel it should be priced exactly the same as the previous PG27UQ, or at least a little cheaper, if they can't fix something like that on day one.


----------



## saltedham

thx for the heads up about the firmware bois.


----------



## skingun

It will probably be more expensive. Mini LED backlighting in a gaming monitor is a first, and products at the front line of innovation always attract a heavy surcharge.


----------



## Speezy

Why hasn't the firmware update been officially released yet? It's kind of weird that it's only here because someone got it from ASUS support.


----------



## tinykitten

Speezy said:


> why the firmware update hasn't been officially released yet? It's kinda weird that its only here because some guy got it from asus support.


Well, to clarify: I've been sending ASUS support emails, quoting/linking their own FAQ regarding the firmware update, for over half a year now, asking for concrete updates. Recently I ended up getting a reply stating "Please have a look at this:" with a link to the very same FAQ I kept quoting for over half a year; that's it. I somewhat snapped and may or may not have told them to f*ck off in a more or less polite way. Around two weeks later I received another email with the link I posted here.


I don't know if ASUS has made the tool "public" by now, and I don't care anymore if they do. I got really tired of ASUS being so silent regarding this tool; the FAQ got updated Nov. 2018, I believe. Plus the monitor scene seems very uninspiring to me recently. Sure, PG27UQX, I can see it, but it's not enough reason for me to "upgrade". I don't understand the PG349Q release in 2019 either; maybe I'm missing the point. I'm out for now in terms of monitors and will be back when OLEDs or equivalents see the light of day.


----------



## Speezy

Yeah, okay, I was kinda pissed about it as well, waiting a year for the update. Now the black crush is finally fixed, but do you have any idea why this new update won't allow reference white to be changed anymore in HDR? No idea if it's intentional or a bug.


----------



## profundido

Leaving this here for those whom it may interest:

https://www.overclock.net/forum/44-monitors-displays/1629088-predator-x27-175.html#post28004360


----------



## bmgjet

My PG27UQ is a release-day unit. It updated fine, but you have to follow the instructions to the letter: disconnect all other screens, make sure the computer is set to not put the screen to sleep, run the screen at 4K 60Hz, and don't touch the computer while it's updating.

I bet those techs let the screen go to sleep, which bricked the firmware update, and then they just blamed it on a hardware problem.
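For the sleep part: this isn't in ASUS's official steps, just a suggestion on my end, but on Windows you can enforce "never sleep" for the duration of the flash from a command prompt with the stock powercfg tool (0 = never time out). Remember to restore your usual timeouts afterwards.

```shell
# Keep the display and system awake during the ~1 hour flash (0 = never time out).
powercfg /change monitor-timeout-ac 0
powercfg /change standby-timeout-ac 0
```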


----------



## saltedham

tinykitten said:


> Well, to clarify: I sent ASUS support emails, quoted/linked their own FAQ regarding the firmware update for over half a year now and asked for concrete updates. Recently I ended up getting a reply stating "Please have a look at this:" and they linked the very same FAQ I kept quoting for over half a year, that's it. I somewhat snapped and may or may not have told them to f*ck off in a more or less polite way. Around two weeks later I received another email with that link I posted here.
> 
> 
> I don't know if ASUS made the tool "public" by now, and I don't care anymore if they do. I got really tired with ASUS being so silent regarding this tool, FAQ got updated Nov. 2018 I believe. Plus the monitor scene seems very uninspiring to me recently. Sure, PG27UQX, I can see it but it's not enough reason for me to "upgrade". I don't understand the PG349Q release either in 2019, maybe I'm missing the point. I'm out for now in terms of monitors and will be back when OLEDs or equivalents see the light of the day.


glad someone was laying into them. no more buying electronics early; you get bent over, then the runaround.


----------



## Babryn25

Reference White nits setting is grayed out after update.


----------



## l88bastar

Any firmware update for the Acer X27 model? :-(


----------



## Glerox

tinykitten said:


> Well, to clarify: I sent ASUS support emails, quoted/linked their own FAQ regarding the firmware update for over half a year now and asked for concrete updates. Recently I ended up getting a reply stating "Please have a look at this:" and they linked the very same FAQ I kept quoting for over half a year, that's it. I somewhat snapped and may or may not have told them to f*ck off in a more or less polite way. Around two weeks later I received another email with that link I posted here.
> 
> 
> I don't know if ASUS made the tool "public" by now, and I don't care anymore if they do. I got really tired with ASUS being so silent regarding this tool, FAQ got updated Nov. 2018 I believe. Plus the monitor scene seems very uninspiring to me recently. Sure, PG27UQX, I can see it but it's not enough reason for me to "upgrade". I don't understand the PG349Q release either in 2019, maybe I'm missing the point. I'm out for now in terms of monitors and will be back when OLEDs or equivalents see the light of the day.


Man, it's so frustrating... I sent my PG27UQ to ASUS back in April and had to wait one month for the update... I should have waited a month and done it myself.
At that point, I thought ASUS would never release the firmware to the public.



Babryn25 said:


> Reference White nits setting is grayed out after update.


HDR10 is supposed to show a predetermined brightness. For example, if the dev wants a light to be 1000 nits, it will show 1000 nits.
The reference white (nits) was a weird setting.


----------



## Babryn25

Glerox said:


> HDR10 is supposed to show a predetermined brightness. For example, if the dev wants this light to be 1000nits, it will show 1000nits.
> The reference white (nits) was a weird setting.


I used TFT Central's recommended settings and everything was great. Now I can't even look at the screen in HDR. It is just too bright.


----------



## Psycrow

What do you mean you sold yours now?
You think there will be a fanless model in 20xx?
Water cooled, perhaps?

Btw, what does this update fix?

My monitor is one of the new series that has the latest update, I guess,
since my serial number has an "S" in it, which means it's new.


----------



## kot0005

Booked a RMA with Asus, they are sending it to this place.

https://www.google.com.au/search?q=...=#lrd=0x6ad6422df23923b7:0x210bb94eb57cc35e,1,,,


GOD HELP ME.


----------



## Zenairis

*X27*



l88bastar said:


> Any firmware update for the Acer X27 model :-(


I do not recommend doing it. I had it done and they sent me a panel with what looked like a red finger print on the screen so I sent it back they sent me another one without the firmware update on it and has dead pixels. This is after spending $100 on shipping btw.


----------



## l88bastar

Zenairis said:


> I do not recommend doing it. I had it done and they sent me a panel with what looked like a red finger print on the screen so I sent it back they sent me another one without the firmware update on it and has dead pixels. This is after spending $100 on shipping btw.


YIKES! Thanks for the heads up... yeah, that was always a worry for me, as mine is pixel-perfect... I guess I will just grin and bear the 4K 120Hz!


----------



## Sanders2133

When I start the update it loads until preparing to update, calculating firmware hash and I get the error:

Firmware could not be updated.

Hash Mismatch.


Followed every step: using DP, reset the screen to default, disabled overclock, disabled G-Sync and put it in normal mode, 4K @60Hz, closed all programs, no 2nd monitor/USB device plugged in, on-screen gameplay features deactivated, tried restarting the PC.

Also the serial number is 6, so it is qualified for the firmware update; it came from the initial release.


What the hell am I doing wrong?


----------



## bmgjet

Sanders2133 said:


> When I start the update it loads until preparing to update, calculating firmware hash and I get the error:
> 
> Firmware could not be updated.
> 
> Hash Mismatch.
> 
> 
> Followed every step, using DP, reset screen to default, disabled overclock, disabled gsync and put it in normal mode, 4k @60hz, closed all programs, no 2nd monitor/ usb device plugged in, onscreen gameplay features deactivated, tried restarting pc.
> 
> Also the serial number is 6, so it is qualified for the firmware update, it came from the initial release.
> 
> 
> What the hell am I doing wrong?


Using the original DP cable?
Otherwise it just means that the firmware that's on your screen isn't the firmware it's expecting to update from.
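For context, a purely illustrative sketch (not ASUS's actual updater logic, and `EXPECTED_SHA256` here is a made-up baseline): firmware tools commonly hash the image currently on the device and refuse to flash unless it matches a known-good baseline, which would produce exactly this kind of "Hash Mismatch" abort when the installed firmware isn't the version the tool expects.

```python
import hashlib

# Hypothetical baseline: the digest the updater expects the currently
# installed firmware image to have before it will flash the new one.
EXPECTED_SHA256 = hashlib.sha256(b"factory-firmware-v1").hexdigest()

def can_flash(installed_image: bytes) -> bool:
    """Only proceed if the installed firmware matches the expected baseline."""
    return hashlib.sha256(installed_image).hexdigest() == EXPECTED_SHA256

print(can_flash(b"factory-firmware-v1"))   # True: matches baseline, OK to flash
print(can_flash(b"already-updated-v3.2"))  # False: "Hash Mismatch"-style abort
```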


----------



## Sanders2133

bmgjet said:


> Using the orignal DP cable?
> Other wise it just means that the firmware thats on your screen isnt the firmware its expecting to update from.


Yes. Meanwhile I also updated the ASUS driver that's actually on their support page, and also applied the DP 1.3/1.4 update for my GTX 1080 Ti. Still the same error.

Edit: Tried again today, did nothing different, and for some reason it now worked and updated just fine.



/solved


----------



## fleggy

outdated reply, deleted


----------



## TimesNeverWaste

My PG27UQ comes in tomorrow and I'm so excited. I am going to check to see if it has the firmware update, but I have the updater ready just in case.


----------



## mattxx88

TimesNeverWaste said:


> MY PG27UQ comes in tomorrow and im so excited. I am going to check to see if it has the firmware update but have the updater ready just incase


I've had mine for 3 months now, and I can say it's the best monitor I've ever gotten my hands on.
I think you won't regret your purchase.


----------



## animeowns

TimesNeverWaste said:


> MY PG27UQ comes in tomorrow and im so excited. I am going to check to see if it has the firmware update but have the updater ready just incase


the updated model is releasing this year. Couldn't wait huh?


----------



## TimesNeverWaste

I saw the announcement, but I remember that when this one was originally shown it took two years to come out. Also, if it's any more expensive than this one was, I really couldn't justify it. I have wanted this monitor since the day it was shown off; every time I go to the local Microcenter I play with it and want it so bad. My PG279Q died on me the other day, so it was the perfect excuse to buy this one. I saw the new one's backlight system, but it looks really expensive and it still has the DisplayPort 1.4 drawbacks. I'll update all of you later today on whether it comes with the update, and apply it accordingly.


----------



## CallsignVega

You automatically add a year to any date given by ASUS.


----------



## kot0005

Damn, I have a bunch of dead pixels on my monitor; I could find 9. They're only visible on white, green, and red; can't see 'em on black and blue.


----------



## TimesNeverWaste

Good thing it has a 3 year warranty lol


----------



## Lockjaw333

kot0005 said:


> damm I have a bunch of dead pixels on my monitor, I could find 9. Only visible on white, green and red, cant see em on black and blue.



Looks like dust behind the panel. My X27 came with two spots of dust. I was able to dislodge one by tapping on the spot and also on the top of the frame. The other spot I got to move a little bit, but it became stuck at what is now essentially its permanent resting place.


Luckily the second spot is at the top-right corner of the panel, so it doesn't bother me. Still, this is ridiculous for a $2K monitor.


Try gently tapping on the spot and wiggling your finger there with slight pressure. Also try tapping the top frame. It might fall down if you're lucky.


----------



## ESRCJ

I have a few questions regarding this panel. How loud is the fan? I've gone through 4 defective 4K 144Hz panels thus far (1x X27, 3x XB273K) and the fans were barely noticeable on any of them. Also, how is QC? Every one of the Acer monitors I ordered had dead pixels. The XBs all had white dead pixels, so they were only visible with a dark background.


----------



## Lumbeechief081

*I'm f**ked!*



tinykitten said:


> I haven't tried this yet so please use it at your own risk for now. I'll report back once I'm home.
> https://dlcdnets.asus.com/pub/ASUS/LCD Monitors/PG27UQ/PG27U_FW_Updater_V3.2.zip





Speezy said:


> I just updated my monitor's firmware and when I use hdr now I can't change the reference white(nits). Can you check is it the same for you?


I updated my monitor's firmware and now HDR isn't working properly! I can't change the Reference White (nits) because it's greyed out. Everything just looks so dark and dull in HDR now, even on the PS4 Pro. HDR content is completely unwatchable for me now. I wish I hadn't updated it! What am I supposed to do now? I don't want to wait over a month to receive my monitor back from ASUS's RMA department. The ASUS PG27UQ is the only monitor I have to use on my PC; I have absolutely no other monitor. I'll be without a monitor if I send the PG27UQ in for RMA. They don't even have an advance RMA program where they send me a replacement and then I ship the defective one back. WTF? Why can't there be a way to downgrade the firmware???? This is a f**king headache! :cryingsmi:cryingsmi:cryingsmi:cryingsmi:cryingsmi


----------



## kot0005

Lockjaw333 said:


> Looks like dust behind the panel. My X27 came with two spots of dust. I was able to dislodge one by tapping on the spot and also on the top of the frame. The other spot I got to move a little bit, but it became stuck essentially at its now permanent resting space.
> 
> 
> Luckily the second spot is at the top right corner of the panel, so doesn't bother me. Still this is ridiculous for a $2K monitor.
> 
> 
> Try gently tapping on the spot and wiggling your finger there with slight pressure. Also try tapping the top frame. It might fall down if you're lucky.


I dont think its dust , they are in multiple locations.


----------



## TimesNeverWaste

Lumbeechief081 said:


> I updated my monitor's firmware and now HDR isn't working properly! I can't change the Reference White (nits) because it's greyed out. Everything just looks so dark and dull in HDR now, even on the PS4 PRO. HDR content is completely unviewable for me now. I wish I didn't update it! What am I supposed to do now? I don't want to wait over a month to receive my monitor back from ASUS's RMA department. The ASUS PG27UQ is the only monitor I have to use on my PC. I absolutely have no other monitor. I'll be without a monitor if I send the ASUS PG27UQ in for RMA. They don't even have an advance RMA replacement where they send me a replacement and then I ship the defective one back. WṬF? Why can't there be a way to downgrade the firmware???? This is a f**king headache! :cryingsmi:cryingsmi:cryingsmi:cryingsmi:cryingsmi


So I just got my monitor and it came with the updated firmware, so I'm not sure what it looked like before the update, but I played Sea of Thieves, Forza Horizon 4, Halo Wars 2, and Battlefield V in HDR and they all looked amazing. Colors looked great, and even though Battlefield V has a dull palette, I enjoyed it. The only game that wasn't so great was Black Ops 4 in HDR; I found the dark areas were too dark, and even though the colors looked right it wasn't great, so I ended up turning it off in that game. What games are you playing that don't look right?


----------



## TimesNeverWaste

kot0005 said:


> I dont think its dust , they are in multiple locations.


From the pictures I feel they look like dust as well, because they seem darker and wider than just one pixel. I'll take your word for it though, since you physically have the monitor in front of you.


----------



## Lumbeechief081

TimesNeverWaste said:


> So I just got my monitor and it came with the updated firmware so not sure what it looked like before the update but I played Sea of Thieves, Forza Horizon 4, Halo Wars 2 and Battlefield V in HDR and they all looked amazing. Colors looked great and even though Battlefield V has a dull pallet I enjoyed it. Only game that wasnt so great was Black Ops 4 in HDR. I found the dark areas were to dark and even though the colors looked right it wasn't great so ended up turning it off in that game. What games are you playing that dont look right?


Well, obviously I updated my monitor myself with a sketchy tool someone posted stating "use at your own risk", and that's where I f**ked up. I've had this monitor since May and now it's f**ked up because I tried to update it myself. This is all ASUS's fault from the beginning! If they had uploaded the firmware update like they promised, I wouldn't have updated the monitor with some link someone posted on these forums! This monitor was screwed up from the beginning! ASUS should never have released a half-complete product! The 144Hz with its chroma subsampling and black crush! The PG27UQX has the exact same panel and problems! The only thing new is the mini LEDs! They should have at least included DisplayPort 2.0!


----------



## tinykitten

Lumbeechief081 said:


> ...sketchy tool someone posted...


This might be a you problem, considering most if not all people had no problem updating. The tool is about as sketchy as it gets, coming from ASUS support, I guess (bloop - it's in German, but you get the point).


Also hold up, you stated you have the PG27UQ since May.. 2019? If so you had the updated firmware already and thus zero reason to upgrade.


----------



## TimesNeverWaste

Lumbeechief081 said:


> Well obviously I updated my monitor myself with a sketchy tool someone posted stating "use at your own risk" and that's is where I f**ked up. I had this monitor since May and now it's f**ked up because I tried to update it myself. This is all ASUS's fault from the beginning! If they would have uploaded the firmware update like they promised I wouldn't have updated the monitor with some link someone posted on these forums! This monitor was screwed up from the beginning! ASUS should have never released a half-complete product! The 144Hz with its chroma subsampling and black crush! The PG27UQX has the exact same panel and problems! The only thing new is the miniLEDs! They should have at least include Displayport 2.0!


Is the issue happening with HDMI HDR with the PS4 Pro and I am guessing a PC as well? Can you post a picture? Just wondering if mine looks the same.


----------



## Lumbeechief081

TimesNeverWaste said:


> Is the issue happening with HDMI HDR with the PS4 Pro and I am guessing a PC as well? Can you post a picture? Just wondering if mine looks the same.


Yes, HDR hasn't worked properly on either PC or PS4 Pro since the update.


----------



## Lumbeechief081

Does anyone know how to access the debug/service menu on the monitor? I accessed it by accident a few times before and I don't know how I did it. I'm hoping there is a setting I can mess with that may get reference white (nits) not to be greyed out any longer. Below is an image posted by someone on hardforum who also accessed it by accident.


----------



## Lumbeechief081

tinykitten said:


> This might be a you problem considering most if not all people had no problem updating. This tool is as sketchy as it gets coming from ASUS support I guess (bloop - it's in german but you get the point).
> 
> 
> Also hold up, you stated you have the PG27UQ since May.. 2019? If so you had the updated firmware already and thus zero reason to upgrade.


S***, I don't know why I thought it was this year I got the monitor. It was actually sometime last year. I think it may have been in November.


----------



## hmcindie

Lumbeechief081 said:


> Yes, HDR doesn't work properly on both PC and PS4 PRO since the update.


Eh? The greyed-out white nits thing seems normal. My Acer X27 (same panel) also has a greyed-out white nits setting while in HDR (updated firmware). I think that's how it's supposed to be, and you then change the HDR settings in-game instead of on the monitor. I bet all the PG27UQ monitors with the updated firmware have greyed-out white nits during HDR.


----------



## tinykitten

Lumbeechief081 said:


> S***, I don't know why I thought it was this year I got the monitor. It was actually sometime last year. I think it may have been in November.


Same thing still applies if you bought retail in November, it has up to date firmware. The old firmware was shipped on PG27UQs bought before the end of July 2018 or something like that.


----------



## Lumbeechief081

Dude, I honestly can't say for sure when I got the monitor; November is just a guess. Going through the transactions in my bank account, I can't find it. I had to switch checking accounts due to fraud in December. I know one thing though: this monitor didn't have the latest update when I purchased it, because it had black crush; that is, until I just updated it and got f***ed in the process. Reference White (nits) is now greyed out.


----------



## Lumbeechief081

tinykitten said:


> Same thing still applies if you bought retail in November, it has up to date firmware. The old firmware was shipped on PG27UQs bought before the end of July 2018 or something like that.


This guy has the monitor and just purchased it recently, thus already having the latest firmware. The video was uploaded May 27. See how reference white (nits) isn't greyed out for him? https://youtu.be/uT12dYqePiM?t=524 Well, it is greyed out for me after updating the firmware.


----------



## Exilon

hmcindie said:


> Eh? The white nits greyed out thing seems to be a thing. Because my Acer X27 (same panel) has also a greyed out white nits setting while in HDR (updated firmware). I think that's how it's supposed to be and then you change the HDR settings in game instead of the monitor. I bet all the PG27UQ monitors with the updated firmware have greyed out white nits during HDR.


The X27's reference nits can be changed by going into the mode menu while in HDR mode and cycling through the 5 different modes, each of which corresponds to a different brightness. Why they couldn't fit a better method into the OSD is beyond me.


----------



## Lumbeechief081

tinykitten said:


> Same thing still applies if you bought retail in November, it has up to date firmware. The old firmware was shipped on PG27UQs bought before the end of July 2018 or something like that.


I'm starting to believe that reference white (nits) is supposed to be greyed out after the firmware update. I asked this guy on YouTube if it is greyed out for him and he said yes. He must be on the latest update as well, because the video was uploaded Mar 5.


----------



## Lumbeechief081

Okay, so I looked at the manual for the ASUS PG35VQ, and as it turns out reference white (nits) is indeed supposed to be greyed out, as shown in the link below. https://i.postimg.cc/8cDym3qK/oiuo.jpg
Sadly, I find that the same HDR content looked way better before updating, because I was able to adjust the reference white (nits). At stock it was 80 nits, but I had it adjusted to 100. Having this option resulted in HDR that looked far better to me than not having it at all anymore. I don't understand why ASUS had to remove it!!


----------



## bmgjet

Lumbeechief081 said:


> Okay, so I looked at the manual for the ASUS PG35VQ and as it turns out reference white (nits) is indeed suppose to be greyed out, as shown in the link below. https://i.postimg.cc/8cDym3qK/oiuo.jpg
> Sadly, I find that viewing the same HDR content (before updating) looked way better because I was able to adjust the reference white (nits). At stock, it was 80nits, but I had it adjusted to 100. Having this option resulted in HDR that looked far better to me than not having it at all anymore. I don’t understand why ASUS had to remove it!!


The screen adjusting the nits goes against the HDR standard.
You can still adjust it from Windows, the game you're playing, or your media player.


----------



## hmcindie

Exilon said:


> X27's reference nits can be changed by going into the mode menu while in HDR mode and cycling through the 5 different modes, each which corresponds to a different brightness. Why they couldn't fit a better method into the OSD is beyond me.


What? I went ahead and flicked through the modes, but while they indicate they've changed stuff, the brightness really didn't change. I don't think you're supposed to change the HDR brightness through the monitor, weird as it is.


----------



## AngryLobster

FYI, I have a Nov 2018-manufactured monitor and I've always been able to change the reference nits in HDR. It has the latest firmware with the HDR prompt toggle and 144Hz chroma settings.


----------



## Babryn25

bmgjet said:


> Screen adjusting the nits goes against the HDR standard.


If you read the TFT Central review again: "We carried out these measurements first of all with the default 80 'white reference' setting but found that the content targets were being exceeded by quite a lot and the screen was basically too bright. Content mastered at 400 cd/m2 was being shown at around 650 cd/m2". 

Basically, the screen pushes more brightness at all levels. Lowering white nits to 52 actually fixed this, and everything was shown as it was supposed to be according to the HDR standard. I borrowed a meter and measured ~1280 nits on a 10% window with a 1000-nit target. No wonder I can't look at the screen anymore; it is much brighter than it is supposed to be, and than it was before. I don't know how to fix that properly with the nits setting disabled. For the moment, lowering contrast to 47 gives the right amount of nits.
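For reference, this is why screen-side scaling matters so much: HDR10 encodes absolute luminance with the SMPTE ST 2084 (PQ) transfer function, so a given 10-bit code value is supposed to map to one specific nit level, and any extra monitor-side boost pushes content above its mastering targets. A sketch of the standard PQ EOTF (the published ST 2084 constants, not the monitor's actual processing):

```python
def pq_eotf(code, bits=10):
    """SMPTE ST 2084 PQ EOTF: code value -> absolute luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384        # PQ constants from the ST 2084 spec
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = code / (2 ** bits - 1)          # normalize the 10-bit code to [0, 1]
    p = e ** (1 / m2)
    y = max(p - c1, 0.0) / (c2 - c3 * p)
    return 10000.0 * y ** (1 / m1)      # full-scale signal = 10,000 nits

# Full-scale 10-bit code maps to 10,000 nits; half-scale lands near ~90 nits,
# which is why SDR-ish reference white sits so low on the PQ curve.
print(pq_eotf(1023))  # 10000.0
print(pq_eotf(0))     # 0.0
```

The takeaway for the measurement above: if 1000-nit-mastered content measures ~1280 nits, the panel is decoding the PQ signal and then re-scaling it, which the standard doesn't allow for.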


----------



## axiumone

Anyone notice that the display doesn't completely shut off after the Win 1903 update? Maybe it was like this before, but I never noticed. On shutdown, some of the backlight still stays on. It's pretty dim, but noticeable if there are no lights in the room. I have to manually shut it off with the switch now.


----------



## mattxx88

Mine shuts down correctly.
*I did a clean OS install after 1903


----------



## Morkai

I had massive problems with the latest NVIDIA driver, anyone else?
Screen going black, dimming zones getting stuck off, losing the DP connection, etc.

A clean reinstall didn't help. Reverted to the old driver, all good.


----------



## mattxx88

Updated them today and did a 4-hour gaming session without issues.


----------



## fleggy

Morkai said:


> I had massive problems with the latest nvidia driver, anyone else?
> Screen going black, dimming zones getting stuck off, losing DP connection, etc.
> 
> Clean reinstall, didn't help. Reverted to old driver, all good.


Hi, I had similar issues. I checked the driver version in NVCP and there was an older DCH version instead of the updated one. So I unplugged the network cable, ran DDU, and installed the latest driver again. It was the first time a DCH driver had "infiltrated" my rig.


----------



## Morkai

fleggy said:


> Hi, I had similar issues. I checked the driver version in NVCP and there was an older DCH version instead of the updated one. So I unplugged the network cable, ran DDU, and installed the latest driver again. It was the first time a DCH driver had "infiltrated" my rig.


Did the same, let's see how it holds up.

[Edit: No issues after DDU+reinstall, first time ever "clean install" hasn't been good enough for me.]


----------



## koc6

Hi guys, I'm in 
Well, it's here on my desk and it's just perfect, no issues at all; maybe I'm just lucky. 
Out of the box, without any calibration, the monitor is just amazing. I thought HDR wasn't worth it, but honestly guys, HDR is just amazing; I really want to replay all my games just to enjoy the lights and colors.

Any recommended calibration settings, guys? I read that 98Hz is the best setting for 10-bit and RGB 4:4:4?
Also, brightness set to 80?
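The 98Hz figure falls out of DisplayPort 1.4 bandwidth limits. A rough back-of-the-envelope check (this sketch counts only active pixels and ignores blanking intervals, so real requirements run a few percent higher):

```python
# Rough DisplayPort 1.4 bandwidth check for the PG27UQ's 4K modes.
# Assumes an HBR3 link: 32.4 Gbit/s raw, ~25.92 Gbit/s usable after
# 8b/10b line coding. Blanking overhead is ignored for simplicity.

HBR3_EFFECTIVE = 32.4e9 * 8 / 10  # 25.92 Gbit/s usable payload

def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Active-pixel data rate in bits per second."""
    return width * height * refresh_hz * bits_per_pixel

modes = {
    "98 Hz, 10-bit RGB 4:4:4":    required_gbps(3840, 2160, 98, 30),
    "120 Hz, 10-bit RGB 4:4:4":   required_gbps(3840, 2160, 120, 30),
    "144 Hz, 8-bit RGB 4:4:4":    required_gbps(3840, 2160, 144, 24),
    "144 Hz, 8-bit YCbCr 4:2:2":  required_gbps(3840, 2160, 144, 16),
}

for name, rate in modes.items():
    verdict = "fits" if rate < HBR3_EFFECTIVE else "exceeds DP 1.4"
    print(f"{name}: {rate / 1e9:.2f} Gbit/s ({verdict})")
```

In short: 98Hz is roughly the ceiling for full 10-bit RGB 4:4:4 on DP 1.4, which is why 120/144Hz modes on this monitor drop to 8-bit and/or 4:2:2 chroma subsampling.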


----------



## bigelvis

kot0005 said:


> nvm, it fixed itself somehow. Played SWBF2 and god, HDR is just on a whole new level... the lighting, the colors are so much more lively... Just forget about haloing... you will not notice it during gameplay at all.


Hi

Would you say that the blacks in SWBF2 are as black as the monitor bezel? 
I currently have a 4K LG IPS display (27UD58) and its blacks are only grey, so I'm really looking to upgrade to this ASUS or the Acer X27.

Cheers


----------



## kot0005

Buy the PG35VQ or PG27UQX; don't buy 1st gen. They have lots of issues.


----------



## Billy McDowell

I thought I would mention that Microcenter is selling this for $1,499 and I got Best Buy to price match it online via chat. I went with Best Buy because I can buy a 4-year warranty from them for $103, and considering all the issues this monitor has, it's worth it to have them exchange it for store credit toward another one in the future. I honestly buy all the expensive stuff I expect to break early from Best Buy. 

https://www.microcenter.com/product...r-aura-sync-pre-calibrated-gaming-led-monitor

https://www.bestbuy.com/site/asus-r...onitor-with-hdr-black/6260905.p?skuId=6260905


----------



## Billy McDowell

So I ended up getting the monitor today and the build date is Aug 2018. Should I try to update the firmware or not? Can someone give me a pros/cons list of updating versus not?


----------



## tinykitten

Billy McDowell said:


> So I ended up getting the monitor today and the build date is Aug 2018. Should I try to update the firmware or not? Can someone give me a pros/cons list of updating versus not?


Go to System Setup. New firmware should include the options "Warning Message (HDR)" and "DP SDR YCbCr sRGB Gamma" below "Display SDR Input". The first option removes the onscreen message when you enable/disable HDR; the second fixes black crush at 144Hz. If you have these options, you're good and have no reason to update.


----------



## Billy McDowell

tinykitten said:


> Go to System Setup. New firmware should include the options "Warning Message (HDR)" and "DP SDR YCbCr sRGB Gamma" below "Display SDR Input". The first option removes the onscreen message when you enable/disable HDR; the second fixes black crush at 144Hz. If you have these options, you're good and have no reason to update.


OK, I do have the updated firmware then. I'm curious, because I read on here that people with the update can't move the nits up or down when HDR mode is on, but I do have that ability. Is there anyone else who can still use it with the update?


----------



## tinykitten

Billy McDowell said:


> OK, I do have the updated firmware then. I'm curious, because I read on here that people with the update can't move the nits up or down when HDR mode is on, but I do have that ability. Is there anyone else who can still use it with the update?


Yeah, "Reference White (nits)" is greyed out for me, in both SDR and HDR mode. I don't particularly miss that option, considering that well-made HDR content looks fantastic and I didn't touch it before the updated firmware anyway. It being greyed out for no reason sure is another thing to rub in ASUS' face while trying to push a refund, though.


----------



## kx11

Do you guys run into the same glitch?

It can be solved by unplugging the DP cable from the GPU, but it happens randomly on all drivers and all RTX GPUs I tested.


----------



## fleggy

Try SHIFT+CTRL+WIN+B to reset the video driver (not tested yet). For me, the simplest solution was toggling the Over Clocking setting in the monitor's OSD.


----------



## Malinkadink

Billy McDowell said:


> I thought I would mention that microcenter is selling this for $1499 and i got bestbuy to price match it online via chat. I did bestbuy because i can buy a 4 year warranty from them for $103 and considering all the issues this has it is worth it to have them exchange it for me for store credit to get another one in the future. I honestly buy all my expensive stuff i expect to break early from bestbuy.
> 
> https://www.microcenter.com/product...r-aura-sync-pre-calibrated-gaming-led-monitor
> 
> https://www.bestbuy.com/site/asus-r...onitor-with-hdr-black/6260905.p?skuId=6260905


This monitor would actually be pretty easy to sabotage, I reckon lol: break the fan, the G-Sync FPGA overheats, and poof, you've got a non-working monitor. In 3-4 years' time they most likely won't even be carrying these things anymore, so you'd get a newer model xD


----------



## Glerox

kx11 said:


> do you guys run into the same glitch ?
> 
> 
> it can be solved by unplugging the DP cable from the GPU but it happens randomly in al drivers and all RTX GPUs i tested


It happens from time to time. Just rebooting my PC has worked every time for me.


----------



## JMCB

Billy McDowell said:


> I thought I would mention that microcenter is selling this for $1499 and i got bestbuy to price match it online via chat. I did bestbuy because i can buy a 4 year warranty from them for $103 and considering all the issues this has it is worth it to have them exchange it for me for store credit to get another one in the future. I honestly buy all my expensive stuff i expect to break early from bestbuy.
> 
> https://www.microcenter.com/product...r-aura-sync-pre-calibrated-gaming-led-monitor
> 
> https://www.bestbuy.com/site/asus-r...onitor-with-hdr-black/6260905.p?skuId=6260905


How did you do this? After purchase or before?

Edit: NM, I shouldn't have been in a hurry. Figured it out (Price Match link on bottom). I'm thinking about doing this. I currently have the PG279Q 2k monitor, but I kind of want 4k with HDR...


----------



## animeowns

fleggy said:


> Try SHIFT+CTRL+WIN+B to reset video driver (not tested yet). For me, the simplest solution was toggling Over Clocking setting in the monitor's OSD.


Didn't know about this shortcut, thanks. I used to have the same issue on my PG27UQ; I would just unplug the monitor's power and plug it back in to fix it.


----------



## kot0005

I switched to an LG 27GL850-B after getting a refund for my PG27UQ and my god... 1440p is soooo pixelated at 27 inches. Upscaling to 4K was so good on the PG27UQ. The IPS glow sucks too; FALD halos are wayyy better.


----------



## profundido

kot0005 said:


> I Switched to a LG 27gl850b after getting a refund for my pg27uq and my god....1440p is soooo pixelated on 27inch..Upscaling to 4k was so good on Pg27UQ. The IPS glow sucks too. FALD Halos are wayyy better.


I soooo hear you. The extra PPI density was "just a luxury" that I didn't really need. That is, until I got used to it...now there's no way back to 'low QHD' res lol !


----------



## kot0005

profundido said:


> I soooo hear you. The extra PPI density was "just a luxury" that I didn't really need. That is, until I got used to it...now there's no way back to 'low QHD' res lol !



I am having real issues  I watch a lot of media and none of it is as crisp as on my UQ.


----------



## Malinkadink

kot0005 said:


> I am having real issues  I watch a lot of media and none of it is as crisp as on my UQ.


Push it back further, or sit a little further back? Why did you even return the PG27UQ to effectively "downgrade" to the LG? I'm quite content on 24" 1440p but do want a little more density; I'd be good with 32" 4K. I think that's the most ideal size for 4K while keeping good density. 27" 4K is really nice too, but for immersion 32" just fits the bill better. I'm not going to get any of the current crop of 4K 144Hz monitors though; they're too gimped by DP 1.4. I'll have to wait for DP 2.0, but by then I'd probably already have bought an LG C10 lol.


----------



## tinykitten

I'm in the RMA process; my fan is most likely broken and the insides will get fried sooner or later. I just received a replacement unit and oh my god, it's f'ing garbage. Scuffs all around (the previous owner clearly wasn't careful whatsoever), the cable cover has broken pins, and I counted more than 10 dead pixels. Fun times. 

How long did it take you to get your refund approved? Did ASUS approve it, or your retailer? @kot0005


----------



## kot0005

I got a refund because the PG35VQ was announced and I thought I might as well get it, with better fan control, better contrast, and better frame rates because of 1440p. But it's yet to come into stock anywhere...




tinykitten said:


> I'm in RMA process due to my fan being broken most likely and the insides getting fried sooner or later. I just received a replacement unit and oh my god it's f'ing garbage. Scuffs all around (previous owner clearly wasn't careful whatsoever), cable cover has broken pins. I counted more than 10 dead pixel. Fun times.
> 
> 
> How long did it take you to get your refund approved? Did ASUS approve or your retailer? @kot0005


Took me around 2 months. It's supposed to be the retailer, but I had to get it from ASUS because the retailer sucks balls; never buying expensive stuff from them again.


----------



## kot0005

My PG35VQ arrived. 

Watch in 1440p/4K.

Ton of flickering and scanlines in Division 2.

I will post updates if I find flicker in other games, but scanlines exist in Gears 5 (one of the startup splash screens), Division 2, Metro Exodus, and Borderlands 3.

Haven't noticed it in Control, maybe because of the low fps with RTX.


----------



## MistaSparkul

Flickering is a known issue on the 35 inch ultrawides. Will it get fixed? Who knows.

https://twitter.com/pcmonitors/status/1172030175057461248?s=19


----------



## CallsignVega

No way in hell I'd keep one of the 35-inchers past the return period. NVIDIA took about an entire year to fix the previous 27" FALD firmware issue.


----------



## tinykitten

The best bet for now is probably whichever "regular" monitor floats your boat plus a C9, until HDMI 2.1 gets widely supported. You're paying about the same, if not less, than for a 27/35" G-Sync Ultimate display. I was looking at the 35" as well; however, the flickering/scanline issues show that we are partaking in a glorified beta test with these kinds of panels.


----------



## kot0005

Flickering in gears 5: https://youtu.be/yTHlCAjxji4?t=61


----------



## kot0005

Witcher 3 is literally unplayable. IDK how these monitors passed for production... these panels are lousy and ****e. I will post the vid link here, but it flickers like nothing I have ever seen.


----------



## kot0005

So what's up with this? Are all VA panels this **** or did I get the worst one?

You can see the flickering in the Steam download bar pixels. No scanlines or entire-screen flickering, just the bars' area.

And notice in the video that this doesn't happen on my LG 27GL850 when I move Steam to that screen. I also tried with FALD both on and off; it seems it's the actual pixels on the panel...

Can anyone with a VA panel test this, and also on the PG27UQ?



https://youtu.be/3u1CoV--HvI


----------



## kot0005

Here is Witcher 3's glorious flickering.

https://youtu.be/wNvK7dmkXig


----------



## MistaSparkul

Lol, did you completely ignore my post or something? Being a VA panel has absolutely jack to do with this problem. It is a known defect of this crappy "G-Sync Ultimate" display, and NVIDIA has yet to issue a response to it. If you thought you were going to get a quality display by forking over 2 grand, well... think again.


----------



## kot0005

Calm down lol. I did not! I know it's an issue with this monitor, but the Steam flickering seems odd.

The Steam one looks like slow pixel response. Which monitor do you have? Do your Steam download bars flicker?


----------



## MistaSparkul

Acer X27. It doesn't flicker, but like Vega said, this monitor and the PG27UQ equivalent both shipped with a broken 144Hz mode that took way too long to fix. We paid top dollar for these premium G-Sync Ultimate displays only to have stupid issues like this. Don't know what NVIDIA is thinking...


----------



## CallsignVega

Ya, and the 27" firmware fix was an easy one! I wonder how something as bad and as common as this 35" flicker issue got past testing/quality control.


----------



## kot0005

The PG35VQ is going back. I don't know how people are using it when the flickering is so bad. At least the PG27UQ was usable, because the gamma issue only affected the blacks.


----------



## bogie46

Only the C9; all these gaming displays must die.


----------



## kot0005

bogie46 said:


> Only C9, all this gaming displays must die.


Yeaaah, I am not putting anything bigger than 35 inches on my desk..


----------



## bogie46

Wow, this **** costs €1,900 here in Germany; end of story for me. We'll see when the G-Sync update is out. Don't waste your money. You could also try a wall mount, I'm just saying. Gaming displays must die. I've tried them all myself: IPS, VA, TN 240Hz. No more; this is like going in circles. It's one technology and it's very bad for that money. 

Sent from my LG-H930 using Tapatalk


----------



## MistaSparkul

kot0005 said:


> bogie46 said:
> 
> 
> 
> Only C9, all this gaming displays must die.
> 
> 
> 
> Yeaaah, I am not putting anything bigger than 35 inches on my desk..

Yeah, 55 inches is totally unusable as a monitor for me as well. I keep my 55-inch OLED on a separate table away from my main desktop setup and use it only for movies and gaming with wireless peripherals. If you can make room for it, then it's totally worth it; you can just use a high-PPI 4K display as your main desktop work monitor.


----------



## m4fox90

This one has dropped all the way to $1499 on Amazon, and the X27 is $1649. Wonder what's going on there? Is the ROG version proving less popular?


----------



## Malinkadink

m4fox90 said:


> This one has dropped all the way to $1499 on Amazon, and the X27 is $1649. Wonder what's going on there? Is the ROG version proving less popular?


Nah, ASUS and Acer just make up their own prices. Usually the Acer is cheaper, but I guess ASUS is trying to move stock, as their Mini LED version is coming real soon.


----------



## Foxrun

I just picked up this monitor on sale from Microcenter and the HDR is not working properly. All colors are washed out and blacks have turned to grays. I owned this monitor before and never had this issue, but the new one came with the updated firmware. Any idea how to fix the HDR? I've tested my rig on other HDR displays and everything checks out; it's only this monitor that has the issue.


----------



## Malinkadink

I just checked Microcenter and see these for $1,200, wow lol; that's pretty tempting for the FALD. However, I recently learned that people were able to get the XV273K working at 4K 144Hz with FreeSync and HDR using two DP cables on an NVIDIA GPU. That would be more valuable to me than the FALD, as I've been considering the new XB273K GPbmiipprzx, the "FreeSync" version of the XB273K, at $700: a $500 saving over the recently price-dropped ASUS. If I can run that monitor at 4K 144Hz with G-Sync compatibility on, using two DP cables, meaning uncompressed 4:4:4 8-bit or 10-bit color, I'd rather go for that.

I suspect the PG27UQX is going to release very, very soon and ASUS is trying to ditch as much PG27UQ stock as possible before introducing the X model at $2k. Unfortunately, G-Sync limits these high-end displays to a single DP port, and I haven't heard anything about the new monitor supporting DSC, which would be a saving grace for the display; ASUS hasn't said anything.


----------



## MistaSparkul

Malinkadink said:


> I just checked Microcenter and see these for $1200, wow lol, that's pretty tempting for the FALD, however i just recently learned that people were able to successfully get the XV273K to work at 4k 144hz with freesync and HDR using two DP cables on an Nvidia GPU. That i feel would be more valuable to me than the FALD as i've been considering the new XB273K GPbmiipprzx which is the "freesync" version of the XB273K at $700 so $500 savings over the recently price dropped Asus. If i can get that monitor and run it at 144hz 4k and gsync compatibility on with two DP cables meaning uncompressed 4:4:4 8 bit or 10 bit color i'd rather go for that.
> 
> I suspect the PG27UQX is going to release very very soon and Asus is trying to ditch as much PG27UQ stock as possible, and introduce the X model at $2k. Unfortunately Gsync is limiting these high end displays with only x1 DP port, and i haven't heard anything about this new monitor supporting DSC as that would be a saving grace for the display, but Asus hasn't said anything.


Did they at least improve the overdrive over the first freesync XV version? IIRC the overdrive on that one was just terrible and was a factor in making the XB Gsync version worth it.


----------



## Malinkadink

MistaSparkul said:


> Did they at least improve the overdrive over the first freesync XV version? IIRC the overdrive on that one was just terrible and was a factor in making the XB Gsync version worth it.


No idea; I'll see what it's all about this Thursday when I get a chance to hit the store and compare it to XV273K reviews. If I can get 4K 144Hz with FreeSync working over two DP cables, it'll be a keeper, so long as there aren't any major defects like excessive backlight bleed.


----------



## AngryLobster

Had the XV273K during the BF sale on Amazon for $629 and thought it was trash.

Anyway, just wanted to chime in that there is huge variance in fan noise with these. My friend just bought one with an Oct 2018 manufacture date and it's literally whisper quiet. 

Mine is newer than his and has sounded like a vacuum cleaner from day one.


----------



## CallsignVega

Malinkadink said:


> No idea, i'll see what its all about this thursday when i get a chance to hit the store and try to compare it to XV273K reviews. If i can get 4k 144hz with freesync working with two DP cables it'll be a keeper so long as there aren't any major defects like excessive back light bleed.


I tried it based on that "news" and it turned out to be bunk. You cannot get FreeSync working on the display with two DP cables. They thought that just because you could enable it, it was working. Not so much.


----------



## Malinkadink

CallsignVega said:


> I tried it based on that "news" and it turned out to be bunk. You cannot get FreeSync working on the display with two DP cables. They thought that just because you could enable it, it was working. Not so much.


You tried it on the XV273K, or the newly released XB273K without the G-Sync module? I just picked it up today, but I'm at work so I won't be able to play with it until 11 hours from now. This post on Reddit shows how they got dual DP + FreeSync working on the XV273K, so I'm going to try it on the XB273K and see if I can reproduce the results. https://www.reddit.com/r/nvidia/comments/aenwrq/acer_nitro_xv273k_dual_displayport_support_on_the/





AngryLobster said:


> Had the XV273K during BF sale on Amazon for $629 and thought it was trash.
> 
> Anyway just wanted to chime in that there is some huge variance in terms of fan noise with these. My friend just bought one with a Oct 2018 manufacture date and it's literally whisper quiet.
> 
> Mine is newer than his and has sounded like a vacuum cleaner from day one.


The XV273K has a fan? I thought only the Gsync ones did.


----------



## Exilon

You're wasting your time with the non-FALD IPS ones, IMO. The panel used has uniformity issues, narrower-than-usual viewing angles, and strong IPS glow. FALD helps to hide those issues; not so much on the non-FALD versions.

The XV273K never worked with dual cables + HDR or FreeSync. Dual cables turn it into an SDR static-refresh monitor with a whacked color gamut.

Even if the XB273KGP can do FreeSync + HDR + dual cables, the HDR is not worth using at all. 400 nits and a 1000:1 contrast ratio just mean everything is washed out, in addition to the extreme IPS glow.

According to this here, the situation with the XB273KGP is the same as the XV273K. No VRR with dual cables.

https://www.reddit.com/r/Monitors/c...r_xb273kgp_xb273ks_xb273kp_whats_the/f34mh34/

I'd reckon the overdrive is the same as well.


----------



## Malinkadink

Exilon said:


> You're wasting your time with the IPS non-FALD ones, IMO. The panel used has uniformity issues, narrower than usual viewing angles, and strong IPS glow. FALD helps to hide those issues. Not so much on the non-FALD versions.
> 
> The XV273K never worked with dual cables + HDR or Freesync. Dual cables turns it into a SDR static refresh monitor with wacked color gamut.
> 
> Even if the XB273KGP can do Freesync + HDR + dual cables, the HDR is not worth using at all. 400 nits and 1000:1 contrast ratio just means everything is washed out in addition to extreme IPS glow.
> 
> According to this here, the situation with the XB273KGP is the same as the XV273K. No VRR with dual cables.
> 
> https://www.reddit.com/r/Monitors/c...r_xb273kgp_xb273ks_xb273kp_whats_the/f34mh34/
> 
> I'd reckon the overdrive is the same as well.


There's a workaround for VRR with dual cables using the Acer display widget, as mentioned in my other post that links to Reddit. I never intended to use the HDR anyway, but I would like 144Hz 4:4:4 8-bit with VRR, which I think I can get working if it behaves like the XV273K. Besides the 1000-nit local-dimming monitors, no other HDR monitor is worth its salt; that's pretty obvious. I still have my C7 OLED if I want to watch HDR content. This monitor would be a good one to use until the PG27UQX or next year's DP 2.0 monitors, at which point I'd make the XB273K my secondary monitor.


----------



## Exilon

That link on Reddit is bunk. They didn't actually test whether the VRR worked, just that it said it was turned on.


----------



## CallsignVega

Not to mention this IPS panel without the FALD looks atrocious. I returned both the Freesync and G-Sync versions. Best get the X27 or PG27UQ.


----------



## MistaSparkul

CallsignVega said:


> Not to mention this IPS panel without the FALD looks atrocious. I returned both the Freesync and G-Sync versions. Best get the X27 or PG27UQ.


Definitely. Those monitors are pretty deeply discounted now compared to over a year ago. The PG27UQX is not worth it at all, since at best it's going to reduce blooming by about 33% compared to the current FALD, and that's coming straight from ASUS themselves. ~$1,200 for a PG27UQ or ~$2,000+ for a PG27UQX to get up to 33% less blooming? That's a no-brainer.

"Rather than using conventional LEDs, which are between 700 and 800 µm, our new ROG Swift PG27UQX employs much smaller Mini LEDs that are roughly 200 – 300 µm. It spans the same 27” but is divided into 576 zones, each of which uses four LEDs for a total of 2,304 LEDs. The increase means bloom can be reduced by as much as 33% compared to our already-awesome PG27UQ. "


----------



## CallsignVega

Ya, it's funny that they try to make the numbers sound bigger with the LED count. But that's inconsequential; it's the number of zones that matters. At Microcenter you can get the PG27UQ for $1,200, which is a pretty great deal, unless you absolutely must have the latest and get the slightly-more-zones PG27UQX for $800 more and with an unknown release date. 

I'll never get a non-FALD IPS again, as IPS glow and BLB are huge pet peeves of mine. (Although I may break my rule one more time and try the 38GL950G-B edge-lit IPS, as it's curved and may reduce some of the IPS glow.)


----------



## kot0005

Guess what... I went to the PAX ROG booth today; they had 3 PG35VQs set up. All 3 had flickering.


----------



## Exilon

I really wanna know who OK'd this stuff, and how many working PG35VQs actually exist.


----------



## kot0005

Exilon said:


> I really wanna know who OK'd the stuff and how many working PG35VQ actually exist.



Prolly zero, because I have seen the exact same behaviour on 4 PG35VQs. It literally took me under 5 minutes to reproduce it. It only happens in certain scenes, like on the trees in Witcher etc. So the people who are using them prolly don't have good knowledge of monitors or aren't able to see the flickering/scanlines.


----------



## CallsignVega

Firmware problem, which means it affects 100% of them. I'd bet Kot0005's paycheck on it.


----------



## tinykitten

I'm using a PG35VQ for now and, to be honest, it's a much better experience compared to my PG27UQ. I definitely noticed flickering on the Steam download bar; however, I haven't seen anything yet that would disturb me in games (RE2/RE7, SotTR, BDO, League of Legends, Alien Isolation). I consider myself quite picky when it comes to displays, so take that as you will. I'm not sitting 5cm in front of the panel looking for issues; I wanted to get an unbiased opinion, and that's pretty much it. That being said, I highly doubt I'll keep the PG35VQ; however, I'm not sure where to go for a stopgap until a 34/35" OLED. I cba to go back to backlight bleed galore, and a C9 is too large for my personal taste. Realistically speaking, first-gen OLED monitors being fault-free is wishful thinking, so there's that too.

Edit: Spoke too soon, I guess! I just came across something in RE2 which is actually worse than what I've seen in the usual Battlefield videos and such.


----------



## kot0005

tinykitten said:


> I'm using a PG35VQ for now and to be honest it's a much better experience compared to my PG27UQ. I definitely noticed flickering on the Steam download bar however I haven't seen anything yet that would disturb me in games (RE2/RE7, SotTR, BDO, League of Legends, Alien Isolation). I consider myself quite picky when it comes to displays so take that as you will. I'm not sitting 5cm in front of the panel looking for issues, I wanted to get a unbiased opinion and that's it pretty much. That being said: I highly doubt I'll keep the PG35VQ, however I'm not sure where to go to for a stopgap until 34/35" OLED. I cba to go back to backlight bleed galore, and a C9 is too large for my personal taste. Realistically speaking first gen OLED monitors being fault free is wishful thinking so there's that too.
> 
> 
> Edit: Spoke too soon I guess! I just came across something in RE2 which is actually worse than what I've seen in the usual Battlefield videos and such. https://www.youtube.com/watch?v=dqKCvNeyHak



Yes, this issue is on every monitor, 10000%. I mean, I checked 4 of them and found the problem in under 5 minutes.

You've got to turn off all that ambient light before recording a vid.


----------



## CallsignVega

All of these monitors should be returned.


----------



## tinykitten

Ye, the C9 is on the way and the PG35VQ is getting picked up tomorrow for return. 

People on r/ultrawidemasterrace file this under ghosting, comparing it to what you'd see on a 240Hz panel and, well... justifying a purchase is fine and all if that helps you sleep at night; however, this issue certainly doesn't look like ghosting to me.


----------



## CallsignVega

Ya, so why doesn't any other monitor's entire picture flicker when moving, then?


----------



## MistaSparkul

CallsignVega said:


> Ya, so why doesn't any other monitors entire picture flicker when moving then?


Exactly. Plenty of VA panels have piss poor response times but don't exhibit flickering on this level. This is beyond some typical VA panel response time problem.


----------



## Mr.Vegas

tinykitten said:


> Ye, C9 is on the way and the PG35VQ is getting picked up tomorrow for return.
> 
> 
> 
> People on r/ultrawidemasterrace file this under ghosting, comparing it to what you'd see on a 240Hz panel and well.. Justifying a purchase is fine and all if that makes you sleep at night; however this issue certainly doesn't look like ghosting to me.


Same here. After using an OLED C6P since 2016, I got used to the size and quality. I thought 35 inches would be big enough, that every game needs G-Sync, that 120+ Hz is great, and that the new DisplayPort would let me use 120Hz/12-bit [it does]. But now that I finally have the monitor, it's trash, with bugs from here to the moon [not just the flickering and scanlines; I had other issues], and it looks small by comparison. I will be getting a C9, which is cheaper too [I had to import the ASUS from abroad and paid an extra 1K USD for shipping and import tax], while the C9 I can get locally with a 3-year warranty. It soon gets G-Sync support, and I bet the next-gen NVIDIA cards will have HDMI 2.1, which means native 4K/120Hz. For now, G-Sync at 4K/60 will do; even my 2080 Ti can't push 60 in some games at Ultra settings, especially with RTX. If the range is low enough, say 30 to 60, I'll be happy, and when video cards with HDMI 2.1 come out we'll get 12-bit/120Hz 4K too.


----------



## ahnafakeef

Can we please start r/C9MasterRace?


----------



## kot0005

tinykitten said:


> Ye, C9 is on the way and the PG35VQ is getting picked up tomorrow for return.
> 
> 
> 
> People on r/ultrawidemasterrace file this under ghosting, comparing it to what you'd see on a 240Hz panel and well.. Justifying a purchase is fine and all if that makes you sleep at night; however this issue certainly doesn't look like ghosting to me.


It's not ghosting... ghosting doesn't put on a light show and make your eyes bleed. I literally couldn't play Witcher 3, and I've never had eye issues from looking at monitors for long periods. My eyes got teary within 15 minutes of playing Witcher 3 on the PG35VQ.


----------



## tinykitten

kot0005 said:


> Its not Ghosting...ghosting doesnt put on a light show and bleed your eyes..I literally couldn't play witcher 3 and I never had eye issues from looking at monitors for a long period of time. My eyes got teary in under 15mins of playing witcher 3 on the PG35VQ


It sure isn't, yet you have this one special snowflake (the one with ASUS/ROG "connections") invading almost every forum, claiming it is ghosting after his r/ultrawidemasterrace post got called out. Not sure if that guy is arguing for the sake of arguing and *needs* to be right at all costs, but it is what it is; it's the internet after all. At the end of the day, customers shouldn't be OK with issues like this.


----------



## MistaSparkul

tinykitten said:


> It sure isn't, yet you have this one special snowflake (the one with Asus/ROG "connections") invading almost every forum, claiming it is ghosting after his r/ultrawidemasterrace post got called out. Not sure if that guy is arguing for the sake of arguing and *needs* to be right at all cost in his eyes but it is what it is, it's the internet after all. At the end of the day customers shouldn't be ok with issues like this.


You've also got a special snowflake on this forum who keeps claiming that the flickering is caused by "typical slow VA panel response times". Lol some people will just believe whatever they wanna believe. Ghosting/slow VA pixels has nothing to do with this defect.


----------



## kot0005

tinykitten said:


> kot0005 said:
> 
> 
> 
> Its not Ghosting...ghosting doesnt put on a light show and bleed your eyes..I literally couldn't play witcher 3 and I never had eye issues from looking at monitors for a long period of time. My eyes got teary in under 15mins of playing witcher 3 on the PG35VQ
> 
> 
> 
> It sure isn't, yet you have this one special snowflake (the one with Asus/ROG "connections") invading almost every forum, claiming it is ghosting after his r/ultrawidemasterrace post got called out. Not sure if that guy is arguing for the sake of arguing and *needs* to be right at all cost in his eyes but it is what it is, it's the internet after all. At the end of the day customers shouldn't be ok with issues like this.
Click to expand...



Yes, he probably shouldn't have said he had deep connections with Asus R&D.

He also keeps going on about how he bought the monitor from the "ROG store", as if that even exists, and keeps talking about how he got it before everyone else.

He probably got it for free, honestly. The monitors weren't even available until last month, and he got his around the same time as the reviewers. Not a coincidence.


----------



## HyperMatrix

After over a year of owning two of these monitors, I have to say I wish I had never bought them. A few key reasons:

1) Lack of GPU power to properly use the monitor (I blame Nvidia/RTX Fail Series for this). I've stuck to playing mostly racing games or rpg games that can be played with a controller. But....when I play games I can play with a controller....I prefer playing them on my TV. And when I lower the graphics settings or internal resolution scale on FPS games to be able to play them at a fluid rate, I remember how much better it would look when I was actually playing on a 1440p monitor. I went from doing 99% of my gaming on a PC monitor, to 50% on monitor and 50% on my TV now.

2) The matte coating, and lack of black contrast layer. Comparing it to my TV, the glossy pure black that Samsung uses makes images look so much more rich and vibrant. This is sorely lacking. 

3) The FALD system. It's horrible. On the desktop when you move your mouse cursor, the entire color/brightness of the large 1" or so area around the cursor changes. There is no proper blending of the light zones either. And again, because it's missing that deep black coating that Samsung uses, it becomes impossible to block out minor bloom/haloing. So with the matte coating, and the poor FALD system, you get a worse contrast experience than you would with even a good edge lit HDR samsung TV. In FPS games there will also be a light blob around your crosshairs in the middle of the screen, or around any light source in the dark. It's really all very poorly done and despite showing off how hard it was to implement FALD with GSYNC, which I believe it was, it's a terrible experience. 

4) GSYNC/HDR switching timing. Tabbing into/out of fullscreen games takes several seconds. On my TV I know it takes a second and a half to switch to HDR mode, but this is a lot longer, and annoying as hell. Especially when you're playing a fullscreen game, and you adjust the system volume, and windows pops up the little volume hud, which disables HDR momentarily, then takes a couple seconds to get you back into the mode. 

5) 98Hz actual 10-bit HDR. I'd prefer 98Hz on the desktop, with 120/144Hz in games. Sometimes that works, and sometimes it doesn't. It gets worse when you'd rather run a game at 98Hz but the game has no refresh rate controls; then you have to go into the Nvidia control panel and set the refresh rate for that individual game. Overall it feels like a stupid compromise for such a supposedly premium product. This gripe is probably the least credible of the bunch, since the limitation was known before I even bought the monitor.
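The 98Hz figure lines up with simple DisplayPort 1.4 bandwidth arithmetic. A rough sketch (the ~6% blanking overhead is an assumed figure for reduced-blanking timings; exact timings vary):

```python
# Rough DisplayPort 1.4 bandwidth check for uncompressed 4K RGB signals.
# DP 1.4 (HBR3): 4 lanes x 8.1 Gbps, 8b/10b encoding -> 80% usable payload.
DP14_EFFECTIVE_GBPS = 4 * 8.1 * 0.8  # ~25.92 Gbps

def data_rate_gbps(width, height, hz, bits_per_channel, blanking=1.06):
    """Approximate uncompressed RGB data rate in Gbps, with an assumed
    ~6% blanking overhead (reduced-blanking timing; real timings vary)."""
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

for hz, bpc in [(144, 10), (144, 8), (98, 10)]:
    rate = data_rate_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if rate <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4"
    print(f"4K {hz}Hz {bpc}-bit RGB: {rate:.1f} Gbps -> {verdict}")
```

4K 144Hz 10-bit RGB lands around 38 Gbps, well over the link's ~25.9 Gbps, while 98Hz 10-bit just squeaks under it; that's why full-chroma 10-bit tops out near 98Hz, and 144Hz needs 8-bit and/or 4:2:2 chroma subsampling on this panel.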

Overall, I feel this is a beta product with a premium price tag. Too many corners were cut, and no thought was given to what customers wanted. It was developed under the same policy that gave us the RTX series. Lackluster crap that has potential, but is far too expensive, and falls well short of what's required for any of its advertised features to be of any use.

So to sum it up, I would NOT recommend buying this or similar monitors until a few things are changed:

- Full 144Hz 10-bit support
- Micro LED FALD
- Gloss deep black layer with anti-reflective properties like on Samsung TVs

You don't have to listen to me. But you'll be sorely disappointed, considering the price. I'd say an OLED with 120Hz and VRR would be a better investment and a far better experience if you're careful with image burn in.


----------



## saltedham

I feel ya, HyperMatrix. I sacrificed good 1440p frame rates for blooming, a few games I own that support HDR (which makes the blooming worse), and a fan I can hear inside the monitor. 4K does look nice though.

I do like that the monitor displays a console on the HDMI port properly. The PG279Q I have wouldn't unless I unplugged the monitor and plugged it back in.


----------



## kot0005

saltedham said:


> i feel ya hypermatrix. i sacrificed good 1440p frame rates for blooming, a few games i own that support hdr that makes blooming worse and a fan i can hear inside the monitor. 4k does look nice though.
> 
> i do like that the monitor displays a console on the hdmi port properly. the pg279q i have wouldnt unless i unplug the monitor then plug back on.


They fixed the fan noise in the PG35VQ, but that monitor has its own issues. It's so hard to find a good monitor these days, even when paying this much.


----------



## tinykitten

kot0005 said:


> They fixed the fan sound in PG35VQ


Not so sure about that; I found the fan noise more obnoxious than on my old PG27UQ. It probably varies unit by unit. My fan never adjusted its speed according to load, contrary to the smart fan description on the product page.


----------



## kot0005

tinykitten said:


> Not sure sure about that, I perceived the fan sound more obnoxious compared to my old PG27UQ. Probably a unit by unit case. I never had the fan adjust speed according to load, contradictory to the smart fan description on the product page.


My PG35VQ was super silent compared to my PG27UQ. The PG27UQ would also constantly ramp the fan up and down, which was so annoying. The fan ended up dying and running at full speed after 8 months of use.


----------



## animeowns

kot0005 said:


> Guess what.... i went to the pax rog booth today, they had 3 pg35vq setup. All 3 had flickering.



Did you ask them if they will release an update to fix the flickering? I just knew something bad would come of these HDR 1000 ultrawides, given how long their release was delayed.


----------



## kot0005

animeowns said:


> did you ask them if they will release an update to fix the flickering ? I just knew something bad would have to come out of these hdr 1000 ultrawides with how long the delay was for them to be released.


No, they are just regular staff.


In the meantime PA32UCX is in stock here

https://www.ebay.com.au/itm/ASUS-Pr...R-Mini-LED-Professional-Monitor-/283616707687


----------



## axiumone

kot0005 said:


> No, they are just regular staff.
> 
> 
> In the meantime PA32UCX is in stock here
> 
> https://www.ebay.com.au/itm/ASUS-Pr...R-Mini-LED-Professional-Monitor-/283616707687


This isn't the droid you're looking for. The UCX is the 60Hz version; the UCG is the 120Hz one we all want.


----------



## kot0005

axiumone said:


> This isn't the droid you're looking for. The UCX is the 60Hz version; the UCG is the 120Hz one we all want.


I think that one will be like $6000 here lol.


----------



## animeowns

kot0005 said:


> No, they are just regular staff.
> 
> 
> In the meantime PA32UCX is in stock here
> 
> https://www.ebay.com.au/itm/ASUS-Pr...R-Mini-LED-Professional-Monitor-/283616707687


That is the 60Hz version. You need to wait for this one, the Asus ProArt PA32UCG, a Mini LED 4K 120Hz monitor with HDR 1600:

https://www.anandtech.com/show/1483...mate-mini-led-4k-120-hz-monitor-with-hdr-1600


----------



## bmgjet

Has anyone had an issue with the under-screen stand LED flickering, or does anyone know if you can turn it off?


----------



## sblantipodi

sblantipodi said:


> Guys, I'm experiencing some strange behaviour on my XB273K monitor.
> 
> On some HDR YouTube videos I'm getting white crush: sometimes the white tones are so strong that I completely lose detail in the bright areas.
> 
> Let's make it "more" scientific xD
> 
> With HDR enabled on both the monitor and in Windows, in this white-clipping test video:
> 
> https://www.youtube.com/watch?v=_XRbTQk45vQ
> 
> I can see a clear difference only up to 155 nits, the sixth tile.
> 
> I don't think that's good. Am I the only one with this problem, or is it possible that this monitor is really this bad in HDR?


OK, I probably found the problem.

It seems the "SDR content appearance" slider affects my YouTube videos.

I had that slider at 40, and it caused white clipping even on YouTube HDR videos.

Why? Isn't that slider only supposed to affect SDR content?

Setting that slider to 0 completely solved all my white clipping problems :O

I don't understand why.
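For what it's worth, the Windows "SDR content appearance" slider sets the reference white level used when SDR content is composited into the HDR output. A minimal sketch of the commonly reported mapping (roughly 80 nits at slider 0, about +4 nits per step; the exact behavior varies by Windows build, and why it bled into HDR YouTube playback here is unclear):

```python
def sdr_white_nits(slider):
    """Approximate Windows 'SDR content appearance' slider (0-100) to
    SDR reference white in nits, per the commonly reported 80 + 4*step
    mapping. This is an assumption about the mapping, not a spec."""
    return 80 + 4 * slider

print(sdr_white_nits(0))   # 80 nits at the bottom of the slider
print(sdr_white_nits(40))  # 240 nits at the setting described above
```

At a setting of 40, SDR white is pushed to roughly 240 nits, so a compositor bug that applied that boost to HDR content as well would plausibly clip highlights the way described.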


----------



## mattxx88

Dunno what others do, but I keep the panel in SDR mode and only turn HDR on if I need to watch some YouTube video, since most recent games auto-enable HDR anyway when it's enabled in-game.
I think Windows is not yet ready for HDR on the desktop.


----------



## Alex24buc

Hello, I just bought a PG27UQ, but I have some dust under the display in one spot. It's not too bad. Should I replace it? I'm afraid the replacement unit will have other issues.


----------



## mattxx88

Can't find any info about it: do you know if our baby supports FreeSync?


----------



## estebangg

Lumbeechief081 said:


> *I'm f**ked!*
> 
> 
> 
> 
> I updated my monitor's firmware and now HDR isn't working properly! I can't change the Reference White (nits) because it's greyed out. Everything just looks so dark and dull in HDR now, even on the PS4 Pro. HDR content is completely unviewable for me. I wish I hadn't updated it! What am I supposed to do now? I don't want to wait over a month to receive my monitor back from ASUS's RMA department. The PG27UQ is the only monitor I have for my PC; I'll be without a monitor if I send it in for RMA. They don't even have an advance RMA where they send a replacement first and you ship the defective one back. WTF? Why isn't there a way to downgrade the firmware? This is a f**king headache!


Hey man, I've been investigating this awful update for months. I've sent my monitor to RMA several times trying to get one back with HDR working properly, and I never have. My story: I bought a used monitor like yours on Amazon. The fan was spinning loudly and I thought it was a real problem, so I sent it in for RMA. The thing is, the HDR image on that first monitor was incredibly good, and none of the monitors I've gotten back from RMA since has ever matched it. I also ran into the same firmware update problem you describe on other units I got back from RMA.

I think there are two factors here. First is the panel revision: depending on when your monitor was manufactured, the panel is different. Second is the firmware installed. From what I've seen, monitors manufactured in September 2019 had the best panels and the best HDR firmware. Just today I got another unit back from RMA and the HDR is still ****, even though the SDR image and colors are incredibly good. And guess what: when I tried to update the firmware with the tool, it was already updated.

You said you don't know when yours was built; you don't need to remember it, since the month and year of manufacture are on the serial number sticker. I think this is my 5th PG27UQ. Man, I can't get back that amazing HDR I had on the first one 😓 When I read your message I laughed a lot, because it's exactly what happened to me. I probably have much more information about all this than you do.


----------



## Toothrot

As soon as I turn on HDR and start CoD, for example, the screen restarts every few seconds and doesn't stop. Has anyone else had this problem?
Hoping for your help.


----------



## Fanu

I'm sure none of you wants to hear this, but this monitor is a prime example of why not to buy the first generation of any (expensive) product, especially one with moving parts.
There are users who don't have any issues with this monitor, but there are plenty who do, and they can do nothing about it (even after an RMA they still receive a subpar unit).

HDR gaming monitors are still a mess in the PC world. Until OLED or microLED monitors are released, there is no point in spending this much money on current-gen monitors.


----------



## Toothrot

Fanu said:


> I'm sure none of you wants to hear this, but this monitor is prime example of why not to buy first generation of any (expensive) product - especially one with moving parts..
> there are users who don't have any issues with this monitor, but there are plenty who do and can do nothing about it (even when RMAd they still receive subpar unit..)
> 
> HDR gaming monitors are still a mess in pc world - until OLED or microLED monitors are released there is no point in spending so much money on current gen monitors


I was looking forward to the HDR function of the screen and it just doesn't work.
When I send the screen in, I have to wait forever to get it back. 
I don't know why this monitor is so well rated.


----------



## dboythagr8

Does anybody know the size of the screws used to hold the monitor on the PG27UQ stand? I removed it from the stand to use a VESA mount, and of course now that I've changed my setup I can't locate the original screws to put it back on the stand.


----------



## pat182

Three years with mine, and it's still the best monitor ever. The only thing is that using the power button can cause some weird booting issue, so I just let the PC shut down and the monitor goes to sleep anyway. HDR still looks amazing; I do regular nit testing and it still holds up to 1300 nits.


----------



## pat182

I guess it's time to sell for the PG32UQX.


----------



## HyperMatrix

pat182 said:


> I guess it's time to sell for the PG32UQX.


I dunno. Not a substantial upgrade, imo. It's $3000 USD and has just 1152 backlight zones across 32 inches. The iPad Pro has 2,500 zones (10,000 individual LEDs, though the controller probably can't manage that many zones yet) in 12.9 inches, so the iPad Pro's backlight zones are about 5mm across. Not to mention the iPad Pro also uses a nice glossy glass screen, which enhances the image even further.

In contrast, the PG32UQX's backlight zones are 16mm across. In terms of area, that means each zone on this monitor is 10x bigger than the zones on the iPad Pro. 1152-zone backlight technology is already old, and it doesn't address the issues we have with the first monitor. The only real benefits would be the 32" screen size, if you prefer that (and I agree it's better), along with DSC being able to maintain 144Hz without a noticeable drop in image quality. HDR1400 has no benefit unless you're somehow already playing with your backlight maxed out in HDR mode and feel like speeding up the process of macular degeneration.

Just as a quick recap...these are the advantages of paying $3000 for the new monitor. Some may find it worthwhile. Just listing it to make it easier to decide:


- Larger screen size (32" vs 27")
- Significantly slimmer bezels
- Somewhat smaller backlight zones (23mm down to 16mm, or roughly half the size in terms of area)
- DSC to enable 144Hz without overcompression
- VRR through HDMI for consoles/etc.

For me, I'll consider it if the price drops to about $2000 USD. Right now this is a monitor using 2-3 year old technology but carrying cutting-edge pricing. Of course the preorders have already sold out, because right now anything will sell. But it's not a great purchase if you already own the 27" model, unless you have the money to upgrade frequently, because there is likely to be an upgraded model next year with DisplayPort 2 support and higher refresh rates to match next-gen cards, likely with more backlight zones as well.

But again...if you can afford it and money isn't an issue...it is an upgrade over the 27" model. How much disposable income you have will determine whether it's a worthwhile purchase or not.
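The zone-size figures above check out with basic geometry. A quick sketch (assuming square zones; the PG27UQ's 384-zone FALD is per its spec, and 2,500 zones for the iPad Pro is the figure quoted above):

```python
import math

def zone_pitch_mm(diagonal_in, aspect_w, aspect_h, zones):
    """Side length in mm of one (assumed square) backlight zone, computed
    from the screen diagonal, aspect ratio, and zone count."""
    d = math.hypot(aspect_w, aspect_h)
    width_mm = diagonal_in * 25.4 * aspect_w / d
    height_mm = diagonal_in * 25.4 * aspect_h / d
    return math.sqrt(width_mm * height_mm / zones)

pg27uq = zone_pitch_mm(27, 16, 9, 384)      # ~23 mm per zone
pg32uqx = zone_pitch_mm(32, 16, 9, 1152)    # ~16 mm per zone
ipad = zone_pitch_mm(12.9, 4, 3, 2500)      # ~4.5 mm per zone
print(f"{pg27uq:.1f} {pg32uqx:.1f} {ipad:.1f}")
```

The area ratio (16mm/4.5mm)² comes out to roughly 12, consistent with the "10x bigger than the iPad Pro's zones" claim, and (16/23)² ≈ 0.48 matches the "roughly half the area" comparison against the PG27UQ.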


----------



## pat182

HyperMatrix said:


> I dunno. Not a substantial upgrade imo. $3000 USD, and has just 1152 backlight zones in 32 inches. iPad pro has 2500 zones (10,000 individual LEDs but controller probably can't handle managing that many zones yet) in 12.9" So the iPad Pro backlight zones are about 5mm in diameter. Not to mention the iPad Pro is also using a nice glossy glass screen which enhances the image even further.
> 
> In contrast, the PG32UQX backlight zones are 16mm in diameter. In terms of area/size difference, that means each zone on this monitor is 10x bigger than the zones on the iPad Pro. 1152 backlight zone technology is already old and it doesn't address the issues we have with the first monitor. The only real benefit would be if you prefer the 32" screen size (which I agree would be better), along with DSC being able to maintain 144Hz without a noticeable drop in image quality. Being HDR1400 has no benefit unless you're somehow playing with your backlight maxed out in HDR mode already and feel you'd like to speed up the process of macular degeneration.
> 
> Just as a quick recap...these are the advantages of paying $3000 for the new monitor. Some may find it worthwhile. Just listing it to make it easier to decide:
> 
> 
> Larger screen size (32" vs 27")
> Significantly slimmer bezels
> Somewhat smaller backlight zones (23mm diameter to 16mm diameter, or roughly half the size in terms of area)
> DSC to enable 144Hz without overcompression
> VRR through HDMI for consoles/etc
> 
> For me...I'll consider it if the price drops to about $2000 USD. Right now, this is a monitor using 2-3 year old technology but carrying cutting edge pricing. Of course the preorders have already sold out because right now anything will sell. But it's not a great purchase if you already own the 27" model. Unless you have the money to upgrade frequently. Because there is likely to be an upgraded model next year with Displayport 2 support and higher refresh rates compatible with next gen cards cards. Likely with more backlight zones as well.
> 
> But again...if you can afford it and money isn't an issue...it is an upgrade over the 27" model. How much disposable income you have will determine whether it's a worthwhile purchase or not.


Well, I'll only get it for the size increase, and only if I can sell mine for $2k CAD. If not, I won't change.


----------



## HyperMatrix

pat182 said:


> well, ill only get it for the size increase and if i can sell mine for 2kcad$ if not, wont change


Over the last few months several have sold on Kijiji for $1000 CAD where I live. If you can get $2000 CAD for it then yeah I'd definitely do it.


----------



## pat182

HyperMatrix said:


> Over the last few months several have sold on Kijiji for $1000 CAD where I live. If you can get $2000 CAD for it then yeah I'd definitely do it.


That's kinda low for the best monitor in town; it ain't dropping 75% of its value in 3 years.


----------



## HyperMatrix

pat182 said:


> thats kinda low for the best monitor in town, aint dropping 75% value in 3 year


I monitored (pun. giggity) it over the past few months, both in Calgary and Edmonton. Some had it listed at $1200 and the ad was up for weeks; they ended up selling for $1000. I know because I told a couple of friends to buy them at $1000. I think even $1200 is a bargain, but most people aren't buying a monitor they can't run, and if they had the money for a card like the 3090, they probably would have bought this monitor earlier. It's also worth noting that Memory Express and others had the monitor on sale for $1600-1800 CAD over the past couple of years, and Microcenter currently lists it at $1200 USD (just under $1500 CAD).

Considering what you get for $1200 USD, I can't see the justification for the extra $1800 USD to get the features I mentioned above. If it were using a denser backlight like the iPad Pro's, I'd buy it immediately. But alas, as I mentioned, each of its backlight zones is 10x bigger than on the iPad Pro. The iPad Pro gives you something close to OLED blacks and contrast; this will just give slightly less halo/glow. And the problem is, the PG27UQ has MASSIVE halo/glow in dark HDR scenes (like the first level of Metro Exodus), to the point where it fades out the actual content on the screen. So the modest reduction in zone size on the new monitor is a slight improvement, but 90% of the problem remains. And as I mentioned, still being limited to 144Hz is unfortunate.


----------



## mattxx88




----------



## HyperMatrix

mattxx88 said:


>


Yup. As expected. Still massive halo around even small objects in dark scenes. Like crosshairs in a dark area. It's even worse in HDR. Not worth the asking price. WTB the new iPad Pro as a 27-32" monitor pl0x.  Either that or OLED.


----------



## kx11

So this is the $3000 monitor? With HDMI 2.0?!


----------



## pat182

HyperMatrix said:


> I monitored (pun. giggity) it over the past few months both in Calgary and Edmonton. Some had it listed at $1200 and the ad was up for weeks. Ended up selling for $1000. I know because I told a couple friends to buy them for $1000. I think even $1200 is a bargain but most people aren't buying a monitor they can't run.  And if they had the money for a card like the 3090, they probably would have bought this monitor earlier. Also important to note that Memoryexpress and others had the monitor on sale for between $1600-1800 CAD over the past couple of years and Microcenter currently has it listed at $1200 USD (just under $1500 CAD). Considering what you get for $1200 USD....I can't see the justification in the extra $1800 USD to get the features I mentioned above. If it were using a more dense backlight system like the iPad Pro, I'd buy it immediately. But alas...as I mentioned...each of its backlight zones is 10x bigger than on the iPad Pro. So the iPad Pro will give you something similar to OLED blacks/contrast. This will give slightly less halo/glow. And the problem is...the PG27UQ has MASSIVE halo/glow in dark HDR scenes (like the first level of Metro Exodus) to the point where it fades out the actual content on the screen. So the slight reduction in zone size with this new monitor is a slight improvement, but 90% of the problem still remains. And as I mentioned...still 144Hz limited is unfortunate.


Overall I'm still happy with it. The only real glow is in loading screens; in game, even when it does bloom a bit, you could argue it looks like natural bloom haha.

Some dark scenes can be problematic, but in general it's a solid HDR experience.


----------



## Kashtan

My PG27UQ's fans started making noise. It's unpleasant, like a vacuum cleaner. After an hour of this torture it becomes a little quieter, but it still runs much louder than anything else.
I tried to remove the back panel, but nothing works.
I googled the fan noise issue for this model. People change the thermal paste, swap fans, even install water cooling, but no one has ever written that they had a problem removing the back panel.
And I have this problem.
I even bought a laptop repair kit with tools for releasing latches, retainers, clamps and more. They don't fit into the gap; they just scratch and scratch the monitor's plastic. This makes no sense to me. What am I doing wrong? How do I remove the back panel to get to those damn coolers?
When I paid that kind of money, I never thought I would have problems, let alone this one.


----------



## Pulsar4K

Blackvette94 said:


> Here are the pics of removing the anti-glare on the PG27UQ! By far the most difficult monitor or TV I have ever taken apart, and I have done maybe 25-30 now.
> 
> The anti glare came off after 3hrs of damp paper towels with distilled water. No glue residue left over, just a pristine clear glass screen 🙂
> 
> Benefits of this mod:
> 
> - Significant clarity due to high-PPI 4K at 27 inches
> - Significant increase in brightness
> - Contrast increase is substantial
> - Blacks look liquid now and the picture overall looks like looking out a window :0
> 
> You need special plastic tools to open the case without damaging it, the ag filter is so very fine and cut in the exact shape of the polarizer that you have to be very careful when removing it!
> 
> I would be glad to do this mod for others on the x27 and pg27uq but I won’t be doing it for free due to the high degree of difficulty. This mod makes this display look next gen and shame on Asus for at least not giving us an option to have glossy vs matte 😞
> 
> Shame on Vega and l88bastard for not believing me that I did this :’(
> 
> Now onto the pics:


I also dislike AG coatings, so I've already removed the matte coating from several monitors (like the PG278Q and some older ones). I already know how much water to use, and how to peel off only the lightly transparent AG coating and not the darker tinted polarizer.
But as you probably know, there is sometimes a problem with the polarizer "cracking" some time after the coating is removed. The tiny microcracks lead to small colored lines, since the polarizer is no longer smooth. I had this problem on the PG278Q and on one very old Eizo VA monitor; on the Eizo it only developed after some hours of use. I didn't even try to clean that monitor (I know cleaning the warm panel with a damp cloth can cause this).
So I wanted to know whether you had similar issues on the PG27UQ, or whether you were able to keep the pristine picture quality for a long time?
And do you have any tips for the, as you said, complicated disassembly of the monitor?
Sorry for my bad English.
Thanks a lot!


----------

