
>buy 15 trillion billion different “gsync compatible” monitors
>mfw they all flicker
Please recommend me a good 1440p 165Hz G-Sync monitor that doesn't have brightness flickering issues, please. I'm at my wit's end.


  1. 2 years ago
    Anonymous

    idk, my 27GN800-B works just fine; buy the more expensive 83A version though, I hear it's good

  2. 2 years ago
    Anonymous

    I play many games at 4k on a 144hz panel and have never felt the need for freesync or gsync in any game

    • 2 years ago
      Anonymous

      Get your eyes checked. Properly implemented VRR will make low/mid FPS feel much better, especially when your GPU is struggling (e.g. at 4k).

      https://i.imgur.com/OD8KA4Y.jpg

      >buy 15 trillion billion different “gsync compatible” monitors
      >mfw they all flicker
      Please recommend me a good 1440p 165Hz G-Sync monitor that doesn't have brightness flickering issues, please. I'm at my wit's end.

      I have a Lenovo Y27Q-20. Same panel as the LG 27GL850. No issues with the latest patch. However, this recommendation is the best option:

      >ASUS VG27AQL1A is 27" 1440p 170Hz G-SYNC Compatible, but has no issues with brightness. It also has ELMB Sync, which is ULMB that works with VRR, making it superior to every "hard G-SYNC" monitor.
      >Just disable HDR, since it's a meme (in general and on this monitor).

      • 2 years ago
        Anonymous

        >Get your eyes checked. Properly implemented VRR will make low/mid FPS feel much better, especially when your GPU is struggling (e.g. at 4k).
        Not even just low FPS. It removes the added latency of vsync without causing tearing, no matter the framerate, even at 160 FPS on a 165Hz panel. It also smooths out framerate swings, even between 100 and 140 FPS, without the added latency that frame-smoothing techniques introduce.

        There's simply no reason not to use VRR, which is why it's now part of the VESA standard too; it just started out as G-Sync and FreeSync.
        The only reason fixed refresh was ever a thing is legacy baggage from CRTs (even though these days you can technically run some CRTs as a VRR display).
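
        To put rough numbers on the pacing point (my own illustrative sketch, assuming a steady ~110 FPS render rate on a 144Hz panel, not anything measured): with vsync a finished frame has to wait for the next scanout, so displayed frame times alternate between one and two refresh intervals, while VRR just shows each frame as it completes.

```python
# Rough sketch: frame pacing with vsync on a fixed 144 Hz panel vs. VRR.
# Illustrative numbers only -- a steady ~110 fps render rate is assumed, and
# swap-chain back-pressure from real vsync is ignored.

REFRESH_HZ = 144
SCANOUT = 1.0 / REFRESH_HZ        # fixed refresh interval (s)
RENDER_TIME = 1.0 / 110           # ~110 fps render time (s)

def vsync_display_times(n_frames: int) -> list[float]:
    """With vsync, a finished frame waits for the next scanout boundary."""
    times, ready, last_shown = [], 0.0, 0.0
    for _ in range(n_frames):
        ready += RENDER_TIME
        shown = ((ready // SCANOUT) + 1) * SCANOUT   # next boundary after ready
        times.append(shown - last_shown)
        last_shown = shown
    return times

def vrr_display_times(n_frames: int) -> list[float]:
    """With VRR, the panel refreshes when the frame is ready (within its range)."""
    return [RENDER_TIME] * n_frames

if __name__ == "__main__":
    ms = lambda ts: [round(t * 1000, 1) for t in ts]
    print("vsync displayed frame times (ms):", ms(vsync_display_times(10)))
    print("VRR   displayed frame times (ms):", ms(vrr_display_times(10)))
    # vsync alternates between ~6.9 ms and ~13.9 ms (visible judder);
    # VRR holds a steady ~9.1 ms with no tearing and no waiting.
```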

      • 2 years ago
        Anonymous

        >especially when your GPU is struggling (e.g. at 4k).
        well, I get no screen tearing whatsoever, and games like GTA V run at 40-60fps on my RX 580; even with the panel at 144Hz my mouse and everything are buttery smooth and the game just looks normal

        i can't go back to v-sync or screen tearing, I don't know how you do it lmao

        Yeah, same. I had CrossFire 7950s back in the day and had to run vsync, but since going to a single GPU I have never seen screen tearing. Maybe it's a slow-CPU issue, or an HDD vs SSD kind of problem.

        [...]
        t. poorgay that has never used g-sync

        I bet you don't even have a strobing backlight

        Same here. But that's because I play at a forced lower fps. If I run unlimited fps then I need some sort of VRR.

        even in tf2 or csgo I don't get tearing, what games r u playing bro?

    • 2 years ago
      Anonymous

      >falling for nvidia gimmicks in the year of 2012+10

      t. poorgay that has never used g-sync

    • 2 years ago
      Anonymous

      Same here. But that's because I play at a forced lower fps. If I run unlimited fps then I need some sort of VRR.

    • 2 years ago
      Anonymous

      i can't go back to v-sync or screen tearing, I don't know how you do it lmao

      • 2 years ago
        Anonymous

        >i can't go back to v-sync or screen tearing
        I could, but only because I was upgrading from a basic monitor to an OLED TV.

  3. 2 years ago
    Anonymous

    My monitor doesn't have any VRR and I don't see any screen tearing

  4. 2 years ago
    Anonymous

    >falling for nvidia gimmicks in the year of 2012+10

    • 2 years ago
      Anonymous

      For the past 5 years, all "G-Sync" monitors have actually been FreeSync: it requires no license fees, supports newer DP and HDMI revisions than the last G-Sync chipset, and Nvidia added seamless support for it under its G-Sync settings.

      • 2 years ago
        Anonymous

        gsync compatible != gsync
        FreeSync still has refresh-range limitations that the G-Sync module doesn't have.
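
        The range limitation mostly bites below the panel's VRR floor: a module tracks very low rates natively, while plain Adaptive-Sync relies on the driver doing low framerate compensation (LFC), repeating each frame at a multiple so the panel stays inside its window. A rough sketch of that idea, assuming a made-up 48-165Hz range:

```python
# Rough sketch of low framerate compensation (LFC) on an Adaptive-Sync panel.
# Numbers are illustrative: a 48-165 Hz VRR window is assumed.

VRR_MIN_HZ = 48
VRR_MAX_HZ = 165

def lfc_refresh_rate(game_fps: float) -> tuple[float, int]:
    """Return (panel refresh rate, frame repeats) that keeps the panel in range.

    If the game FPS is inside the window, the panel just follows it.
    Below the floor, each frame is shown multiple times so the refresh
    rate stays >= VRR_MIN_HZ.
    """
    if game_fps >= VRR_MIN_HZ:
        return min(game_fps, VRR_MAX_HZ), 1
    repeats = 2
    while game_fps * repeats < VRR_MIN_HZ:
        repeats += 1
    return game_fps * repeats, repeats

if __name__ == "__main__":
    for fps in (144, 90, 47, 30, 5):
        hz, rep = lfc_refresh_rate(fps)
        print(f"game {fps:>3} fps -> panel {hz:6.1f} Hz (each frame shown {rep}x)")
    # 5 fps -> panel 50 Hz with 10 repeats; without LFC the panel would fall
    # out of its VRR window, which is where a lot of flicker shows up.
```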

  5. 2 years ago
    Anonymous

    >165hz gsync
    Lol. Lmao even. GSync was never intended to go that high, that is a frick load of processing for the monitor to have to do.

  6. 2 years ago
    Anonymous

    have expensive gsync monitor (does not flicker)
    using flickerbox instead

    • 2 years ago
      Anonymous

      Is that a G15?
      You have good taste, sir.

      • 2 years ago
        Anonymous

        the domes were dogshit though; wish the G815 had kept the LCD

  7. 2 years ago
    Anonymous

    never had a problem out of my PG278QR

  8. 2 years ago
    Anonymous

    Hijacking your thread, sorry OP

    Realistically, how much input latency does conversion from DisplayPort to VGA add? I have an ancient LCD that only takes VGA, but my graphics card doesn't have any VGA or passively convertible ports.

    • 2 years ago
      Anonymous

      buy a new monitor

    • 2 years ago
      Anonymous

      remove your GPU and use the onboard graphics if your mobo has a VGA output.

    • 2 years ago
      Anonymous

      >ancient LCD
      You don't need to worry about input latency from the converter. The display already sucks.

    • 2 years ago
      Anonymous

      If it was designed by somebody competent, a few tens of microseconds at most. If not, potentially more than a frame. You could go with DVI or HDMI to VGA instead, where the obvious implementation has near zero latency, and it takes serious effort to frick it up.
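
      To put ballpark numbers on that (my own illustrative sketch, assuming 1280x1024 at 60Hz and a converter that either re-clocks the signal line by line or lazily buffers a whole frame):

```python
# Ballpark latency of a DisplayPort -> VGA converter at 1280x1024 @ 60 Hz.
# Assumed, illustrative timings -- not measurements of any specific adapter.

REFRESH_HZ = 60
TOTAL_LINES = 1066          # active + blanking lines (approx. for 1280x1024@60)

frame_time_ms = 1000.0 / REFRESH_HZ
line_time_us = frame_time_ms * 1000.0 / TOTAL_LINES

# A sane converter re-clocks the video with a buffer of a few scanlines.
line_buffered_latency_us = 4 * line_time_us

# A lazy design that stores a full frame before converting adds ~1 frame.
frame_buffered_latency_ms = frame_time_ms

print(f"one scanline:             {line_time_us:6.1f} us")
print(f"line-buffered converter:  {line_buffered_latency_us:6.1f} us (tens of microseconds)")
print(f"frame-buffered converter: {frame_buffered_latency_ms:6.1f} ms (a whole frame)")
```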

    • 2 years ago
      Anonymous

      nak

    • 2 years ago
      Anonymous

      DisplayPort to VGA adds no latency, but your ancient VGA-only LCD will have several hundred milliseconds of input delay

    • 2 years ago
      Anonymous

      Zero; it's a DAC that doesn't even have a buffer.

  9. 2 years ago
    Anonymous

    What flicker are you guys talking about

    • 2 years ago
      Anonymous

      • 2 years ago
        Anonymous

        Is that not normal?

        • 2 years ago
          Anonymous

          ..no? why would you think that's normal

  10. 2 years ago
    Anonymous

    >>mfw they all flicker
    lost

  11. 2 years ago
    Anonymous

    ya i have a gsync monitor but i never use gsync. but im drinking a pepsi max right now and eating some chocolate. its pretty yummy. can definitely recommend it to you guys. later ill be picking up an overheating ps3 for me to delid so that should be fun. hope you guys are having a good day today cuz i know i am.

    • 2 years ago
      Anonymous

      Watch out anon, I've killed my poor PS3 while attempting a delid, bought another dead PS3 for replacement parts and it too was killed by a failed delid

  12. 2 years ago
    Anonymous

    XB271HUbmiprz

  13. 2 years ago
    Anonymous

    Newer Freesync monitors work perfectly with Gsync. Stop overpaying the israelite for a fricking label.

    • 2 years ago
      Anonymous

      I've had FreeSync 1 and 2 work perfectly for 4 years; never had G-Sync in my life, it's all Nvidia garbage

  14. 2 years ago
    Anonymous

    I've been using a Dell S2417DG for a few years and the only time I notice a flicker is during TF2 loading screens if I move the mouse, since the game limits output to like 5fps or so.
    Mind you, it's a TN panel and probably not worth the full $400+ price; I got mine for $160.

    • 2 years ago
      Anonymous

      As a TN panel it has
      >100% sRGB coverage
      >no backlight bleed/IPS glow
      >the best implementation of ULMB on an LCD to date
      >single-setting dynamic overdrive with G-Sync hardware
      It's literally the best gaming monitor ever made. Accept no substitutes.

      If you are interested in fixing the gamma and banding issues, I can explain further in another post.

      • 2 years ago
        Anonymous

        Is there a color profile file for this monitor?

      • 2 years ago
        Anonymous

        >If you are interested in fixing the gamma and banding issues, I can explain further in another post.
        I'm interested

        • 2 years ago
          Anonymous

          Buy a 10bit monitor and consume 10bit content

          • 2 years ago
            Anonymous

            one (1) video game supports 10-bit rendering, and it's fricking Alien Isolation of all things
            so weird, since it shouldn't even be that hard to implement and 10-bit monitors are so common now

          • 2 years ago
            Anonymous

            In video games it matters less, since you produce the video data at a high bitrate locally on your GPU.
            Banding looks like garbage when you download/stream 8-bit, low-bitrate video from the internet

          • 2 years ago
            Anonymous

            >In video games it matters less, since you produce the video data at a high bitrate locally on your GPU.
            for this reason it arguably matters more
            both bit depth and bandwidth contribute to banding, but with games, since there is no compression, it's all banding
            honestly, why we are using this 10-bit half-measure when 12-bit perceptual quantization could finally solve banding for good is beyond me

          • 2 years ago
            Anonymous

            8-bit + FRC already solves it, so you can't see it in properly made video content

          • 2 years ago
            Anonymous

            FRC is a method of getting two extra bits from dithering on a monitor, so I'm not sure why you think it's meaningfully distinct from true 10 bit in this case.
            And no, the Barten ramp means that banding is still possible with 10 bit PQ.
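
            If anyone wants the banding argument in numbers instead of feels, here's a small sketch (my own, purely illustrative: a 1001-sample ramp and 4-frame FRC are assumed) of how many distinct levels survive quantization at 8-bit, 10-bit, and 8-bit plus FRC-style temporal dithering:

```python
# Tiny sketch of banding vs. bit depth, and how FRC-style temporal dithering
# fakes intermediate levels. Illustrative only; no display model or PQ here.

def quantize(value: float, bits: int) -> int:
    """Quantize a 0..1 value to an integer code of the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

def frc_average(value: float, frames: int = 4) -> float:
    """Approximate a 0..1 value with 8-bit codes dithered over several frames."""
    base = int(value * 255)                    # lower 8-bit code
    frac = value * 255 - base                  # how far toward the next code
    high_frames = round(frac * frames)         # frames shown at base + 1
    shown = [base + 1] * high_frames + [base] * (frames - high_frames)
    return sum(shown) / frames / 255           # time-averaged level

ramp = [i / 1000 for i in range(1001)]         # smooth 0..1 gradient

bands_8   = len({quantize(v, 8)  for v in ramp})
bands_10  = len({quantize(v, 10) for v in ramp})
bands_frc = len({round(frc_average(v), 6) for v in ramp})

print(f"distinct levels across the ramp: 8-bit: {bands_8}, "
      f"10-bit: {bands_10}, 8-bit+FRC: {bands_frc}")
# 8-bit resolves ~256 coarse steps across the ramp; 10-bit and 8-bit+FRC
# (4-frame dithering, i.e. roughly two extra effective bits) both resolve
# every sample of this ramp -- the "two extra bits from dithering" point,
# minus the temporal flicker that FRC adds.
```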

          • 2 years ago
            Anonymous

            it's all bit depth* of course

      • 2 years ago
        Anonymous

        >>the best implementation of ULMB on an LCD to date
        Better than the ViewSonics and BenQs?

  15. 2 years ago
    Anonymous

    ASUS VG27AQL1A is 27" 1440p 170Hz G-SYNC Compatible, but has no issues with brightness. It also has ELMB Sync, which is ULMB that works with VRR, making it superior to every "hard G-SYNC" monitor.
    Just disable HDR, since it's a meme (in general and on this monitor).

    • 2 years ago
      Anonymous

      i do not trust ASUS anymore

  16. 2 years ago
    Anonymous

    I use this one: https://www.dell.com/en-us/shop/alienware-27-gaming-monitor-aw2721d/apd/210-axsw/monitors-monitor-accessories

    No flickering at all, the colors are beautiful and the mount is really nice.

  17. 2 years ago
    Anonymous

    Variable refresh rate is for poorgays

    • 2 years ago
      Anonymous

      wat

  18. 2 years ago
    Anonymous

    Nvidia is basically a walking antitrust case

    • 2 years ago
      Anonymous

      Gsync came first, you know that, right?

      • 2 years ago
        Anonymous

        >be member of VESA
        >make vendor-locked competitor of a VESA standard before standard becomes fully implemented

        • 2 years ago
          Anonymous

          what the frick are you talking about moron

          • 2 years ago
            Anonymous

            You wouldn't get it

        • 2 years ago
          Anonymous

          Adaptive sync was simply a neat trick that resulted from the development of eDP alongside the DisplayPort 1.2 spec within VESA. Nvidia saw the marketing potential and wanted to seize the "first mover" advantage by making a middleware hack before the DisplayPort 1.2 spec was finalized.
          The result is the now-deprecated "G-Sync".
          FreeSync and G-Sync Compatible were just implementations of adaptive sync that came from the DisplayPort 1.2 spec.

          • 2 years ago
            Anonymous

            Exactly

          • 2 years ago
            Anonymous

            but G-Sync hardware modules work better with LCDs that require variable overdrive than FreeSync/VRR does through the VESA standard. It's not just hijacking; it's literally an improvement.

          • 2 years ago
            Anonymous

            Wrong, modules are obsolete. DisplayPort 1.2 and newer do the exact same thing. Nvidia is quietly ditching them as they are hitting a bandwidth wall with 4K resolutions at high refresh rates and expanded color spaces.
            They pretty much usurped the FreeSync branding with "G-Sync Compatible".

            >Tearing is only obvious when your frame rates and refresh rates are too low. Tearing at 1000fps on 360Hz is barely noticeable.

            Completely and utterly wrong. Vsync was developed to eliminate screen tearing, but it comes at the cost of significantly increased input lag. Adaptive sync is a far superior solution to the problem, with only a modest increase in input lag.

  19. 2 years ago
    Anonymous

    BenQ XL2720Z. Not 1440p or 165Hz, but no flickering. My eyes are not very sharp, so 1080p at 27" is no problem for me.

  20. 2 years ago
    Anonymous

    gsync is nice if you get a monitor that actually fricking works as advertised

  21. 2 years ago
    Anonymous

    Reminder that flicker (precisely timed strobing, not shitty PWM backlights) is absolutely necessary for good motion clarity
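
    The usual back-of-the-envelope for that claim (my numbers, purely illustrative): on a sample-and-hold panel the eye tracks motion while the frame stays lit, so perceived smear is roughly tracking speed times persistence, and strobing shrinks the persistence regardless of refresh rate.

```python
# Back-of-the-envelope: perceived motion blur ~= eye-tracking speed * persistence.
# An object scrolling at 1000 px/s, tracked by the eye, is assumed for illustration.

SPEED_PX_PER_S = 1000.0

def blur_px(persistence_ms: float) -> float:
    """Smear width in pixels for a given per-frame persistence."""
    return SPEED_PX_PER_S * persistence_ms / 1000.0

cases = {
    "60 Hz sample-and-hold (16.7 ms)": 16.7,
    "165 Hz sample-and-hold (6.1 ms)":  6.1,
    "strobed backlight, 1 ms pulse":    1.0,
}

for name, persistence in cases.items():
    print(f"{name:33s} -> ~{blur_px(persistence):4.1f} px of smear")
# A 1 ms strobe beats even a fast sample-and-hold panel on motion clarity,
# which is the point of ULMB/ELMB-style backlight strobing.
```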

    • 2 years ago
      Anonymous

      Reminder that 60hz CRTards should be banned.

    • 2 years ago
      Anonymous

      >(precisely timed strobing, not shitty PWM backlights)
      Did you recently figure out that those are two different things or something? Who in their right mind would even need to mention something like that unless they are moronic? Lol.
      Also frick off with your moronic off-topic shit, not related to OP. Make your own thread.

      • 2 years ago
        Anonymous

        You must be new here. This website is full of anons who are quick to make dumb assumptions like that.
        Also yes, it's just a shitpost, welcome to IQfy. Lurk more before posting.

      • 2 years ago
        Anonymous

        We had an LCDgay a couple threads ago who thought persistence didn't matter and really seemed to conflate the two.

    • 2 years ago
      Anonymous

      Reminder that flicker is a workaround for inadequate framerates and also looks bad.

      • 2 years ago
        Anonymous

        1000 is the first "adequate" framerate
        Let me know when that's doable in any game.

        • 2 years ago
          Anonymous

          It's doable in minecraft

  22. 2 years ago
    Anonymous

    I have a 144Hz 1440p ultrawide panel behind my 2070 Super; it has FreeSync and works fine in G-Sync Compatible mode.

  23. 2 years ago
    Anonymous

    >compatible
    there's your problem; you have to get a monitor that just says G-SYNC, i.e. shell out enough for a certified monitor with the module.

    • 2 years ago
      Anonymous

      do they even make monitors with modules anymore? I thought it's all just based on VESA now

      • 2 years ago
        Anonymous

        they do, you probably just aren't looking at expensive enough ones. It's a shame they're all horribly designed though

  24. 2 years ago
    Anonymous

    Yeah I know what you mean OP; mine flickers like crazy too. Usually at a frequency of between about one and 144 hz.

  25. 2 years ago
    Anonymous

    frick dont get me started op, i got so many problems with all this new tech too.. trying different cables, drivers, and other shit all the time. cant run 4k or hdr right half the time.. fricking flickering and even pc crashing. god damn never get the refresh rates its supposed to. why cant they get this shit to work already? oh thats right, because theyre working on the new gimmick already

  26. 2 years ago
    Anonymous

    Also, used with both AMD and Nvidia GPUs, FreeSync worked perfectly.
    Never used G-Sync and never will

  27. 2 years ago
    Anonymous

    Half of you morons have no idea what brightness flicker means and the other half are so entrenched in their AMD shill campaign that they can’t even think straight. Thanks for reminding me why I rarely come to this joke of a board.

  28. 2 years ago
    Anonymous

    I never understood the gsync/freesync meme
    Just get a monitor with a high refresh rate and you will never have to turn vsync on

    • 2 years ago
      Anonymous

      Screen tearing is painfully obvious once you have seen and used adaptive sync in action and then go back to a non-adaptive-sync monitor.

      • 2 years ago
        Anonymous

        Tearing is only obvious when your frame rates and refresh rates are too low. Tearing at 1000fps on 360Hz is barely noticeable.
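
        The arithmetic behind that: the jump you see at a tear line is roughly how far the image moved between the two frames meeting there, so the discontinuity shrinks as frame rate climbs. Rough sketch with an assumed 2000 px/s pan (my numbers, not measured):

```python
# Rough sketch: the pixel offset at a tear line is about the motion per frame.
# A pan speed of 2000 px/s is assumed purely for illustration.

PAN_SPEED_PX_PER_S = 2000.0

def tear_offset_px(fps: float) -> float:
    """Horizontal jump across the tear line, in pixels, at a given frame rate."""
    return PAN_SPEED_PX_PER_S / fps

for fps in (60, 144, 360, 1000):
    print(f"{fps:>4} fps -> tear discontinuity of ~{tear_offset_px(fps):5.1f} px")
# ~33 px at 60 fps is hard to miss; ~2 px at 1000 fps is close to invisible,
# which is why tearing "goes away" when frame rate and refresh are both very high.
```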

  29. 2 years ago
    Anonymous

    DELL S2417DG

    ACCEPT NO SUBSTITUTES

  30. 2 years ago
    Anonymous

    I use a Gigabyte M27Q with a 6800 XT. FreeSync works perfectly, and the monitor has a great overdrive setting that works across the entire refresh range.

  31. 2 years ago
    Anonymous

    this is literally what gsync does lmao, just don't fricking use it moron, or make sure your framerate never drops too low because that's what causes it

    • 2 years ago
      Anonymous

      i love how confidently incorrect you dumb IQfy Black folk are

      • 2 years ago
        Anonymous

        seethe more homosexual you just didn't waste your money on a goy(money)sink monitor like i did

  32. 2 years ago
    Anonymous

    Never used VRR never will. Sounds cringe.

    • 2 years ago
      Anonymous

      This. Just get a monitor with such a high refresh rate that you can't see screen tearing.

      • 2 years ago
        Anonymous

        you are all fricking moronic it's actually unreal. why are you posting shit advice when you clearly don't know what you're talking about?

        • 2 years ago
          Anonymous

          You've literally never used an adaptive sync monitor. They're all flickering pieces of shit that people need to stop paying money for and if you disagree it's probably because you're selling them.

        • 2 years ago
          Anonymous

          Enjoy your input lag

          • 2 years ago
            Anonymous

            if you cap your framerate 2-3fps below your monitor's max refresh rate and use driver-enforced vsync with G-Sync enabled, you get lower input lag than with no vsync at all.
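
            For anyone who wants the actual numbers behind that setup, here's a small sketch (assuming a 165Hz panel and a 3fps margin purely as an example): the cap keeps every frame time just above the refresh interval, so the VRR path paces every frame and vsync back-pressure never kicks in.

```python
# Sketch of the "cap a few fps under max refresh" rule for G-Sync + vsync.
# A 165 Hz panel and a 3 fps margin are assumed as an example, not a
# recommendation for any specific monitor.

import time

PANEL_HZ = 165
MARGIN_FPS = 3
CAP_FPS = PANEL_HZ - MARGIN_FPS          # 162 fps
FRAME_BUDGET = 1.0 / CAP_FPS             # ~6.17 ms target frame time

def run_capped(frames: int = 60) -> None:
    """Toy RTSS-style limiter: sleep off the leftover budget each frame."""
    next_deadline = time.perf_counter()
    for _ in range(frames):
        next_deadline += FRAME_BUDGET
        # ... render the frame here ...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            # Frame time stays just above the 1/165 s refresh interval, so the
            # VRR path paces every frame and vsync never has to queue one.
            time.sleep(remaining)

if __name__ == "__main__":
    print(f"refresh interval: {1000 / PANEL_HZ:.2f} ms, "
          f"capped frame time: {FRAME_BUDGET * 1000:.2f} ms")
    run_capped()
```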

          • 2 years ago
            Anonymous

            You're wrong

          • 2 years ago
            Anonymous

            I'm not wrong. It's measurable and true.

            >Wrong, modules are obsolete. DisplayPort 1.2 and newer do the exact same thing. Nvidia is quietly ditching them as they are hitting a bandwidth wall with 4K resolutions at high refresh rates and expanded color spaces.
            >They pretty much usurped the FreeSync branding with "G-Sync Compatible".
            [...]
            >Completely and utterly wrong. Vsync was developed to eliminate screen tearing, but it comes at the cost of significantly increased input lag. Adaptive sync is a far superior solution to the problem, with only a modest increase in input lag.

            I'm not wrong. Modules work better because they enforce variable overdrive, whereas the VESA standard does not. A module isn't necessary for OLED, because its instantaneous response doesn't require overdrive, but for VA, IPS, or TN panels a module will always be better than the VESA standard alone.
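
            To make the variable-overdrive point concrete: LCD overdrive is tuned per refresh rate, and under VRR the refresh interval changes every frame, so a module-style implementation re-picks the overdrive strength from the instantaneous frame time instead of using one fixed setting. A hypothetical sketch of that idea; the table values and function names below are invented, not taken from any real scaler firmware.

```python
# Hypothetical sketch of variable overdrive: pick the overdrive strength from
# the instantaneous refresh rate instead of a single fixed table.
# The table values below are invented for illustration.

from bisect import bisect_left

# (refresh rate Hz, overdrive gain) pairs a scaler might be tuned with.
OVERDRIVE_TABLE = [(60, 0.4), (100, 0.6), (144, 0.8), (165, 1.0)]

def variable_overdrive(frame_time_s: float) -> float:
    """Interpolate an overdrive gain for the current (variable) frame time."""
    hz = 1.0 / frame_time_s
    rates = [r for r, _ in OVERDRIVE_TABLE]
    gains = [g for _, g in OVERDRIVE_TABLE]
    if hz <= rates[0]:
        return gains[0]
    if hz >= rates[-1]:
        return gains[-1]
    i = bisect_left(rates, hz)
    lo_r, hi_r = rates[i - 1], rates[i]
    lo_g, hi_g = gains[i - 1], gains[i]
    t = (hz - lo_r) / (hi_r - lo_r)
    return lo_g + t * (hi_g - lo_g)

if __name__ == "__main__":
    for fps in (60, 90, 120, 160):
        gain = variable_overdrive(1.0 / fps)
        print(f"{fps:>3} fps frame -> overdrive gain {gain:.2f}")
    # A fixed-overdrive monitor applies one gain regardless of frame time,
    # which is what causes overshoot/ghosting when the VRR rate swings.
```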

      • 2 years ago
        Anonymous

        That's not how that works. You'll definitely still see tearing all over the place with a higher refresh rate, just less of it. Of course you still want the higher refresh rate no matter what. But you can also control where on screen the tear line shows up by capping the framerate with RTSS or in-game; capping the framerate will at least keep the tearing confined to one area.
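
        The "keep the tear in one area" effect is just modular arithmetic: the tear sits wherever the buffer flip lands within the scanout, so a steady capped frame time makes it drift slowly and predictably, while erratic uncapped frame times scatter it across the screen. Rough sketch with invented frame times:

```python
# Rough sketch: where the tear line lands is (flip time mod refresh period),
# mapped onto the screen height. Frame times below are invented for illustration.

REFRESH_HZ = 144
PERIOD = 1.0 / REFRESH_HZ
SCREEN_LINES = 1440

def tear_positions(frame_times):
    """Scanline where each buffer flip (and hence the tear) lands."""
    positions, t = [], 0.0
    for ft in frame_times:
        t += ft
        positions.append(int((t % PERIOD) / PERIOD * SCREEN_LINES))
    return positions

# Uncapped: frame times bounce around between ~5 and ~9 ms.
uncapped = [0.005, 0.009, 0.006, 0.008, 0.007, 0.0055, 0.0085, 0.0065]
# Capped just under the refresh rate: a steady ~7.05 ms per frame.
capped = [0.00705] * 8

print("uncapped tear lines:", tear_positions(uncapped))
print("capped tear lines:  ", tear_positions(capped))
# With the cap, consecutive tears shift by a small, constant amount (a slow,
# predictable drift); uncapped, the tear jumps to a new spot every frame.
```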

      • 2 years ago
        Anonymous

        no such thing. You can even see tearing at 240hz.

  33. 2 years ago
    Anonymous

    imagine not

  34. 2 years ago
    Anonymous

    The AW2721D has the best G-Sync I've used out of the 30 or so monitors I've tried; that said, I still returned it since IPS hurts my wimpy eyes

    • 2 years ago
      Anonymous

      How does it hurt your eyes?

      • 2 years ago
        Anonymous

        I just don't like IPS panels; there's a painful kind of graininess to them, even with the backlight low.

        • 2 years ago
          Anonymous

          >look up ips panel grainy
          >mfw everyone is complaining
          oh for fricks sake, is there ANY monitor that doesn’t have a downside? why are modern monitors so fricking trash????

          • 2 years ago
            Anonymous

            Because they’re all manufactured by the same chinks

          • 2 years ago
            Anonymous

            You just have to pick the monitor type that fits best for what you want to use it for. I mostly use a VA, which means nice-looking content but a disadvantage in FPS games.

          • 2 years ago
            Anonymous

            Yeah, an OLED TV, although you will have to disable resolution scaling AND deal with a rather low PPI. Otherwise it's good across the board in picture quality as well as motion clarity.

  35. 2 years ago
    Anonymous

    Anyone with a VG27AQL1A? What’s your experience with it?

    • 2 years ago
      Anonymous

      Anyone? And where’s a good place to buy one? Amazon sucks.

  36. 2 years ago
    Anonymous

    X34 GS has no flicker

  37. 2 years ago
    Anonymous

    Life makes much more sense when you pretend the monitor industry doesn't exist, buy an OLED TV, and use that as a PC display sitting a couple meters away.

  38. 2 years ago
    Anonymous

    For me it's 1080p 24" 240hz. Nothing beats it

  39. 2 years ago
    Anonymous

    I bought the LG 32GP850 for $375 on Christmas when it was on sale. Look up the reviews and specs yourself. Highly recommend if you can get it at the right price. Never seen it flicker with my 3060 ti.

    • 2 years ago
      Anonymous

      >32GP850
      actually right now it's $396 on Amazon. A good deal, I would say, if you're not broke.
