Post by Kappa Neko on Nov 21, 2023 11:52:30 GMT
> Second, do the textures only look blurry with DLSS enabled? If so, you can try setting DLSS to "Quality" mode instead. If the textures still look too blurry on Quality mode, try turning off DLSS completely.

I'm already on Quality. It looks the same whether I turn off DLSS entirely or just frame generation. I think the textures aren't great and I've been spoiled by modded 4K textures in Bethesda games. Even vanilla Starfield has much sharper textures. But then again, I was disappointed with how downgraded W3 looked compared to the infamous E3 trailer. I guess I'm just a graphics whore, lol.

My visual enjoyment is definitely a bit tainted by the grainy and outdated-looking textures. Yes, film grain is turned off. So while the lighting is next gen, the canvas, so to speak, looks dated, and the combination is kind of weird to me.

Also, after everyone shitting on Starfield faces and cherry-picking Cyberpunk faces, I have to say Starfield faces have better skin textures. Again, Cyberpunk is grainy and not very detailed there. And Cyberpunk's random NPC faces don't really look amazing either. The problem with Starfield is that everything is highly dependent on the lighting, which ranges from great to atrocious. In good lighting, Starfield faces look like modded Skyrim faces to me. 99% of my issues with Starfield are not about visuals.

Colors and lighting are great with RT and path tracing in the city, agreed. Starfield looks super washed out in comparison. The desert I started out in as a nomad has tons of shadow glitches though, which made me think of Bethesda games, lol.

Well, obviously I'd rather play a good story with meh textures than the other way round. So far I'm loving Jackie and V's relationship. I've only just returned to the apartment, so I haven't seen shit and haven't done any actual combat. So I can't say much yet about the writing. The presentation is of course miles ahead of Starfield. Any Nintendo game has better camera work than Starfield, lol.
Post by bmwcrazy on Nov 21, 2023 17:22:05 GMT
> I'm already on Quality. It looks the same when I turn DLSS off or just frame generation. I think the textures aren't great and I've been spoiled by modded 4K textures in Bethesda games. But even vanilla Starfield has much sharper textures. So while the lighting is next gen, the canvas, so to speak, looks dated.

It's also the reason why Cyberpunk 2077 runs pretty well on GPUs with only 8GB of VRAM, even with ray tracing enabled.

For me, Starfield felt like an outdated 10-year-old Bethesda game with a fresh coat of paint, while Cyberpunk 2077 felt like a proper modern game. I could spend a couple hours marveling at the level of detail in Starfield, but after that everything else was just too clinical and boring for me. I had spent thousands of hours in games like Morrowind, Oblivion, Skyrim, Fallout 3, New Vegas, and Fallout 4. Unfortunately, that Bethesda magic was definitely missing in Starfield.
Post by slimgrin727 on Nov 21, 2023 17:24:17 GMT
Cyberpunk is generally considered one of the best-looking games on the market, so if you're getting blurry textures, something is wrong. CDPR keeps future-proofing the game by working with Nvidia and adding new tech.
Post by Kappa Neko on Nov 21, 2023 18:20:09 GMT
> For me, Starfield felt like an outdated 10-year-old Bethesda game with a fresh coat of paint, while Cyberpunk 2077 felt like a proper modern game. I could spend a couple hours marveling at the level of detail in Starfield, but after that everything else was just too clinical and boring for me. I had spent thousands of hours in games like Morrowind, Oblivion, Skyrim, Fallout 3, New Vegas, and Fallout 4. Unfortunately, that Bethesda magic was definitely missing in Starfield.

Totally agree with you there! I'll take screenshots of how my game looks and then you can tell me if this is normal or not. Maybe I'm just nitpicky.
Post by Kappa Neko on Nov 21, 2023 19:48:45 GMT
OK guys, it's partially DLSS. It looks BAD, I just realized. It puts a shimmery Vaseline coat on NPCs. It looks disgusting even on Quality settings. Whoever claims DLSS looks almost as good as native is either a liar or visually impaired... But without it I get 34 frames walking around the city. The lag is sooo bad, unplayable this way. How is anybody supposed to run path tracing when even the 4090 can't handle it?? Now if I disable path tracing I get about 50 frames native, which is doable. And it looks much better than trying to run path tracing with DLSS. However, path tracing is REALLY nice. So now I'm sad.

I googled this briefly and apparently DLSS didn't look this shit before the 2.0 update? Can anybody confirm this? Are they working on a fix? Right now I'd rather play without path tracing than endure the horrific DLSS blur. That said, the texture quality of terrain and walls and such that I was talking about before is not badly affected by DLSS. It just doesn't look too great to me.

Example of the desert terrain with DLSS: Terrain cracks are pretty pixelated. Also notice the glitched-out textures of what I think is supposed to be some rocks.

Here's a glitched neon sign, happens with and without DLSS:

A few DLSS with frame generation pics that look OK:

Here you can see how horrible faces look with DLSS: Especially the dude in the background, PS3 graphics:

This is without DLSS, MUCH better, the shimmer is gone:

DLSS:

No DLSS:

So how are you guys running path tracing without DLSS? Is it possible at all by tweaking the settings? I tried dialing a few things back but it made no difference whatsoever to my frame rate. I'm playing at 3440x1440 ultrawide, so 3K. I guess I won't be enjoying path tracing after all. *cries*
Post by bmwcrazy on Nov 21, 2023 23:05:58 GMT
> I googled this briefly and apparently DLSS didn't look this shit before the 2.0 update? Can anybody confirm this? Are they working on a fix? Right now I'd rather play without path tracing than endure the horrific DLSS blur.

That's actually not caused by DLSS. It's how the game handles the textures. The game streams textures at various resolutions based on your distance. A lot of textures are very low-res from afar and only sharpen up when you walk up and zoom in on them.

DLSS lets the game render at a lower resolution than your monitor's native resolution and then upscales the image using AI. The upscaling process runs the frame through a deep-learning-based reconstruction algorithm and reconstructs the image by adding more detail. It's a great way to gain performance without adding input latency, and it also replaces traditional anti-aliasing methods. So again, DLSS doesn't decide how the game streams its textures. The blurry textures in your screenshots are mostly the game's own doing.

DLSS 3.0, aka Frame Generation, is something completely different. It's motion interpolation, not image upscaling. It inserts a new frame between each pair of real frames, but it also increases input latency. It's very similar to the input lag you get when gaming on a modern TV with motion interpolation enabled, or when you forget to turn on the TV's game mode.

Also, the low-res textures on the ads are a bug that has been in the game for quite a while. It's a bit distracting and it really breaks the immersion for me. I hope they fix it soon.

> So how are you guys running path tracing without DLSS? Is it possible at all by tweaking the settings? I tried dialing a few things back but it made no difference whatsoever to my frame rate. I'm playing at 3440x1440 ultrawide, so 3K. I guess I won't be enjoying path tracing after all. *cries*

I don't. I use "Psycho" ray tracing settings instead. The game looks less realistic compared to path tracing, but it's brighter, more colorful, and performs way better. I play Cyberpunk on a 1440p 144hz monitor, so I can basically play through the entire game at 138 FPS without framerate drops.
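As a rough illustration of the upscaling step described above, here's a sketch of the internal resolutions DLSS renders at before reconstruction. The per-mode scale factors are the commonly cited community numbers (an assumption here), not official Nvidia specs:

```python
# Sketch: internal render resolution under DLSS upscaling.
# Scale factors below are the commonly cited ones (assumption), not official specs.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the game actually renders at before AI reconstruction."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# A 3440x1440 ultrawide on Quality renders at roughly 2293x960,
# then the model reconstructs the image back up to native resolution.
print(internal_resolution(3440, 1440, "Quality"))      # (2293, 960)
print(internal_resolution(3440, 1440, "Performance"))  # (1720, 720)
```

So even "Quality" mode renders well under half the native pixel count, which is where the performance gain comes from.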
Post by Kappa Neko on Nov 22, 2023 9:37:57 GMT
> I play Cyberpunk on a 1440p 144hz monitor so I can basically play through the entire game at 138 FPS without framerate drops.

How do you get that high a frame rate?? My monitor should have about 30% more pixels, but I only get around 50-60 FPS on Psycho without path tracing, which is much more than 30% off. Hmmm. DLSS looks so gross and path tracing so good that I might just slog through non-combat situations at 28-35 FPS native and only turn DLSS on for combat... Not how I imagined my Cyberpunk experience would go with a 4090, lol.
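The "30% more pixels" estimate can be checked with quick arithmetic, assuming the 1440p monitor in question is a standard 2560x1440 panel:

```python
# Back-of-envelope pixel math: ultrawide 3440x1440 vs. standard 2560x1440.
# GPU-bound frame rate scales very roughly with 1/pixel-count (a simplification;
# CPU limits and memory bandwidth muddy the picture in practice).
ultrawide = 3440 * 1440   # 4,953,600 pixels
standard = 2560 * 1440    # 3,686,400 pixels
extra = (ultrawide / standard - 1) * 100
print(f"{extra:.1f}% more pixels")  # 34.4% more pixels
```

So the ultrawide pushes about 34% more pixels, which under that simplification predicts roughly a quarter fewer FPS, not enough on its own to explain a drop from 138 to 50-60.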
Post by bmwcrazy on Nov 22, 2023 17:39:33 GMT
> How do you get that high a frame rate?? My monitor should have about 30% more pixels, but I only get around 50-60 FPS on Psycho without path tracing, which is much more than 30% off. Hmmm. DLSS looks so gross and path tracing so good that I might just slog through non-combat situations at 28-35 FPS native and only turn DLSS on for combat... Not how I imagined my Cyberpunk experience would go with a 4090, lol.

I'm using DLSS Quality, Frame Generation, Psycho Ray Traced Lighting, Ray Traced Shadows, Ray Traced Reflections, and everything else set to max. You can try the same settings and see if the framerate improves, but 50-60 FPS is quite low. Without Frame Generation, I can still play the game at 90-120 FPS most of the time, with occasional framerate drops down to 60 FPS in really busy areas, like the market by Tom's Diner near V's first apartment.
Post by Kappa Neko on Nov 22, 2023 19:04:13 GMT
> I'm using DLSS Quality, Frame Generation, Psycho Ray Traced Lighting, Ray Traced Shadows, Ray Traced Reflections, and everything else set to max. You can try the same settings and see if the framerate improves, but 50-60 FPS is quite low. Without Frame Generation, I can still play the game at 90-120 FPS most of the time, with occasional framerate drops down to 60 FPS in really busy areas, like the market by Tom's Diner near V's first apartment.

I thought you said you're not using DLSS. Misunderstood you there, my bad. I too had those frames with DLSS and frame generation while using path tracing.

I figured out the best solution: frame generation only. No shimmering faces; it looks pretty much like native, except my frame rate doubled to 55-65, which is good enough for me. It looks great that way with path tracing. Not 100% smooth of course, but I'm happy with it. Everything is crisp now, unless I'm looking at the occasional super-low-res textures that seem to be an unavoidable 'issue' unless I mod in better textures. So frame generation is the real magic, while the upscaling tech looks like hot garbage to me, at least in this game. I can finally start to get properly immersed in the game!
Post by bmwcrazy on Nov 22, 2023 19:33:04 GMT
> I figured out the best solution: frame generation only. No shimmering faces; it looks pretty much like native, except my frame rate doubled to 55-65, which is good enough for me. It looks great that way with path tracing. Not 100% smooth of course, but I'm happy with it. Everything is crisp now. So frame generation is the real magic, while the upscaling tech looks like hot garbage to me, at least in this game. I can finally start to get properly immersed in the game!

Do you mean 55-65 FPS with Frame Generation? If so, your actual framerate is only 28-35 FPS. That's a bit too low for Cyberpunk 2077 with a system like yours. You might have trouble driving and aiming with that kind of input lag. Are you sure you don't have other settings enabled, like DLAA? Because DLAA will definitely kill the performance. You should aim for 90-100 FPS (without Frame Generation). For me, even driving and aiming my weapons at 50-60 FPS is a pain.

Unfortunately, NPC faces look much more realistic with path-traced shadows and lighting. If you must use path tracing, you should also turn on Ray Reconstruction. That will improve the image quality and also gain a tiny bit of performance.
Post by slimgrin727 on Nov 22, 2023 19:33:45 GMT
On my older rig I just put everything on Ultra and no RTX for me, which gave me around 70 fps. After 2.0 though, it's gotten more demanding and I have to lower a few settings. But I'd rather do that than have a fluctuating frame rate below 60, which drives me nuts. As for DLSS, I don't use it. Textures look fine to me and pop-in is minimal even when driving. Of course the random NPCs don't look nearly as detailed as the main characters.
Here's a pretty good guide:
Post by Kappa Neko on Nov 22, 2023 20:53:00 GMT
> Do you mean 55-65 FPS with Frame Generation? If so, your actual framerate is only 28-35 FPS. That's a bit too low for Cyberpunk 2077 with a system like yours. You might have trouble driving and aiming with that kind of input lag. Are you sure you don't have other settings enabled, like DLAA? Because DLAA will definitely kill the performance. You should aim for 90-100 FPS (without Frame Generation). For me, even driving and aiming my weapons at 50-60 FPS is a pain. Unfortunately, NPC faces look much more realistic with path-traced shadows and lighting. If you must use path tracing, you should also turn on Ray Reconstruction. That will improve the image quality and also gain a tiny bit of performance.

Yes, but that's with path tracing enabled, so that's normal, no? No DLAA enabled, but everything else is maxed out.

I think 50-60 is fiiiine. I mean, it's not great. Starfield runs so much better, it's ridiculous. It's the first game where I felt the new rig. Butter smooth even locked at 60 FPS, like nothing I ever experienced before. Felt amazing. But I can tolerate a lot from growing up on 30 FPS on consoles. For years anything above 30 was fine. I will always choose pretty over performance because the visuals are a big part of my enjoyment of a game.

I will not use DLSS; it looks awful to me even on Quality. PS3 faces. I want nothing to do with it. Kills immersion for me.

Just finished the first big mission and combat is fine. I play with a controller anyway and can't aim for shit in any game. Driving is just awful, period. I hate driving in games, and here it's especially bad because I can't see shit. Not like driving my real car at all. Extremely disorienting. Gameplay-wise this game is not my cup of tea at all, as I feared, but I hope the narrative will be worth it.
Post by bmwcrazy on Nov 22, 2023 21:28:03 GMT
> Yes but that's with path tracing enabled so that's normal, no? No DLAA enabled but everything else is maxed out.

With your system, I would expect at least 40-60 FPS (path tracing without DLSS), or 80-100+ FPS with Frame Generation. Those are the numbers I get with my smaller 1440p monitor, and I also have a slower CPU (Ryzen 9 5950X).

> I think 50-60 is fiiiine. I mean it's not great, Starfield runs so much better, it's ridiculous. It's the first game where I felt the new rig. Butter smooth even locked at 60fps, like nothing

For me it's the opposite. Cyberpunk 2077 runs so much better than Starfield. In Akila City, my framerate would drop down to 60 FPS and it really feels sluggish. New Atlantis also looks terrible for a game released in 2023. Starfield is very CPU intensive and it still needs a lot of optimization.

Also, you need to start using G-Sync ASAP. Starfield shouldn't be locked at 60 FPS with your 175hz monitor. You should disable the in-game Vsync and use Nvidia's G-Sync with your monitor instead.

> But I can tolerate a lot from growing up on 30fps on consoles. For years anything above 30 was fine. I will always choose pretty over performance because the visuals are a big part of my enjoyment of a game. I will not use DLSS, it looks awful to me even on Quality. PS3 faces. I want nothing to do with it. Kills immersion for me. Just finished the first big mission and combat is fine. I play with a controller anyway and can't aim for shit in any game.

Haha, I need a high framerate to aim and drive properly. So far I have delivered every car to El Capitan with full bonuses. I play a lot of first-person and third-person shooter games, so with every shot I take I try to land a headshot. 30 FPS is nothing but a slideshow for me. That's why consoles need motion blur to hide the stutter you get at 30 FPS.
Post by dagless on Nov 22, 2023 22:46:35 GMT
Just tried the benchmark from the graphics menu. For default settings (no frame generation business) I got:

Ray tracing low: 72 fps average
Ray tracing medium: 52 fps average
Ray tracing medium (with RT reflections on): 49 fps average
Path tracing medium with reflections: 27 fps average

That's on a measly 3060 Ti, i7 12700k, 32 GB RAM.

Steam seems to concur with those numbers. I generally played on ray tracing low, which sets most other things to high or ultra. Didn't mess around with much else.

I'd expect a fair bit better from a 4090, unless you have a particularly crappy CPU or motherboard, or your cooling is inadequate?

Even without ray tracing it's a beautiful game. The level of detail is amazing. I'm not talking about texture resolution, but the design of the city, from the large-scale buildings down to street level and indoor environments.

I rarely saw the problem where the game loads a low-res texture, but I have seen it.

As for what you said about NPCs on the Starfield thread: you can try to talk to them, but they'll usually just tell you to F off or something equally abrupt. Not completely unreasonable for the setting, and not all that different from talking to unnamed NPCs in Starfield, only without them telling you some random little fact about their life. You can't get a quest or any lore by walking up to randoms, but there's clearly been a lot of work put into giving them a bit of life. As well as just walking (or staggering) around, they'll be eating, drinking, smoking, tripping, passed out, dancing, having a fit, looking at their phones (still a thing apparently), etc. Occasionally you'll overhear a conversation or argument. When I loaded it up to do benchmarking, I saw a guy sitting on an unremarkable street playing a steel pan and a woman watching him and occasionally clapping. Hadn't seen that before, and it was good enough to stop and listen for a bit (also the animation was in time). But that kind of thing is very restrained. You might happen across it or not.

First time I played, I was comparing it to other games and my own expectations a lot, and I kept seeing things where I thought it was lacking. Second time round I appreciated a lot more of the things it does really well.
Post by UutIVvdPw7END0Ef on Nov 23, 2023 1:11:16 GMT
Post by Kappa Neko on Nov 23, 2023 7:33:58 GMT
> Also you need to start using G-Sync ASAP. Starfield shouldn't be locked at 60 FPS with your 175hz monitor. You should disable the in-game Vsync and use Nvidia's G-Sync instead with your monitor.

My monitor doesn't support G-Sync... I double-checked in the settings. What's the big deal anyway? Never had a G-Sync monitor; they were much too expensive in the past. I'll check for thermal throttling, but from my quick search of path tracing benchmarks for the 4090, my frame rate seems to be in line for maxed-out everything. I'll keep playing this way. I'm satisfied now.

As for Starfield, as I wrote earlier, I didn't realize Windows put my refresh rate at 59 instead of 165; that's why Vsync got me 60 FPS in Starfield. But even that "low" number felt super snappy to me. Yeah, I did have some dips below 60 in cities sometimes. But compared to FO4, which was super sluggish, Starfield plays great.

> As for what you said about NPCs on the Starfield thread: you can try to talk to them, but they'll usually just tell you to F off or something equally abrupt. Not completely unreasonable for the setting, and not all that different from talking to unnamed NPCs in Starfield, only without them telling you some random little fact about their life.

Yeah, I realized that you have to be VERY close to NPCs, basically licking their face, to get the "Talk" prompt. Either I wasn't close enough walking around the desert junk town or it was disabled for the intro. No idea. Anyway, my bad.
Post by bmwcrazy on Nov 23, 2023 16:58:14 GMT
> My monitor doesn't support G-Sync... I double-checked in the settings. What's the big deal anyway? Never had a G-Sync monitor; they were much too expensive in the past.

The AW3423DW is a G-Sync Ultimate monitor. That means it has an actual G-Sync module. So you have already bought a "too expensive" G-Sync monitor, and you might as well use the feature you paid for.

Traditional Vsync only works at a fixed refresh rate. For example, your monitor has a 175hz refresh rate. That means with Vsync enabled, it will automatically limit the framerate to 175 FPS to reduce screen tearing. That's all fine and dandy when your game can reach 175 FPS. However, if the framerate drops below 175 FPS, Vsync needs to reduce the framerate to a fraction of your monitor's refresh rate in order to match it. So if your game drops to 170 FPS, Vsync lowers it to 87.5 FPS to match the 175hz refresh rate. If the framerate drops below 87.5 FPS, let's say to 70 FPS, it now needs to drop down to the next level by dividing by three instead, and you get 175/3 = 58.33 FPS. Just imagine playing a game with the framerate fluctuating between 175, 87.5 (175/2), 58.33 (175/3), and 43.75 (175/4) FPS. It's very distracting to play with these huge framerate jumps, right?

So this is why we use Nvidia's G-Sync and AMD's FreeSync instead. G-Sync is basically dynamic refresh rate: it changes the refresh rate according to your current framerate, so G-Sync always works regardless of what FPS you're getting. If your framerate is at 170, it drops your monitor's refresh rate down to 170hz to match it, where traditional Vsync would have to divide by two instead. It's a pretty neat feature, am I right? Now you just need to enable it in the Nvidia Control Panel and you can finally experience the wonder of variable refresh rate yourself. Happy Thanksgiving.
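That fallback behavior can be sketched as a tiny model. This is a simplification of classic double-buffered Vsync as described here; real driver behavior (e.g. triple buffering) varies:

```python
# Simplified model of double-buffered Vsync: when the GPU can't sustain the
# full refresh rate, the displayed rate snaps down to refresh/2, refresh/3,
# refresh/4, ... until it fits under what the GPU can render.
def vsync_displayed_fps(refresh_hz, render_fps):
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_displayed_fps(175, 170))  # 87.5   (175/2)
print(vsync_displayed_fps(175, 70))   # 58.33  (175/3)
print(vsync_displayed_fps(175, 50))   # 43.75  (175/4)
```

Variable refresh rate (G-Sync/FreeSync) avoids these discrete steps entirely by matching the panel to the render rate instead.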
Post by Kappa Neko on Nov 23, 2023 18:22:32 GMT
> The AW3423DW is a G-Sync Ultimate monitor. That means it has an actual G-Sync module.

Nope, I have the DWF model. It doesn't have G-Sync, and the NVCP confirms it too. That's why it's cheaper but is supposedly otherwise the better-looking, newer model. But thank you anyway for educating me and trying to help. Appreciated! And happy Thanksgiving! We don't celebrate it here, but enjoy your long weekend.
> Nope, I have the DWF model. It doesn't have G-Sync, and the NVCP confirms it too. That's why it's cheaper but is supposedly otherwise the better-looking, newer model. But thank you anyway for educating me and trying to help. Appreciated!

Oh man, I thought you had the 175hz AW3423DW. The AW3423DWF is 165hz, but it is still G-Sync compatible, so you can still enable G-Sync in the Nvidia Control Panel. Give it a shot. You might need to use a DisplayPort cable; for HDMI to work, you'll need a cable compatible with HDMI 2.1.
Post by slimgrin727 on Nov 23, 2023 20:07:22 GMT
No English subs...
Post by Kappa Neko on Nov 23, 2023 20:47:49 GMT
> Oh man, I thought you had the 175hz AW3423DW. The AW3423DWF is 165hz, but it is still G-Sync compatible, so you can still enable G-Sync in the Nvidia Control Panel. Give it a shot. You might need to use a DisplayPort cable; for HDMI to work, you'll need a cable compatible with HDMI 2.1.

Oh, OK. I assumed that because the monitor doesn't have native G-Sync and it says it's not supported in the control panel, forcing it wouldn't do anything. How odd... I've enabled it. Will it also feel better with my shit fake frame generation frame rate? I'm going to bed now; I'll test it tomorrow. Hopefully there'll be a difference and I can recognize it too, lol. Thanks again!
Post by bmwcrazy on Nov 24, 2023 5:46:36 GMT
> Oh, OK. I assumed that because the monitor doesn't have native G-Sync and it says it's not supported in the control panel, forcing it wouldn't do anything. How odd... I've enabled it. Will it also feel better with my shit fake frame generation frame rate?

This discussion is going a bit off topic here, so I'll try my best to keep the explanation quick.

To properly use G-Sync, you first need to manually limit the framerate for your games. Your monitor has a 165hz refresh rate, so you should cap your framerate a couple FPS below that, like 160 FPS. This eliminates the slight delay you get when the framerate goes over the refresh rate, because G-Sync only handles framerates that are lower than the refresh rate; when the framerate goes over it, you fall back to traditional Vsync behavior, which adds input lag. So you don't want your framerate to go over 165 FPS when you have G-Sync enabled. You can limit the framerate through the Nvidia Control Panel, either globally or individually for every game you have installed.

In short: enable G-Sync through the Nvidia Control Panel, cap the framerate at something like 160 FPS (also through the Nvidia Control Panel), and disable in-game Vsync for all your games.

Also, when you use Frame Generation in games like Cyberpunk 2077, it automatically enables Nvidia Reflex, which reduces input lag. Nvidia Reflex also automatically caps the framerate below the refresh rate. You paid a fortune for your gaming PC and monitor, so you might as well make the most of them, especially "premium" features like G-Sync.

For example, this is how I set up my Cyberpunk 2077 with G-Sync and Frame Generation: Enable G-Sync for your G-Sync compatible monitor. Make sure the framerate is capped; I have a 144hz monitor, so I cap mine at 140 FPS. That's a bit redundant because Nvidia Reflex already does it when I have Frame Generation on, but it's still good practice because it keeps G-Sync working when Frame Generation is disabled. Personally, I have the framerate limited globally instead of on a per-game basis. Now, remember that Frame Generation in Cyberpunk 2077 disables G-Sync, so to eliminate screen tearing I actually need to enable Vsync through the Nvidia Control Panel instead. Finally, make sure Cyberpunk 2077's in-game Vsync is disabled, just in case. That's it. Enjoy.
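The capping rule above boils down to simple arithmetic. The margins here just mirror the examples in this thread (160 on a 165hz panel, 140 on a 144hz panel) and common community guidance, not an official Nvidia formula:

```python
# Rule-of-thumb frame cap for G-Sync: stay a few FPS below the panel's
# refresh rate so Vsync never engages. The margin is a guideline
# (assumption based on the examples in this thread), not an Nvidia spec.
def gsync_frame_cap(refresh_hz, margin=5):
    return refresh_hz - margin

print(gsync_frame_cap(165))            # 160, as suggested for a 165hz panel
print(gsync_frame_cap(144, margin=4))  # 140, as used for a 144hz panel
```

Set this as the global frame limit in the Nvidia Control Panel so it applies whether or not a given game has Frame Generation (and thus Reflex) enabled.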
Post by dagless on Nov 24, 2023 21:57:57 GMT
Well, I greatly appreciate the off-topic diversion. It prompted me to double-check my own settings.

It turns out everything was set up fine in the Nvidia Control Panel and the game, but not enabled on the monitor. Weirdly, even though the monitor is billed as G-Sync compatible and not FreeSync, the option on the monitor is called FreeSync. Maybe they just use the same menu as the other models they sell?

Since I didn't use Vsync, it's made no difference to my frame rates, but it does make the 45-60 FPS I was getting with medium ray tracing settings (and even a couple of extra options) look smoother, and it's now basically fine for me even in frantic shootouts.

Now I'm wondering if any other games I'd written off for ray tracing might actually be playable.
Post by UutIVvdPw7END0Ef on Dec 1, 2023 17:15:54 GMT
Post by Obsidian Gryphon on Dec 3, 2023 8:27:49 GMT