And film grain. Get that fake static out of here
Depth of field and chromatic aberration are pretty cool if done right.
Depth of field is a really important framing tool for photography and film, and the same applies to games. If your game has cinematics/cutscenes, they prob utilize depth of field in some way. Action and dialogue scenes usually emphasize the characters, so a narrow depth of field can be used to pull focus onto just them. Meanwhile, moments like discovering a new region put the emphasis on the landscape, so a large depth of field works better (essentially no background blur).
Chromatic aberration is cool if done right. It gives things a slightly out-of-place feel, which makes sense in certain games and not so much in others. Signalis and Dredge are a couple of games where chromatic aberration adds to the art style imo. Though obviously, if it hurts your eyes, the game plays just as fine with it off.
I feel like depth of field and motion blur have their place, yeah. I worked on a horror game one time, and we used a dynamic depth of field: anything you were looking at was in focus, but things nearer/farther than that were slightly blurred out, and when you moved where you were looking, it would take a moment (less than half a second) to ‘refocus’ if it was at a different distance from the previous thing. Combined with light motion blur, it created a very subtle effect that ratcheted up anxiety when poking around. When combined with objects in the game being capable of casting non-euclidean shadows for things you aren’t looking at, it created a very pervasive unsettling feeling.
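Roughly what that ‘refocus’ logic looks like, as a minimal C++ sketch (names and constants are invented for illustration, not the actual game’s code):

```cpp
#include <cmath>

struct DofState {
    float focusDistance = 5.0f;  // current focal plane, in meters
};

// Called once per frame; lookHitDistance comes from a raycast through
// the center of the screen (whatever the player is looking at).
void UpdateFocus(DofState& dof, float lookHitDistance, float dt) {
    // Exponential smoothing: most of the refocus settles within a few
    // tenths of a second, like an eye or a lens pulling focus.
    const float refocusSpeed = 8.0f;  // higher = snappier refocus
    float t = 1.0f - std::exp(-refocusSpeed * dt);
    dof.focusDistance += (lookHitDistance - dof.focusDistance) * t;
    // The renderer then blurs each fragment in proportion to its
    // distance from dof.focusDistance.
}
```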
raytracing’s the cool kid, keep him in
I always turn that shit off. Especially bad when it’s a first-person game, as if your eyes were a camera.
I don’t mind a bit of lens flare, and I like depth of field in dialog interactions. But motion blur and chromatic aberration can fuck right off.
I mind lens flare a lot because I am not playing as a camera and real eyes don’t get lens flares.
That’s fair. I usually turn it off for FPS games. But if it’s mild, I leave it on for third person games where I am playing as a camera.
Same same
Shadows: Off
Polygons: Low
Idle Animation: Off
Draw distance: Low
Billboards instead of models for scenery items: On
Alt: F4
Launch: Balatro

I think my PC can run the C64 demake of Balatro in an emulator
The main problem with these is giving people control of these properties without them knowing how cameras work in real life.
The problem is that I am not playing as a camera, so why the hell would I want my in-game vision to emulate one?
Sometimes it does look better, but I would argue it’s on the developer to pick the right moments to use them, just like a photographer would. Handing it to the players is the wrong way to go about it; their control over it isn’t nearly as good, even before considering their knowledge of it.
PS3-> everything is sepia filtered and bloomed until nearly unplayable.
I will say that a well executed motion blur is just a chef’s kiss type deal, but it’s hard to get right and easy to fuck up
> PS3-> everything is sepia filtered and bloomed until nearly unplayable.
That’s just games from that period. It’s not exclusive to PS3.
The number of times I’ve broken this one out…
After having lived through it, if I never play a gritty brown bloom game again, it’ll be too soon.
Man, VGCats. Deep, deep, deep cut
I think of that comic every time I see a gritty brown game. I don’t see bloom as much any more, though.
I think maybe that’s part of why The Last Of Us grabbed everyone so hard; it was a gritty, green game. STALKER 2 is brown AF, though. Thank God they skipped the whole bloom fad.
I think bloom is one of those things that, when it’s used right, brings the atmosphere together without sticking out as a thing that’s going on, like how our eyes adjust to light changes. When it’s out of control and blows out the scene by going WAAAAY too bright, it sucks because you’re looking at bloom, not at the game.
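For what it’s worth, the ‘eyes adjusting’ part is usually an auto-exposure pass that bloom rides on top of. A rough sketch of that adaptation (constants purely illustrative):

```cpp
#include <algorithm>
#include <cmath>

// Gradually adapt exposure toward what would map the scene's average
// luminance to middle grey. Overblown bloom often comes from skipping
// or mistuning this step, so bright areas stay blown out.
float AdaptExposure(float exposure, float avgLuminance, float dt) {
    float target = 0.18f / std::max(avgLuminance, 1e-4f);
    // Dark-to-bright adaptation is usually made faster than bright-to-dark.
    float speed = (target < exposure) ? 3.0f : 1.0f;
    return exposure + (target - exposure) * (1.0f - std::exp(-speed * dt));
}
```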
Personally I use motion blur in every racing game I can but nothing else. It helps with the sense of speed and smoothness.
Early HDR games were rough. I look back at Zelda Twilight Princess screenshots, and while I really like that game, I almost squint looking at it because it’s so bloomed out.
Now… in fairness…
Chromatic aberration and lens flares, whether you do or don’t appreciate how they look (imo they arguably make sense in, say, CP77, as you have robot eyes)…
… they at least usually don’t nuke your performance.
Motion blur, DoF and ray tracing almost always do.
HairWorks? Seems to be a complete roll of the dice depending on the specific game and your hardware.
Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I cannot understand why AF isn’t always just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.
You either haven’t been playing PC games very long, or aren’t that old, or have only ever played on fairly high end hardware.
Anisotropic filtering?
Yes, that… hasn’t been challenging for an affordable, average PC to run at 8x or 16x for about a decade. It doesn’t cause much framerate drop-off at all now, and didn’t unless you go all the way back to the mid 90s to maybe early 2000s, when ‘GPUs’ were fairly uncommon.
But that just isn’t true for motion blur and DoF, especially going back further than 10 years.
Even right now, running CP77 on my Steam Deck, AF level has basically no impact on my framerate, whereas motion blur and DoF do have a noticeable impact.
Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented by a lot of games. Nowadays we pretty much get the versions of those that were not ruinously inefficient.
Try running something like Arma 2 on a mid or low range PC with motion blur on vs off. You could get maybe 5 to 10 more fps with it off… and that’s a big deal when you’re maxing out at 30 to 40ish fps.
(Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)
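To put a shape on why those older implementations hurt: a typical post-process motion blur takes N samples along each pixel’s screen-space velocity, so the cost scales with N times the number of pixels. A toy C++ sketch (the buffer fetches are invented stand-ins, not a real API):

```cpp
struct Vec2  { float x, y; };
struct Color { float r, g, b; };

Color Add(Color a, Color b)   { return {a.r + b.r, a.g + b.g, a.b + b.b}; }
Color Scale(Color c, float s) { return {c.r * s, c.g * s, c.b * s}; }

// Stand-ins for GPU texture fetches (illustrative only).
Color SampleScene(Vec2 uv)    { return {uv.x, uv.y, 0.0f}; }
Vec2  SampleVelocity(Vec2 uv) { return {0.01f, 0.0f}; }

// One pixel of motion blur: numSamples fetches spread along the pixel's
// velocity. Every extra sample is another fetch for every pixel on
// screen, each frame, which is where the fps goes.
Color MotionBlurPixel(Vec2 uv, int numSamples /* >= 2, e.g. 8-16 */) {
    Vec2 v = SampleVelocity(uv);
    Color sum = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < numSamples; ++i) {
        float t = float(i) / float(numSamples - 1) - 0.5f;  // -0.5 .. +0.5
        sum = Add(sum, SampleScene({uv.x + v.x * t, uv.y + v.y * t}));
    }
    return Scale(sum, 1.0f / float(numSamples));
}
```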
I am 40 and have been gaming on PC my entire life.
> Try running something like Arma 2 on a mid or low range PC with motion blur on vs off. You could get maybe 5 to 10 more fps with it off… and that’s a big deal when you’re maxing out at 30 to 40ish fps.
Arma is a horrible example, since it’s so poorly optimized that you actually get a higher frame rate maxing everything out compared to running everything on low. lol
If you’re 40 and have been PC gaming your whole life, then I’m going with you’ve had fairly high end hardware, and are just misremembering.
Arma 2 is unoptimized in general… but largely that’s because it basically uses a massive analog to a pagefile on your HDD because of how it handles its huge environments in engine. It’s too much to jam through 32 bit OSs and RAM.
When SSDs came out, that turned out to be the main thing that’ll boost your FPS in older Arma games, because they have much, much faster read/write speeds.
… But their motion blur is still unoptimized and very unperformant.
As for setting everything to high and getting higher FPS… that’s largely a myth.
There are a few postprocessing settings that work that way, and that’s because in those instances the ‘ultra’ settings actually are different algorithms/methods that are both less expensive and visually superior.
It is still the case that if you set texture and model quality to low, and grass/tree/whatever draw distances very short, you’ll get more frames than with those things maxed out.
I love it when the hair bugs out and covers the whole distance from 0 0 0 to 23944 39393 39
I’d add Denuvo to that list. Easily a 10-20% impact.
Unfortunately that’s not a setting most of us can just disable.
/c/crackwatch@lemmy.dbzer0.com sure you can
These settings can be good, but are often overdone. See bloom in the late 2000s/early 2010s.
Yeah, chromatic aberration when done properly is great for emulating certain cameras and art styles. Bloom is designed to make things look even brighter and it’s great if you don’t go nuts with it. Lens flares are mid but can also be used for some camera stuff. Motion blur is generally not great but that’s mainly because almost every implementation of it for games is bad.
Also the ubiquitous “realistic” brown filter a la Far Cry 2 and GTA IV. Which was often combined with excessive bloom to absolutely destroy the player’s eyes.
At least in Far Cry 2 you are suffering from malaria.
I always hated bloom, probably because it was overused. As a light touch it can work, but that is rarely how devs used it.
It’s usually better in modern games. In the 2005-2015 era it was often extremely overdone, actually often reducing the perceived dynamic range instead of increasing it IMO.
All those features sucked when they first came out, not just bloom.
I like DoF as it actually has a purpose in framing a subject. The rest are just lazy attempts at making the game “look better” by just slopping on more and more effects.
Current ray tracing sucks because it’s all fake AI bullshit.
The only game with raytracing I’ve seen actually look better with RT on is Cyberpunk 2077. It’s the only game I’ve seen that has fully raytraced reflections on surfaces. Everything else just does shadows, and there’s basically no visual difference with it on or off; it just makes the game run slower when on.
But those reflections in CP are amazing as fuck. Seeing things reflect in real time off a rained-on road is sick.
I agree with this. That makes it even more jarring to me that mirrors inside of safehouses don’t work until you specifically interact with them. It seems so out of place in a game that has all of these cool raytraced reflections except for a mirror you directly look into.
I just don’t see them as mirrors. They are video screens with a camera in them. ;)
It’s also connected to a performance feature. They can load lower resolution textures for faraway objects. You can do this without the blurring effect of DoF, but it’s less jarring if you can blur it.
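The idea in sketch form, using the standard thin-lens circle-of-confusion formula (constants illustrative, not from any particular engine): objects far from the focal plane already have a large blur radius, so coarser, cheaper mips can hide inside it.

```cpp
#include <cmath>

// Thin-lens circle of confusion: how blurred a point at 'dist' appears
// when focus is at 'focusDist' (all distances in meters).
float CocRadius(float dist, float focusDist, float aperture, float focalLen) {
    return aperture * std::fabs(dist - focusDist) / dist
                    * focalLen / (focusDist - focalLen);
}

// Pick a texture mip by distance; coarser mips are cheaper to stream.
int MipForDistance(float dist) {
    return static_cast<int>(std::fmax(0.0f, std::log2(dist / 4.0f)));
}
```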
The cost of DoF rendering far outweighs the memory savings of using reduced texture sizes, especially on older hardware where memory would be at a premium
I think Halo Infinite has a good example of a limited ray traced effect (the shadows) and an example of a terrible DoF effect (it does not look realistic at all or visually appealing)
I don’t understand who decided that introducing the downfalls of film and cameras made sense for mimicking the accuracy and realism of the human eye.
I don’t think it’s to make it feel realistic; it’s more to feel like it’s a film being shot.
Which is stupid and immersion breaking
Oh I 100% agree
motion blur is essential for a proper feeling of speed.
most games don’t need a proper feeling of speed.
Motion blur is guaranteed to give me motion sickness every time. Sometimes I forget to turn it off in a new game… about 30 minutes in I’ll break into cold sweats and feel like I’m going to puke. I fucking hate that it’s on by default in so many games.
It really should be a prompt at first start. Like, ask a few questions like:
- do you experience motion sickness?
- do you have epilepsy?
The answers to those would automatically disable certain settings and features, or drop you into the settings.
It would be extra nice for a platform like PlayStation or Steam to remember those preferences and the game could read them (and display a message so you know it’s doing it).
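A minimal sketch of what that first-start flow could look like (all names invented; the platform-level profile idea would just replace the questions with a read from the OS/launcher):

```cpp
#include <iostream>
#include <string>

struct ComfortSettings {
    bool motionBlur  = true;
    bool cameraShake = true;
    bool flashingFx  = true;
};

bool AskYesNo(const std::string& question) {
    std::cout << question << " [y/N] ";
    std::string answer;
    std::getline(std::cin, answer);
    return !answer.empty() && (answer[0] == 'y' || answer[0] == 'Y');
}

// Run once on first launch; answers map straight onto effect toggles,
// and the player can still change everything later in settings.
ComfortSettings FirstRunPrompt() {
    ComfortSettings s;
    if (AskYesNo("Do you experience motion sickness?")) {
        s.motionBlur  = false;
        s.cameraShake = false;
    }
    if (AskYesNo("Do you have photosensitive epilepsy?")) {
        s.flashingFx = false;
    }
    return s;
}
```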
Motion blur + low FOV is an instant headache.
… What?
I mean… the alternative is to get hardware (including a monitor) capable of just running the game at an fps/hz above roughly 120 (ymmv), such that your actual eyes and brain do real motion blur.
Motion blur is a crutch to be able to simulate that from back when hardware was much less powerful and max resolutions and frame rates were much lower.
At higher resolutions, most motion blur algorithms are quite inefficient and eat your overall fps… so it would make more sense to just remove it, get the higher fps, and experience actual motion blur from your eyes + brain.
my basis for the statement is BeamNG.drive. at 100hz, the feeling of speed is markedly different depending on whether motion blur is on. 120 may make a difference.
I think you still see doubled images instead of a smooth blur in your peripheral vision when you’re focused on the car in a racing game, for example.
There is always motion blur if your monitor is shitty enough.
Or your brain slow enough
Or the drugs good enough
Damn this Panzerschokolade be hitting different.
Just turn on TAA and free motion blur in any game!
yeah, the only time I liked it was in Need for Speed when they added nitro boost. the rest of the options have their uses imo, I don’t hate them.
Out of all of these, motion blur is the worst, but second to that is Temporal Anti Aliasing. No, I don’t need my game to look blurry with every trailing edge leaving a smear.
TAA is kind of the foundation that almost all real time ~~raytracing~~ frame upscaling and frame generation are built on, and built off of. This is why it is increasingly difficult to find a newer, high fidelity game that even allows you to actually turn it off.

If you could, all the subsequent ~~magic~~ bullshit stops working; all the hardware in your GPU designed to do that stuff is now basically useless.

EDIT: I goofed, but the conversation thus far seems to have proceeded assuming I meant what I actually meant.
Realtime raytracing is not per se foundationally reliant on TAA; DLSS and FSR frame upscaling, and the later framegen tech, however, basically are. They evolved out of TAA.
However, without the frame rate gains enabled by modern frame upscaling and framegen… realtime raytracing would be too ‘expensive’ to implement on all but fairly high end cards / your average console without serious frame rate drops.
Before realtime raytracing, the paradigm was that scenes would have static light maps and lighting environments baked into the map, with a fairly small number of dynamic light sources and shadows.
With realtime raytracing… basically everything is now a dynamic light.
That tanks your frame rate, so Nvidia then barrelled ahead with frame upscaling and later frame generation to compensate for the framerate loss they introduced with realtime raytracing. And because they’re an effective monopoly, AMD followed along, as did basically all major game developers and many major game engines (UE5, to name a really big one).
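The cost difference in sketch form (invented stand-ins, not real engine code): the old path is one cheap lightmap fetch plus a handful of dynamic lights, while with full RT every light is live, per pixel, per frame.

```cpp
#include <vector>

struct Vec3  { float x, y, z; };
struct Light { Vec3 position; Vec3 color; };

Vec3 Add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// Stand-in: one cheap texture fetch into lighting baked offline.
Vec3 SampleLightmap(Vec3 p) { return {0.5f, 0.5f, 0.5f}; }
// Stand-in: full per-light shading (attenuation, shadowing, etc.).
Vec3 ShadePointLight(Vec3 p, const Light& l) { return l.color; }

// Old paradigm: baked term + a few dynamic lights, roughly constant cost.
Vec3 ShadeBaked(Vec3 p, const std::vector<Light>& fewDynamicLights) {
    Vec3 result = SampleLightmap(p);
    for (const Light& l : fewDynamicLights)  // typically only a handful
        result = Add(result, ShadePointLight(p, l));
    return result;
}
// With realtime raytracing there is no baked term: every light and bounce
// is traced per pixel, every frame, which is where the framerate goes.
```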
What? All Ray Tracing games already offer DLSS or FSR, which override TAA and handle motion much better. Yes, they are based on similar principles, but they aren’t the mess TAA is.
Almost all implementations of DLSS and FSR literally are evolutions of TAA.
TAA 2.0, 3.0, 4.0, whatever.
If you are running DLSS or FSR, see if your game will let you turn TAA off.
They often won’t, because DLSS or FSR frequently require TAA’s pipeline to be enabled before they can hook into it and extrapolate from there.
Think of TAA as a base game and DLSS/FSR as a DLC. You very often cannot play the DLC without the original game, and if you actually dig into game engines, you’ll often find you can’t run FSR/DLSS without running TAA.
There are a few exceptions to this, but they are rare.
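What that shared ‘base game’ actually is, in one-pixel sketch form (invented stand-ins, illustrative only): plain TAA reprojects last frame’s result using per-pixel motion vectors and blends it with the new sample. DLSS/FSR swap the fixed blend for smarter weighting plus upscaling, but still want the same motion vectors and history buffer, which is why engines rarely let you rip that plumbing out.

```cpp
struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Stand-ins for buffer fetches (illustrative only).
Color SampleScene(Vec2 uv)    { return {1.0f, 1.0f, 1.0f}; }  // this frame
Color SampleHistory(Vec2 uv)  { return {1.0f, 1.0f, 1.0f}; }  // accumulated
Vec2  SampleVelocity(Vec2 uv) { return {0.0f, 0.0f}; }        // motion vectors

Color Lerp(Color a, Color b, float t) {
    return {a.r + (b.r - a.r) * t,
            a.g + (b.g - a.g) * t,
            a.b + (b.b - a.b) * t};
}

// One pixel of plain TAA: reproject history along the motion vector,
// then blend. Keeping ~90% history smooths edges over time, and is
// also what causes ghosting when the reprojection is wrong.
Color TaaResolve(Vec2 uv) {
    Vec2 m = SampleVelocity(uv);
    Color history = SampleHistory({uv.x - m.x, uv.y - m.y});
    Color current = SampleScene(uv);
    return Lerp(history, current, 0.1f);
}
```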
Honestly motion blur done well works really well. Cyberpunk for example does it really well on the low setting.
Most games just don’t do it well tho 💀
But why did you buy a 1800€ video card then?
So you can use a more demanding form of anti-aliasing that doesn’t suck ass?
Like rip maps instead of mip maps? Or is it some AI bs nowadays?
Unreal doesn’t even have other forms of AA iirc. It’s up to the devs to implement them.
farming bitcoin. Duh.