Looks nice, but I kinda despise this technology acting as a substitute for optimized graphics rendering. It's gonna be necessary for full raytracing, but it still just rubs me the wrong way.
I genuinely don’t understand this sentiment.
What makes a DLSS frame so different from a native frame? It’s all just running code to turn math into pixels.
The only thing that matters is the end result to our human eyeballs, not how we get there. DLSS (and the other options to varying degrees) has gotten so good it looks better than native, especially when in motion.
It's never gonna look perfect, and you get shitty performance in the game if you don't have an Nvidia card that supports it.
This sucks for all the potential customers who don't have a supported card.
I'm one of them.
I would rather see an optimized game.
How can a game like Doom Eternal get 200 fps while Cyberpunk gets like 60 on the same hardware? (I'm just throwing numbers out there, but generally Doom Eternal gets decent fps on anything.)
There's a clear lack of focus by the Cyberpunk devs, IMO.
You can't compare an open-world game with a more linear first-person shooter; they have different performance profiles.
Compared to Unreal Engine, REDengine does open-world games pretty well, without traversal slowdown or shader compilation stutter.
2077 is larger, has more NPCs and vehicles, and more complex materials and geometry, and human NPCs have a higher bar for fidelity than monsters.
It’s a nice way of future proofing imho. Someone buying the game on a budget card five years from now will see a prettier game than one running it on a good card today.
As long as the game runs okay now, which Cyberpunk failed at. But in principle I’m all for it.
If only we could have this AND a well optimized game for really great frame rates.
It's a tool that can be used and misused.
Cyberpunk, as Digital Foundry noted, is actually a scalable game.
I really hope I can get it looking great with my 3080, but the spec sheet left me with some doubt for sure. It's weird to me that a two-year-old game needs current hardware too. Gonna be a long month to find out.
It's the new Crysis. Even SLI'd 8800s struggled to run the OG.
https://hothardware.com/news/crysis-sp-geforce-8800-gt-and-sli-peformance
Yeah, I could see that. PC gaming was beyond me back then, but I remember all the jokes about it. That's a good way for me to look at it though. I built a PC for this game, and it's been frustrating coming up short graphically. It's easier to swallow if I look at it as an outlier like Crysis.