So a user on Reddit (ed: u/Yoraxx) posted on the Starfield subreddit that there's a problem in Starfield when running the game on an AMD Radeon GPU. The issue is simple: the game just won't render the star in any solar system when you are on the dayside of a moon or any other planetary body. The issue only occurs on AMD Radeon GPUs, and users with Radeon RX 7000 and RX 6000 series GPUs are reporting the same thing.
The dayside of any planetary body or moon needs a light source to illuminate it. That source is the star, and on any non-AMD GPU you will see the star/sun in the skybox, lighting the surface. On AMD cards, the star/sun just isn't there, yet the planet/moon remains illuminated without any visible light source.
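For context on how the ground can stay lit by a star that never shows up: in a typical renderer, the lighting pass only consumes the sun's *direction*, while the visible sun disc is a separate skybox/sprite draw. Here's a toy sketch of that separation (all names invented for illustration, not Starfield's actual code):

```cpp
// Toy sketch (not Bethesda's actual code) of why a surface can stay lit
// while the sun itself is missing: the lighting pass and the sun-disc
// draw are independent steps.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Lighting pass: only needs the sun's direction, not its rendered disc.
float diffuse(Vec3 surfaceNormal, Vec3 sunDirection) {
    return std::fmax(0.0f, dot(surfaceNormal, sunDirection));
}

// Skybox pass: drawing the visible sun disc is a separate step. If this
// draw silently fails (e.g. on one vendor's driver path), the ground is
// still lit, because diffuse() above never depended on it.
bool drawSunDisc(bool driverPathWorks) {
    return driverPathWorks; // stand-in for the vendor-specific draw call
}

int main() {
    Vec3 up{0, 1, 0};  // surface normal on the dayside
    Vec3 sun{0, 1, 0}; // sun directly overhead

    printf("surface brightness: %.2f\n", diffuse(up, sun));
    printf("sun visible (working driver path): %d\n", drawSunDisc(true));
    printf("sun visible (broken driver path):  %d\n", drawSunDisc(false));
    // Brightness is 1.00 either way; only the disc disappears.
}
```

If only the disc draw fails on one vendor's driver path, you get exactly this bug: a fully lit dayside under an empty sky.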
And here I was just reading that AMD GPUs showed much better performance in Starfield. Maybe it’s because they’re just not rendering stuff at all lmao
Waaaaait… it was a bug and not gross incompetence?
“Bethesda’s Bug”, when you can’t tell if something isn’t working correctly or if it’s just not implemented at all.
I don’t think we know.
Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol
The game runs better on AMD, and Bethesda partnered with AMD in some way for this PC release.
Does it run better by not rendering light emitting objects?
Perhaps. Who needs stars anyway?
All GPUs perform equally well at ray tracing when there are no rays to trace.
That really just means AMD gave them a lot of money, and they just made sure FSR2 worked. lol
#include "fsr2.h"
Ok, can we have the monies please?
I've got a 7900 XTX running Ultra, and FSR2 does literally nothing, which is hilarious.
100% resolution scale, 128 FPS.
75% resolution scale … 128 FPS.
50% resolution scale, looking like underwater potatoes … 128 FPS.
I don't know how it's possible to build an engine this way. It seems CPU-bound, and I'm lucky I upgraded my CPU not too long ago: I'm outperforming my friend who has an RTX 4090 in literally all scenes, indoor, ship, and outdoor/planet.
He struggles to break 70 FPS on 1080p Ultra, meanwhile I’m doing 4K Ultra.
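A flat frame rate across resolution scales is the classic CPU-bound signature: lowering the render scale only shrinks the GPU's share of the frame. A toy model of the effect (the millisecond numbers are invented to match the figures above):

```cpp
// Rough sketch of why FSR2's resolution scale can do "literally nothing"
// when a game is CPU-bound. All costs below are assumed for illustration.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

int main() {
    const double cpuMs = 7.8;      // per-frame simulation/draw-call cost (assumed)
    const double gpuMsAt100 = 5.0; // GPU cost at 100% render scale (assumed)

    for (double scale : {1.00, 0.75, 0.50}) {
        // GPU cost shrinks roughly with pixel count (scale squared);
        // CPU cost doesn't care about render resolution at all.
        double gpuMs = gpuMsAt100 * scale * scale;
        double frameMs = std::max(cpuMs, gpuMs); // CPU and GPU work overlap
        printf("scale %3.0f%%: frame %.1f ms -> %.0f FPS\n",
               scale * 100, frameMs, 1000.0 / frameMs);
    }
    // All three scales print ~128 FPS: the CPU is the wall, so cheaper
    // pixels just mean the GPU idles longer each frame.
}
```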
Creation Engine has always been CPU-bound, going back to the Gamebryo era.
I have noticed FSR2 does better anti-aliasing than the forced TAA (once I forced that off).
Some of the benchmarks definitely pointed out that it was CPU-bound in many areas (e.g. the cities).
I think the HUB one mentioned that some of the forested planets were much more GPU-bound and better for testing.
I'm on a TV so I'm capped at 60 fps, but I do see a pretty substantial power-usage difference between FSR at 75% and FSR at 100% on my 7900 XT.
It can be both
If it's down to very specific chipsets, that sounds like an unforeseeable bug.
An unseeable unforeseeable bug?
Correction: someone pointed out they are literally interfacing with the graphics drivers the wrong way, so it's still on their devs.
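For anyone wondering what "interfacing with the drivers the wrong way" tends to mean: relying on behavior the API spec leaves undefined or unspecified, which one vendor's driver happens to tolerate. A CPU-side analogy, hypothetical and deliberately not Starfield's actual bug:

```cpp
// Analogy for "works on one vendor, breaks on another": code that relies
// on unspecified behavior. NOT Starfield's actual bug, just the same
// class of failure.
#include <cstdio>

int counter = 0;
int next() { return ++counter; }

int main() {
    // The evaluation order of printf's two arguments is unspecified in
    // C++. One compiler may print "1 2", another "2 1". Shipping code
    // that depends on one particular order is wrong even if it "works"
    // everywhere you happened to test it.
    printf("%d %d\n", next(), next());
}
```

The graphics equivalent is a call sequence the spec doesn't guarantee: Nvidia's driver happens to produce the expected result, AMD's doesn't, and the game looks "broken on AMD" even though the bug is in the game.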
I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn’t have a clue what they were talking about since it was always fine for my Nvidia card.
Can confirm it’s the same on Proton / Linux. This game keeps being a joke on the technical side.
Now it is just Field.
So fitting that this is posted in this Lemmy instance.
Funny, I noticed this on New Atlantis and just chalked it up to the devs being lazy.
Ugh. A part of me wants to give AMD a chance for my next upgrade and push back against Nvidia’s near-monopoly of GPUs but I really don’t want to deal with how everything kinda-sorta works on Radeons.
I've been exclusively on AMD since like 2015, and my GPUs "kinda-sorta working" has not been my experience at all lol. I've literally never had AMD-specific problems. The only brand-specific issues I've had were trying to get my laptop with an Nvidia GPU to work properly under Linux.
I've exclusively used AMD GPUs since building my first PC 27 years ago. I'm not aware of things "kinda-sorta" not working.
This issue also occurs on Intel Arc Alchemist and Nvidia Maxwell GPUs.
I've been red-only in my rig for over a decade, and the only "problems" I've had are that I play the same games as everyone else perfectly fine and have more money in my wallet from not spending as much on parts. That and the Bulldozer-generation CPUs heated my house like crazy, there's no denying that lol
Ugh… is the last part still happening? Like, do the new CPUs also run that hot, or whatever you'd call it?
I am tempted to build a new all-AMD PC for cost alone, although AM4 probably won't last as long as AM3 did, sadly. But the summer is already terrible with my Intel… no need for more heat.
No, Bulldozer chips have been gone for like 6-7 years. The last two Ryzen generations have been far more energy/heat efficient than Intel. Ryzen is the better choice by far right now.
Current Intel is worse than current AMD for CPU heat, and Nvidia currently runs cooler than AMD on the GPU side. Also, we're on AM5 now. AM4 lived a relatively long time, and there's no indication AM5 won't be a long runner as well. Intel changes sockets more often too, so for longevity AMD is almost always the best bet, except at the tail end of a socket.
Huh. Didn't even know they replaced AM4 until this comment 😂 My AM4 Ryzen 5 paired with an RX 6700 XT still does everything I want it to do. And if it starts slacking, I have plenty of upgrading left to do.
I have a suspicion that developers do less testing, optimization, and bugfixing for AMD cards due to their smaller market share, and that's why more of these brand-specific coding errors slip through. It's unfortunate, but I can't deny I've seen some weird bugs in my time.
How can an AMD-sponsored game that literally runs better on AMD GPUs than on their Nvidia counterparts, and that doesn't ship any tech that might disadvantage AMD GPUs, be less QA'd on AMD hardware because of market share?
This game IS better optimized for AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn't work for this game.
Some games are built specifically for AMD from the ground up and have been optimized like crazy. It depends on the game and the devs mostly. And let's not forget that if devs want a game to run well on PS5 and Xbox Series X/S, they'd better have good AMD optimization.
And, being Bethesda, it’s not like bugs are unexpected.
Oh, of course. I don't actually blame AMD for those kinds of bugs. But it's the reality as a user, at least in my experience… though it's been a stupidly long time since I've used a machine with an AMD card.
You make it sound like Nvidia has never pushed out a kinda-sorta-works driver.