A user on Reddit (ed: u/Yoraxx) posted on the Starfield subreddit about a problem when running the game on an AMD Radeon GPU. The issue is simple: the game just won't render the star in any solar system when you are on the dayside of a moon or any other planetary body. The issue only occurs on AMD Radeon GPUs, with users on both Radeon RX 7000 and RX 6000 series cards reporting the same thing.
The dayside of any planetary body or moon needs a light source to be lit up. That source is the star, and on any non-AMD GPU you will see the star/sun in the skybox illuminating the surface. On AMD cards, however, the star/sun just isn't there, while the planet/moon remains lit with no visible light source.
Waaaaait… it was a bug and not gross incompetence?
“Bethesda’s Bug”, when you can’t tell if something isn’t working correctly or if it’s just not implemented at all.
I don’t think we know.
Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol
The game runs better on AMD, and Bethesda partnered with AMD in some way for this PC release.
Does it run better by not rendering light emitting objects?
Perhaps. Who needs stars anyway?
All GPUs perform equally well at ray tracing when there are no rays to trace.
That really just means AMD gave them a lot of money, and they just made sure FSR2 worked. lol
I’ve got a 7900 XTX on Ultra, and FSR2 does literally nothing, which is hilarious.
100% resolution scale, 128 FPS.
75% resolution scale … 128 FPS.
50% resolution scale, looking like underwater potatoes … 128 FPS.
I don’t know how it’s possible to make an engine this way. It seems CPU-bound, and I’m lucky I upgraded my CPU not too long ago: I’m outperforming my friend who has an RTX 4090 in literally every scene, indoor, ship, and outdoor/planet.
He struggles to break 70 FPS on 1080p Ultra, meanwhile I’m doing 4K Ultra.
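The pattern above (identical FPS at 100%, 75%, and 50% resolution scale) is the classic signature of a CPU-bound frame loop. A toy model, with made-up numbers rather than anything measured from Starfield: each frame costs the larger of the CPU time and the GPU time, and only the GPU cost shrinks with render resolution, so an upscaler like FSR2 can't help once the CPU is the bottleneck.

```python
# Toy model of a CPU-bound frame loop. Numbers are illustrative,
# not measured from Starfield or any real engine.

def frame_ms(cpu_ms: float, gpu_ms_at_native: float, scale: float) -> float:
    # GPU work shrinks roughly with the square of the resolution scale
    # (half the scale means a quarter of the pixels); CPU work does not.
    gpu_ms = gpu_ms_at_native * scale ** 2
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms: float, gpu_ms_at_native: float, scale: float) -> float:
    return 1000.0 / frame_ms(cpu_ms, gpu_ms_at_native, scale)

# Suppose the CPU takes ~7.8 ms/frame (~128 FPS) and the GPU only
# 5 ms at native resolution: the scale slider changes nothing.
for scale in (1.00, 0.75, 0.50):
    print(f"{scale:.0%} scale: {fps(7.8, 5.0, scale):.0f} FPS")
# → every line prints 128 FPS
```

The same model also explains the power-usage observation further down: at a lower scale the GPU finishes its (smaller) share of the frame sooner and idles, so it draws less power even though the FPS counter never moves.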
The Creation Engine has always been CPU-bound, going back to the Gamebryo era.
I have noticed it has better anti-aliasing than the forced TAA (once I forced that off).
Some of the benchmarks definitely pointed out that it was CPU-bound in many areas (e.g. the cities).
I think the HUB one mentioned that some of the forested planets were much more GPU-bound and better for testing.
I’m on a TV so I’m capped at 60 FPS, but I do see a pretty substantial power-usage difference between FSR 75% and FSR 100% on my 7900 XT.
#include "fsr2.h"
Ok, can we have the monies please?
It can be both
If it’s down to very specific chipsets, that sounds like an unforeseeable bug.
An unseeable unforeseeable bug?
Correction: someone pointed out they are literally interfacing with the graphics drivers the wrong way, so it’s still on their devs.
I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn’t have a clue what they were talking about since it was always fine for my Nvidia card.