Even with a defensive browser that runs over Tor and blocks popups, CAPTCHAs, dark-pattern-laden cookie walls, and other garbage, we still end up on the losing end of the arms race. The heart of the problem is that privacy enthusiasts are exposed to the same search engine rankings that serve the privacy-naïve/unconcerned masses.
Would it make sense for the browser to autodetect various kinds of enshittification, add the hostname to a local DB for future use, then report the hostname anonymously over Tor to a central DB that serves as an enshittification tracker? The local and central DBs could then be used to down-rank those sites in future results. And if a link to an enshittified site appears on a page unrelated to searches, it could be flagged with a “⚠”. Some forms of enshittification would probably need manual detection, but I could see people being motivated to contribute.
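To make the idea concrete, here is a minimal sketch of what the local side could look like: a SQLite table of flagged hostnames, plus the two uses described above (down-ranking results and prefixing links with a warning). All class and method names are hypothetical; this assumes flagging happens elsewhere (heuristics or manual reports) and just covers storage and lookup.

```python
# Sketch of a local enshittification DB, assuming a simple SQLite
# table of flagged hostnames. All names here are hypothetical.
import sqlite3
from urllib.parse import urlparse


class EnshitDB:
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS flagged (host TEXT PRIMARY KEY, reason TEXT)"
        )

    def flag(self, url, reason):
        # Record the hostname of an offending page, with a reason string.
        host = urlparse(url).hostname
        self.db.execute("INSERT OR REPLACE INTO flagged VALUES (?, ?)", (host, reason))

    def is_flagged(self, url):
        host = urlparse(url).hostname
        cur = self.db.execute("SELECT 1 FROM flagged WHERE host = ?", (host,))
        return cur.fetchone() is not None

    def rerank(self, result_urls):
        # Push flagged hosts to the bottom; sort is stable, so relative
        # order within each group is preserved.
        return sorted(result_urls, key=lambda url: self.is_flagged(url))

    def annotate(self, url):
        # Prefix links to flagged sites with a warning marker.
        return ("\u26a0 " if self.is_flagged(url) else "") + url
```

The same interface would work whether the rows come from local autodetection or a synced copy of the central tracker.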
The security and integrity of a centralized DB would perhaps be the hardest part of the effort. But if that could be sorted out, we could get search results to prioritize pro-user resources. In principle the DB could also track access methods by which a website is garbage-free (e.g. if the garbage does not manifest when viewed in Lynx, that should be captured in the DB as well).
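The access-method idea suggests the central DB records more than a bare hostname. A rough sketch of one record shape, assuming per-host lists of observed garbage and of clean access methods (field names are made up for illustration):

```python
# Hypothetical record shape for the central tracker: what is wrong
# with a host, and which access methods avoid the garbage.
from dataclasses import dataclass, field


@dataclass
class HostRecord:
    host: str
    garbage: list              # e.g. ["cookie wall", "CAPTCHA"]
    clean_via: list = field(default_factory=list)  # e.g. ["lynx", "reader mode"]


rec = HostRecord("news.example", ["cookie wall"], ["lynx"])
```

A browser syncing such records could then both down-rank the host and offer the known-clean access method as a one-click fallback.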