  • I gave a quick overview in another post, but in case you’d like some guidance unique to your repair, post pictures of the port’s legs and the spot it goes into, with good enough focus and zoom that I can see the traces, pads, and legs.

    I do this kind of repair work every day. If everything is in good shape it’s no problem, but if you have a damaged port or ripped pads/traces then you’ll need to change tack a little.



  • You’re gonna need a hot air station. If it’s actually USB 3 and not just plain USB in a C-sized connector, then you usually won’t be able to do it with just an iron.

    The process is:

    Clean the site and all through-holes (they won’t be for USB pins but instead for the physical connection). Use flux.

    Reapply solder to all surface mount pads.

    Preheat the area.

    Apply flux.

    Manipulate the new port into place. If you can, tack it down at a few spots.

    Reflow all the pins. Use flux.

    Have you confirmed that the port itself isn’t damaged? The pads it connects to?



  • I’m making this reply to help you better respond to people, not to start a fight:

    Hey, just a heads up: your CFR link directly cites Adrian Zenz, a person many people don’t trust to make even-handed statements about China, twice.

    The Vox link sources statements from him several times. I tried to quickly parse what was what, but I came up with seven different statements.

    The BBC article seems to reference Zenz in six different claims.

    I wasn’t able to give these articles a deep read, or check if the other sources also pull from that particular controversial figure or his organizations.





  • Short answer: no.

    Long answer: also no, but in some specific circumstances yes.

    Your display uses energy to do two things: change the colors you see and make them brighter or dimmer. Honestly speaking, it also has a little processor in it, but that sucker is so tiny and energy efficient that it isn’t affecting things much, and you can’t affect it anyway.

    There are two ways to do the things your display does. One way is to have a layer of tiny shutters that open up when energized and allow light through their red, blue, or green tinted windows in front of a light source. In this case you can use two techniques to reduce the energy consumption: open fewer shutters or reduce the intensity of the light source. Opening fewer shutters seems like it would be part of lowering the resolution, but when you lower the resolution you just get more shutters open for one logical “pixel” in the framebuffer (more on that later).

    Another way to do what your display does is to have a variable light source behind each tinted window and send more or less luminance through each one. In this case there is really only one technique you can use to reduce the energy consumption of the display, and that’s turning down the brightness. Lowering the resolution has the same effect here as it did before. It’s worth noting that a “darker” displayed image will consume less energy in this case, so if you have an OLED display, consider using a dark theme!
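    To put a rough number on that dark-theme point, here’s a minimal sketch in Python of the idea that an emissive (OLED-style) panel draws power roughly in proportion to how bright its pixels are driven. The 6 W full-white figure is a made-up placeholder, not a measured spec:

        # Rough model: every subpixel draws power roughly in proportion to
        # how bright it's driven. FULL_WHITE_WATTS is a hypothetical figure.
        FULL_WHITE_WATTS = 6.0

        def estimated_panel_watts(frame):
            """frame: 2D list of pixel brightness values in 0.0..1.0."""
            total = sum(sum(row) for row in frame)
            count = sum(len(row) for row in frame)
            return FULL_WHITE_WATTS * (total / count)

        dark_frame = [[0.1] * 100 for _ in range(100)]   # mostly-dark "dark theme" frame
        light_frame = [[0.9] * 100 for _ in range(100)]  # mostly-white "light theme" frame

        print(estimated_panel_watts(dark_frame))   # ~0.6 W
        print(estimated_panel_watts(light_frame))  # ~5.4 W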

    So the display itself shouldn’t save energy with a lowered resolution.

    Your GPU has a framebuffer, which is some memory that corresponds to the display frame. If the display is running at a lower resolution the framebuffer will be smaller, and if it’s running at a higher resolution it’ll be bigger. Memory is pretty energy efficient nowadays, so the effect of a larger framebuffer on energy consumption is negligible.
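    For a sense of scale, here’s a quick back-of-the-envelope sketch in Python (assuming a common 4-bytes-per-pixel format) of how much memory a single framebuffer actually takes:

        # Framebuffer size = width x height x bytes per pixel.
        # 4 bytes/pixel (8-bit RGBA/XRGB) is a common format.
        BYTES_PER_PIXEL = 4

        def framebuffer_mib(width, height):
            return width * height * BYTES_PER_PIXEL / (1024 * 1024)

        print(framebuffer_mib(1920, 1080))  # ~7.9 MiB
        print(framebuffer_mib(3840, 2160))  # ~31.6 MiB, bigger but still tiny
                                            # next to gigabytes of VRAM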

    Depending on your refresh rate, the framebuffer gets updated some number of times a second. But the GPU doesn’t completely wipe, rewrite, and resend the framebuffer; it just changes the stuff that needs it. So when you move your mouse at superhuman speed exactly one cursor width to the left in one sixtieth of a second, two cursor-sized areas of the framebuffer get updated: the place the cursor was gets updated to reflect whatever was underneath, and the place the cursor now is gets updated with a cursor on it.
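    That “only touch what changed” idea is usually called damage tracking or dirty rectangles. A toy sketch of it in Python (a dict of tiles standing in for the framebuffer, not any particular GPU’s API):

        # Compare the new frame to the current one tile by tile and only
        # rewrite tiles that actually changed, like the two cursor-sized
        # spots in the mouse example above.
        def update_dirty_tiles(framebuffer, new_frame):
            """Both are dicts mapping (tile_x, tile_y) -> tile contents."""
            updated = 0
            for tile_pos, new_tile in new_frame.items():
                if framebuffer.get(tile_pos) != new_tile:
                    framebuffer[tile_pos] = new_tile  # only this tile is rewritten
                    updated += 1
            return updated  # number of tiles actually touched this refresh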

    Okay but what if I’m doing something that changes the whole screen at my refresh rate? In that case the whole framebuffer gets updated!

    But that doesn’t often happen…

    Let’s say you’re watching a movie. It’s 60 fps source material, so wouldn’t the framebuffer be updating 60 times a second? No! Not only is the video itself encoded to reflect that colors don’t change from frame to frame, so that the thing decoding it doesn’t need to worry about those parts, but the decoder is also actively looking for even more ways to avoid doing the work of changing parts of the framebuffer.

    So the effect of a larger framebuffer on battery is minimized while playing movies, even when the framebuffer is huge!

    But actually decoding a 3K movie is much more CPU intensive than 1080p. So maybe watch in 1080p, but that’s not about your display or its resolution; it’s the resolution of the source material.

    Okay, but what about games? Games use the framebuffer too, but because they aren’t pre-encoded, they can’t take advantage of someone having already done the work of figuring out what parts are gonna change and what parts aren’t. So you pop into E1M1 and the only way the computer can avoid updating the whole framebuffer is when the stuff Chocolate Doom sends it doesn’t change the whole framebuffer, like those imps marching in place.

    But Chocolate Doom still renders the whole scene, using computer resources to calculate and draw the frame and send it to the framebuffer, which looks up and says “you did all this work just to show me imp arms swinging over a one-inch-square portion of screen area”?

    But once again, Chocolate Doom takes more computer resources to render a 3K E1M1 than one in 1080p, so maybe turn down your game resolution to save that energy.

    Hold on, what about that little processor on the display? Well, it can do lots of stuff, but most of the time it’s doing scaling calculations so that when you run Chocolate Doom full screen at 1080p the image is scaled across the whole screen as accurately and as nicely as possible instead of being stuck at the top left or in the middle or something. So in that case you could actually make that little sucker do less work and use less energy by running at the display’s “native” resolution than if you were at 1080p.
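    As a tiny illustration of why native resolution leaves the scaler with nothing to do (plain Python; the panel resolution used here is just a hypothetical example):

        # If the rendered image already matches the panel's native resolution,
        # the scaler can pass it straight through; otherwise it has to
        # interpolate every output pixel, which is extra work.
        def scaler_work(render_res, native_res):
            if render_res == native_res:
                return "passthrough (no scaling work)"
            sx = native_res[0] / render_res[0]
            sy = native_res[1] / render_res[1]
            return f"scale by {sx:.2f}x{sy:.2f} (interpolate every output pixel)"

        print(scaler_work((2560, 1440), (2560, 1440)))  # passthrough
        print(scaler_work((1920, 1080), (2560, 1440)))  # scaling work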

    So when Jigsaw traps you in his airport-terminal-shaped funhouse and you wake up with the exploder on your neck, a note in front of you that says “kill Carmack”, and no charger in your bag, yes, you will save energy running at a lower resolution.

    E: running Chocolate Doom at a lower resolution, not the display.



  • You mainly want to be able to do 3D and video editing, right?

    Those two, specifically with DaVinci Resolve and Blender, work best with NVENC and libcuda(?), the software libraries that let you take advantage of your Nvidia card’s encoders and CUDA cores.

    So if you were building for that workload, you’d have an Nvidia card, and many of the problems people encounter in Wayland come from using it with an Nvidia card.

    So yeah, it’s the Nvidia support. Most people will say “fuck Nvidia, just don’t buy their hardware”, but it’s the best choice for you and would be a huge help, so choosing between Wayland and Nvidia is a no-brainer.

    It is a bummer that you’ll need to install X specially, but I’d be really surprised if there isn’t decent support for that.

    There’s always the hope that Wayland will get better over time and you’ll be able to use it in a few years.

    E: a word on encoding: both AMD and Intel CPUs have video encode and decode support, but Intel’s QSV is more widely supported and tends to be faster most of the time. When people suggest Intel’s Arc GPUs, they’re saying it because those GPUs use QSV, and for a video-editing workstation they’d be a good choice.
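    If you want to check which hardware encoders your ffmpeg build actually exposes, here’s a quick sketch (Python shelling out to ffmpeg; it assumes ffmpeg is installed and on your PATH):

        import subprocess

        # Ask ffmpeg which H.264 encoders it was built with. h264_nvenc is
        # Nvidia's encoder, h264_qsv is Intel Quick Sync, h264_amf is AMD's.
        out = subprocess.run(
            ["ffmpeg", "-hide_banner", "-encoders"],
            capture_output=True, text=True, check=True,
        ).stdout

        for name in ("h264_nvenc", "h264_qsv", "h264_amf"):
            print(name, "available" if name in out else "not built in")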

    Part of the reason I put Intel and AMD CPUs on an even footing for you is that any cost savings you’d get from going AMD would likely be offset by the performance decrease. There are some good breakdowns of CPU encoder performance out there if you want to really dive in, but remember that you’re also in a good place to buy Intel because of the crazy deals from the sky-is-falling people.

    That kinda ties into the cores-over-threads thing too. If your computer’s workload is a bunch of little stuff, then you can really make hay with a scheduler that is always switching stuff around. One of the things that makes AMD’s 3D V-Cache processors so good at that stuff is that they have a very big cache, so they’re able to extend the benefit of multithreading schedulers up to larger processes. You’re looking at sending your computer a big ol’ chunk of work though, so you’re not usually gonna be multithreading with that powerful scheduler and instead just letting cores crunch away.

    Part of the reason I didn’t suggest Intel’s Arc stuff is that you’re also doing 3D work, and being able to take advantage of the very mature CUDA toolchain is more important.

    Plus, Nvidia encoding is also great, and if you were to pair it with an Intel CPU you could have the best of both worlds.

    You’re really looking to build something different from what most people build, and that’s why my advice was so against the grain. Hope you end up with a badass workstation.