Okay. Good for China?
This seems like a really weird way to say “EU countries aren’t investing enough into green tech”.
Depends on the specifics. My high-end MacBook Pro uses active cooling, but in practice it almost never comes on. It’s wayyyyy more efficient than the previous Intel gen.
A week or two ago, I accidentally left a Python process running at 100% of a single core. I didn’t even notice for several hours, until it ate up all my RAM. On an Intel laptop, the fan would’ve let me know in like two minutes.
I don’t think Qualcomm’s actually caught up to Apple yet, but it’s getting close. It’s good to see more competition.
For all the talk of regulating AI, I think the only meaningful regulation is very simple: hold the people implementing it accountable.
You want to use AI instead of a real certified professional? Go nuts. Let it write your legal contracts, file your taxes, diagnose your patients. But be prepared to get sued into oblivion when it makes a mistake that real professionals spend years of expensive training learning to avoid. Let the insurance industry do the risk assessment and see how unviable it is to replace human experts when there’s human accountability.
BTRFS also supports deduplication, but not automatically. duperemove will do it, and you can set it up as a cron job if you want.
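For reference, a minimal sketch of that setup, assuming a BTRFS filesystem mounted at /data (the mount point and hashfile path are just placeholders):

```shell
# -d actually performs the dedupe (without it, duperemove only reports
# duplicates), -r recurses into subdirectories, and --hashfile caches
# block hashes on disk so repeat runs only rescan changed files.
duperemove -dr --hashfile=/var/cache/duperemove.hash /data

# Example crontab entry to run it weekly (Sundays at 03:00):
# 0 3 * * 0 /usr/bin/duperemove -dr --hashfile=/var/cache/duperemove.hash /data
```

The hashfile is what makes the cron approach practical; without it, every run rehashes the whole filesystem.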
How so? Perhaps I’m misremembering, but they were born on Earth and raised among humans, right? Does that not say something about the human culture of their time?
It was presented as exceptional in-universe, from Adira’s perspective. The fact that Adira felt weird about it at all paints the culture they grew up in as backwards.
The problem I had with that scene (and the whole series, really, especially season 3) was that it framed human culture of the future as being generally oppressive and backwards. Acceptance shouldn’t be portrayed as radical or exceptional. It should be normal and taken for granted among humans in the future. Like in TOS, Uhura’s role was a big deal for viewers specifically because it was not a big deal for the characters. They just showed us a better future, where a black woman in a respected professional position was normal.
Discovery didn’t show us a better future. It showed us a shitty future with a handful of decent people in it. This is just one example, but it’s one that stuck in my mind as well.
Anyone know what charging protocol they use for 80W? This article and the official web site do not specify. Is it USB-PD? SuperVOOC? I’m not really familiar with Vivo specifically.
If it’s USB-PD then that means you could use a typical laptop charger. If it’s VOOC, then it’s unlikely you’ll have any compatible chargers.
That would be a somewhat valid argument if Snaps “just worked” any better than Flatpaks. That has not been my experience.
Given the choice between an open standard and a proprietary one, the proprietary one damn well better have meaningful technological advantages. I don’t see that with Snaps. All I see is a company pouring effort into a system whose only value is that they are pouring effort into it. They should put that effort into something better.
Granted, it’s been a few years since I used Ubuntu and Snaps. Perhaps things have improved. It was nothing but headaches for me. A curse upon whoever decided to package apps that obviously require full file system access as Snaps. “User-friendly”, indeed.
From an enterprise/server perspective, when what you’re really paying for is first-party support, I guess Snaps make more sense. But again, that effort could be put toward something more useful.
Am I out of touch with Qualcomm’s increasingly confusing naming schemes, or is that awfully expensive for a 7sG2?
What’s this? A software app store?
It’s ironic how on Linux, my distro’s app repository is always my first stop when looking for software, while on Mac or Windows it’s my last resort.
Commercialized app stores are full of spam, and Microsoft and Apple both decided that app store apps should not have the full capabilities of normal apps. It’s the exact opposite on Linux.
Thanks for the recommendation! I was looking at the Fedora family since AMD officially supports RHEL 9. Hadn’t gotten as far as to figure out how well that transfers to Fedora and its derivatives. Good to hear that it works.
If you’re only testing on one set of hardware, it isn’t going to tell the whole story. The results might be very different on an AMD vs Nvidia GPU, or even on a brand-new vs 1-3 generation old GPU.
Probably the most important thing for gaming is driver support and ease of installation. This sometimes runs directly counter to other general-purpose needs.
I’m still on the hunt for a distro where everything I need is easy to install. I don’t think any exist, primarily because GPU drivers suuuuuuuck, especially when you need CUDA or ROCm to work.
This is the great thing about open source. It benefits everyone. Any good idea that does not have significant drawbacks should get broad adoption. And that’s generally how it plays out.
Reputations live on for many years (decades, even) after they stop being justified.
Emulation.
Definitely going to incur a performance hit relative to native code, but in principle it could be perfectly good. It’s not like the GPU is running x86 code in the first place. On macOS, Apple provides Rosetta to run x86 Mac apps, and it’s very, very good. Not sure how FEX compares.
Correct.
Batteries will still lose charge very slowly, so at some point the battery controller will top itself back up. This is nothing to worry about, and I’m not sure macOS (or Linux) will ever display the true charge level of a battery. I believe there is some wiggle room built in at the firmware level.
When MacBooks are plugged in, they draw their power from the charger. In general, they are not simultaneously draining and charging the battery, unless they need more power than the charger can supply (unlikely unless you’re using a lower-wattage charger than the official one that came with your laptop).
I was not able to find an official source on this from a quick search, but if I remember correctly, this should be true for any moderately recent MacBook. Maybe any MacBook at all, since they only started making “MacBooks” in 2006 and the tech hasn’t changed much since then.
Personally, I leave my MBP plugged in during use whenever possible, and I typically unplug it at the end of the day. You don’t need to unplug it, but hey, it’s a good idea to unplug anything that doesn’t need to be plugged in, just to save power.
This is my plan A. I’ll only go to plan B if something goes wrong — which has happened to me a couple times. I tried to upgrade Ubuntu (LTS, I forget which version) years ago, but it failed hard. I still don’t know why. It wasn’t something I could figure out in half an hour, and it wasn’t worth investing more time than that.
Come to think of it, it’s possible all my upgrade woes came down to Nvidia drivers. It was a common problem on Suse (TW), to the point where I pinned my kernel version to avoid the frequent headaches. I’ll try a rolling distro again when I switch to AMD, maybe.
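For anyone wanting to do the same pin on openSUSE: zypper supports package locks, which it stores in /etc/zypp/locks. A sketch of the resulting entry, assuming the stock kernel-default package (check your actual kernel package name first):

```
# Created by `sudo zypper addlock kernel-default`; zypper dup will then
# skip kernel updates until the lock is removed with
# `sudo zypper removelock kernel-default`.
type: package
match_type: glob
case_sensitive: on
solvable_name: kernel-default
```

The downside is you stop getting kernel security fixes while the lock is in place, so it’s a stopgap, not a long-term fix.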
Pulaski had interesting dynamics with almost every other character. I think she was written very well, especially for such a short tenure. Crusher was largely neglected by the writers.
These don’t seem like competing needs. When I think “just work with minimal hassle”, I don’t think “I need to restrict myself to outdated hardware”.
I’m perfectly happy running old packages in general. I’m still on Plasma 5, and it works just as well as it did last year. But that’s a matter of features, not compatibility. Old is fine; broken is not.