What a lovely fucking precedent to have.
Dual-booting is possible and easy: you just have to shrink the Windows partition and install Linux next to it. Make sure not to format the whole disk by mistake, though. A lot of Linux installers want to format the disk by default, so you have to pick manual partitioning and shrink (not delete and re-create!) the Windows partition.
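For the curious, here’s roughly what the installer’s “shrink” is doing under the hood. A sketch only, not a recipe: the device names and sizes below are made up, you’d run this from a live USB with the partition unmounted, and in practice GParted or the installer’s own resize tool does all of this for you. (Windows’ fast startup also needs to be off first, or the filesystem will be marked as in use and ntfsresize will refuse to touch it.)

```
lsblk -f                                           # find the NTFS partition; let's say it's /dev/sda3
sudo ntfsresize --no-action --size 200G /dev/sda3  # dry run: check the shrink is even possible
sudo ntfsresize --size 200G /dev/sda3              # shrink the filesystem itself
sudo parted /dev/sda resizepart 3 200GiB           # then shrink the partition around it
                                                   # (the argument is the partition's new END, not its size)
```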
As for its usefulness, however… Switching OSes is incredibly annoying. Every time you want to switch, you have to shut the system down completely and boot the other one up. That means you have to stop everything you’re doing, save all your progress, and then try to get back up to speed two minutes later. After a while, the constant rebooting gets really old.
Furthermore, Linux is a completely different system that shares only some surface-level things with Windows. Switching to it basically means re-learning how to use a computer almost from scratch, which is also incredibly frustrating.
The two things combined very quickly turn into a temptation to just keep using the more familiar system. (Been there, done that.)
I think I’ll have to agree with people who propose Virtual Machines as a solution.
Running Linux in a VM on Windows would let you play around with it, tinker a little and see what software is and isn’t available on it. From there you’ll be able to decide if you’re even willing to dedicate more time and effort to learning it.
If you decide to continue, you can still dual-boot Windows and Linux. Not to be able to switch between the two, but to be able to back out of the experiment.
Instead, the roles of the OSes could be reversed: a second copy of Windows could be installed in a VM which, in turn, would run on Linux.
That way, you’d still have a way to run some more picky Windows software (that is, software that refuses to work in Wine) without actually booting into Windows.
This approach would maximize exposure to Linux while still letting you back out of the experiment at any moment.
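If you go that route, spinning up the Windows VM on a stock libvirt/QEMU setup looks roughly like this. Sketch only: the VM name, the sizes, and the ISO path are all placeholders.

```
# assumes libvirt + virt-install are installed and libvirtd is running
virt-install \
  --name win-vm \
  --memory 8192 \
  --vcpus 4 \
  --disk size=80 \
  --cdrom ~/Downloads/windows.iso \
  --os-variant win10
```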
Wayland has its fair share of problems that haven’t been solved yet, but most of those points are nonsense.
If that person lived a little over a hundred years ago and wrote a rant about cars vs horses instead, it’d go something like this:
Think twice before abandoning Horses. Cars break everything!
Cars break if you stuff hay in the fuel tank!
Cars are incompatible with horse shoes!
You can’t shove your dick in a car’s mouth!
The rant you’re linking makes about as much sense.
In the case of Gnome, it was addressed, just by different people. Gnome 2 lives on as MATE, so anyone who doesn’t like Gnome 3 can use that instead.
To provide features that Xorg can’t.
If you don’t need features like fractional scaling, VRR, touchscreen gestures, etc. you won’t notice a difference.
People who do use them will, because for them those features would be missing or incomplete on Xorg.
To be honest, most things in Nobara can be installed on/done to regular Fedora. And, unlike Nobara, Fedora has more than one maintainer: good for the bus factor.
Focusing on the things I need to actually do.
I swear, even if I was forced to do something at gunpoint, I’d manage to get distracted anyway.
Almost everything that’s not Gnome can be considered lightweight, to be honest.
“Our goal is knowledge, so we’re going to obfuscate everything to fuck and make things unreadable”
1k USD. Should be enough to leave my shithole of a country, if I’m lucky.
Corporations have been trying to control more and more of what users do and how they do it for longer than AI has been a “threat”. I wouldn’t say AI changes anything. At most, maybe, it might accelerate things a little. But if I had to guess, the corpos are already moving as fast as they can with locking everything down for the benefit of no one but them.
“AI” models are, essentially, solvers for mathematical systems that we humans cannot describe or build solvers for ourselves.
For example, a calculator for pure numbers is a pretty simple device, all the logic of which can be designed by a human directly. A language, though? Or an image classifier? Those are not possible to create by hand.
With “AI”, instead of designing all the logic manually, we create a system that can end up in a finite, yet still near-infinite, number of states, each of which defines behavior different from the others. By slowly tuning the model on existing data and checking its performance, we (ideally) end up with a solver for some incredibly complex system.
If we were to try to make a regular calculator that way, and all we ever gave the model was “2+2=4”, it would memorize the equation without understanding it. That’s called “overfitting”, and it’s something the people building AI try their best to prevent. It happens when the training data contains too many repeats of the same thing.
However, if there is no repetition in the training set, the model is forced to actually learn the patterns in the data instead of the data itself.
Essentially: if you train a model on a single copyrighted work, you’re making a copy of that work via overfitting. If you use terabytes of diverse data, overfitting is minimized, and the resulting model ends up with an actual understanding of the system you’re training it on.
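To make the overfitting point concrete, here’s a toy numpy sketch. It has nothing to do with how real models are trained; it only shows the memorize-vs-generalize mechanic: the same over-parameterized model memorizes noise when fed a handful of samples, and is forced to learn the actual pattern when fed plenty of diverse ones.

```python
import numpy as np

# The "system" we want a solver for is y = 2x + 1.
# The "model" is a deliberately over-parameterized degree-9 polynomial.
rng = np.random.default_rng(0)

def test_error(n_train):
    x = rng.uniform(-1, 1, n_train)
    y = 2 * x + 1 + rng.normal(0, 0.1, n_train)  # noisy training samples
    coeffs = np.polyfit(x, y, deg=9)             # "train" the model
    x_test = rng.uniform(-1, 1, 1000)            # data the model has never seen
    return np.mean((np.polyval(coeffs, x_test) - (2 * x_test + 1)) ** 2)

print("trained on 10 samples:  ", test_error(10))    # memorizes the noise -> big error
print("trained on 1000 samples:", test_error(1000))  # learns the pattern -> small error
```

The first number typically comes out orders of magnitude worse than the second, for exactly the reason described above.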
So… You say nothing will change.
OpenSUSE + KDE is a really solid choice, I’d say.
The most important Linux advice I have is this: Linux isn’t Windows. Don’t expect things to work the same.
Don’t try too hard to re-configure things to match the way they were on Windows. If there isn’t an easy way to get a certain behavior, there’s probably a reason for it.
If it’s the data side that got damaged, you might be able to restore the disc, as long as the damage isn’t major. The actual data is written on a thin film that’s sandwiched between two layers of plastic. The plastic on the outside can be ground down and polished back to a smooth, clean finish. Disc polishers used to be kinda popular back in the day.
I have a 120 gig SSD. The system takes up around 60 gigs, plus BTRFS snapshots and their overhead. I have around 15 gigs of wiggle room, on average. Trying to squeeze some /home stuff in there doesn’t really seem that reasonable, to be honest.
As long as you don’t re-format the partition. Not all installers are created equal, though, so re-installing the OS without wiping the partition entirely might be more complicated. Or it might be just fine. I don’t install Linux often enough to know. ¯\_(ツ)_/¯
You can put your /home on a different BTRFS subvolume and exclude it from being snapshotted.
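A minimal sketch of that, assuming the common “@”-style layout where your snapshot tool only snapshots the root subvolume (the device name and UUID are placeholders):

```
sudo mount -o subvolid=5 /dev/sda2 /mnt   # mount the top level of the btrfs filesystem
sudo btrfs subvolume create /mnt/@home    # create the subvolume for /home
# move the contents of /home over, then mount it via fstab, e.g.:
#   UUID=<fs-uuid>  /home  btrfs  subvol=@home  0 0
```

Snapshots aren’t recursive in BTRFS, so once /home lives on its own subvolume, snapshots of the root subvolume simply won’t include it.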
Oh, I guess that’s slightly better. At least this fucking idiocy didn’t make it into, essentially, law. But it also means that Nintendo (and other corpos) will not stop suing people left and right.
At what point will they sue fucking computer manufacturers, I wonder? Clearly, the ability to run unsigned code facilitates the creation of code that’s illegal (such as DRM circumvention tools and fucking Nintendo emulators), which, in turn, obviously facilitates piracy of Nintendo games! Poor Nintendo is losing dozens of dollars because of those evil, evil computers, which are clearly used for pirating their games and nothing else! This needs to stop!