Thanks! Isn’t it better to train the embedding with the model I expect to use it with?
I think Automatic1111 runs on most CPUs, but you need a lot of RAM.
Cars are for the unrefined masses. Men of culture prefer trains and public transportation.
There is a pretty big FuckCars community on Lemmy.
WallStreetBets
Automatic1111 is amazing, isn’t it?
I want to try it out before making a major investment.
Awesome!
Auto1111 might be trying to load multiple models at the same time, which it does not have room for.
SDXL is very memory hungry. Most base models are around 6-7 GB, which doesn’t leave much room for anything else.
Not surprising. Linux is usually faster, which is why the backend of every internet service uses Linux.
Telegram is a surprisingly good app.
I wish other apps were half as good as Telegram.
Pipewire is amazing. Linux had issues with Bluetooth audio that Pipewire finally fixed.
Yo ho yo ho 🏴☠️🏴☠️🏴☠️🏴☠️
One could make the case that the loss of West Virginia is a benefit in itself.
It’s in development. I think there is beta functionality already that you can try.
No, I have not. Let me give it a try, thanks.
Is that why India has the best programmers?