my dumbass read
Cook yourself
I presume you have experience with these traps?
(≧ω≦)
I mean I agree with most of them but bri’ish?? Never!
This looks exactly like an image that would accompany an SCP article lol
Sure! You’ll probably want to look at train-text-from-scratch in the llama.cpp project; it runs on pure CPU. The (admittedly sparse) docs should help, and otherwise ChatGPT can help if you show it the code. NanoGPT is fine too.
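If it helps, here’s the shape of what those tools do, boiled down to a toy. This is my own sketch, not code from either project; the `frwiki.txt` path and every hyperparameter are placeholders:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumes a plain-text corpus at frwiki.txt (small enough to fit in RAM).
text = open("frwiki.txt", encoding="utf-8").read()
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

vocab, block, batch, dim = len(chars), 128, 32, 256

class TinyLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)   # token embeddings
        self.pos = nn.Embedding(block, dim)   # learned positional embeddings
        layer = nn.TransformerEncoderLayer(
            dim, nhead=4, dim_feedforward=4 * dim, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(dim, vocab)     # project back to the vocabulary

    def forward(self, idx):
        t = idx.shape[1]
        x = self.emb(idx) + self.pos(torch.arange(t, device=idx.device))
        # Causal mask so each position only attends to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(idx.device)
        return self.head(self.blocks(x, mask=mask))

model = TinyLM()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)

for step in range(1000):  # tiny run, just to watch the loss drop
    ix = torch.randint(len(data) - block - 1, (batch,))
    xb = torch.stack([data[i:i + block] for i in ix])          # inputs
    yb = torch.stack([data[i + 1:i + block + 1] for i in ix])  # next-char targets
    loss = F.cross_entropy(model(xb).reshape(-1, vocab), yb.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```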
For the dataset, maybe you could train on French Wikipedia, or scrape a French story site or fan fiction or whatever. Wikipedia is probably easiest, since they provide downloadable offline dumps that are only a few gigs compressed.
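If you go the Wikipedia route, grabbing and roughly de-tagging the dump can look something like this. The dump URL follows Wikimedia’s standard pattern; the regex cleanup is a crude placeholder, and a real pipeline would use something like WikiExtractor to strip the markup properly:

```python
import bz2
import re
import urllib.request

# Wikimedia's standard dump URL pattern for French Wikipedia.
URL = ("https://dumps.wikimedia.org/frwiki/latest/"
       "frwiki-latest-pages-articles.xml.bz2")
urllib.request.urlretrieve(URL, "frwiki.xml.bz2")  # several GB, be patient

# Stream the compressed XML and keep only the article text, crudely stripped
# of tags and the most common wiki markup.
with bz2.open("frwiki.xml.bz2", "rt", encoding="utf-8") as f, \
     open("frwiki.txt", "w", encoding="utf-8") as out:
    in_text = False
    for line in f:
        if "<text" in line:
            in_text = True
        if in_text:
            out.write(re.sub(r"<[^>]+>|\[\[|\]\]|\{\{|\}\}", "", line))
        if "</text>" in line:
            in_text = False
```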
tech support?? that’s going a bit far I think
ok but if everyone would accept me when coming out I think it would be worth it…
kid called EU anticompetitive laws:
The compression ratio a diffusion model would have to achieve to realistically (i.e., not too lossily) store “the training data” would be more valuable than the entirety of the machine learning field right now.
They do not “compress” images.
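The back-of-envelope math makes this obvious. Both figures below are rough public estimates, not exact numbers: Stable Diffusion 1.x weights are around 4 GB, and the LAION-2B-en training set is around 2.3 billion images:

```python
model_bytes = 4e9     # ~4 GB of weights (rough estimate)
train_images = 2.3e9  # ~2.3 billion images in LAION-2B-en (rough estimate)
print(model_bytes / train_images)  # ~1.7 bytes of model capacity per image
```

Not even two bytes per image. Even a heavily compressed thumbnail is thousands of times that, so there is simply no room in the weights to “store” the training set.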
can I interest you in eepy mode Tuesday?
Are you using SDXL? If so, you need to set the resolution to 1024x1024, since that’s what it was trained at.
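With Hugging Face diffusers that just means passing the resolution explicitly; the model id and prompt here are only examples:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# SDXL base was trained at 1024x1024, so request that size explicitly.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

image = pipe("a photo of a cat", width=1024, height=1024).images[0]
image.save("cat.png")
```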
I dunno. Every time this happened to me, it just spat out some invalid link, or, by sheer luck, a valid but completely unrelated one. This probably happens because the model hits its context limit, only sees “poem”, and then tries to predict the token that comes after “poem”, which apparently is some sort of closing note. What I’m trying to argue is that this is just sheer chance; there are only so many permutations of text.
based. catgirls ftw
I choose to press. What did she do wrong?