![](https://lemmy.world/pictrs/image/1bba6c11-24d0-4f37-a37d-2617cbb50d75.png)
Wow, instant nostalgia from that tea tin. In the Netherlands these were definitely a thing in the last century. I don’t know where they came from but everyone had them. I’ve ordered some (they’re pretty cheap).
Can I pet
Great, finally a CAPTCHA that I can run on my electric toothbrush
This cannot possibly be legal in Europe.
deleted by creator
What happened that they screamed at you even before an interview?
How do you know?
Was that related to a company campaign?
What are you referring to?
Yeah I remember a few years ago when there was suddenly some sort of hype around Nutella and Nutella-based memes. Which just so happened to coincide with a major Nutella advertising campaign on other platforms.
People were eating it up and generating content, essentially doing an advertiser’s work for free 🤷‍♀️
They switched off image generation after these issues, so it (correctly) said that it couldn’t generate images at the time.
Ooh, yes! A game without words, but incredibly beautiful.
Thank you! Yay, I just got access :)
Hello fellow developer with ADHD! This sounds perfect, exactly what I’ve been searching for. Most apps were not gamified enough, or quickly devolved into some manipulative microtransaction slot machine.
I’ve signed up as a monthly supporter and am now eagerly awaiting access to the beta :) Can’t wait to try it.
What’s the weather like in False?
Thank you for explaining. I work in NLP and am not familiar with all the CV acronyms. It sounds like it kind of defeats the purpose if it only targets open-source models. But yeah, it makes sense that you would need the actual autoencoder in order to learn how to alter your data such that the autoencoder’s representation of it changes enough.
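The idea described above, perturbing data so that an encoder’s representation of it shifts while the data itself barely changes, can be sketched with projected gradient ascent. This is a minimal NumPy illustration, assuming a toy random linear map `W` standing in for a real autoencoder; all names and values are illustrative, not any particular tool’s implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))    # toy "encoder": a random linear map (stand-in)
x = rng.normal(size=16)         # original data point

eps, step, n_iter = 0.05, 0.01, 50
# random start inside the epsilon-box (the gradient at delta = 0 is zero)
delta = rng.uniform(-eps, eps, size=16)

for _ in range(n_iter):
    # gradient of ||W(x + delta) - W x||^2 with respect to delta
    grad = 2 * W.T @ (W @ delta)
    # ascent step, then project back into the allowed perturbation box
    delta = np.clip(delta + step * np.sign(grad), -eps, eps)

# how far the encoder's representation moved, despite a tiny input change
shift = np.linalg.norm(W @ (x + delta) - W @ x)
```

The key point matches the comment: you need the encoder itself (here `W`) to compute that gradient, which is why this only works against models whose weights you can access.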
Fair enough. They only have to convince the self help books crowd 🙃
Yeah, if you already have it then it’s not really an extra cost. But the smaller models perform less well and less reliably.
In order to write a book that’s convincing enough to fool at least some buyers, I wouldn’t expect a Llama2 7B to do the trick, based on what I see in my work (ML engineer). But even at work, I run Llama2 70B quantized at most, not the full-size one. Full-size unquantized requires about 320 GB of GPU VRAM, and that’s just quite expensive (even more so when you have to rent it from cloud providers).
Although if you already have a GPU that size at home, then of course you can run any LLM you like :)
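The VRAM figures above come down to simple arithmetic: parameter count times bytes per parameter, plus overhead for activations and the KV cache. Here is a back-of-envelope sketch; the function name and the 1.2× overhead factor are my own illustrative assumptions (the 320 GB figure in the comment likely budgets extra headroom for long contexts and serving overhead).

```python
def estimate_vram_gb(n_params_billion, bytes_per_param, overhead=1.2):
    """Rough VRAM estimate in GB: weights * a flat overhead factor
    (assumed 1.2x) for activations and KV cache."""
    return n_params_billion * bytes_per_param * overhead

# Llama2 70B in fp16 (2 bytes/param): roughly 168 GB under this heuristic
fp16_gb = estimate_vram_gb(70, 2)

# the same model 4-bit quantized (0.5 bytes/param): roughly 42 GB
q4_gb = estimate_vram_gb(70, 0.5)
```

Quantizing to 4 bits cuts the weight footprint by about 4× versus fp16, which is why the quantized 70B fits on hardware the full-size model never could.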
In the Netherlands, we have a political party to represent them. It’s called the Partij voor de Dieren (Party for the animals) and they’re a green left wing party.