Some words on my screen will be about as meaningful as a Harry Potter book at best.
Lol, why would I try to convince you when you already stated you can’t be convinced
It’s one thing to just use the software; it’s another to open bug tickets that you expect the maintainer to prioritise. It’s free software, so the maintainer doesn’t owe you anything. If a user wants tickets fixed with high priority, they should work something out with the maintainer.
I get that it’s not the main point of the article, but is she seriously suggesting that someone’s meal choices are a good indicator of whether they’d make a good babysitter?
I mean, this is also a particularly amateurish implementation. In more sophisticated versions, you’d run the user input through a second AI model to check whether it’s trying to do something you don’t want, and similarly check the primary model’s output with a third model.
This requires you to build or fine-tune models for your purposes, however. I suspect this is beyond Gab AI’s skills; otherwise they’d have done some alignment on the GPT model rather than relying solely on a system prompt that the model can be told to ignore.