• 0 Posts
  • 49 Comments
Joined 1 year ago
Cake day: August 26th, 2023


  • The service providers are the ones who dictate the costs; they provide the infrastructure. The cost per transaction is much, much lower for them because of economies of scale: they handle millions of transactions per day across all their clients. Because the volume is so high, they can charge a small percentage fee, and the loss they take on small transactions is made up by the bigger ones.

    Steam, meanwhile, uses normal bank transfers to pay developers, because many of those payouts are in the hundreds of thousands and some in the millions of dollars, so you don’t want a third party taking a percentage fee on them. You’d rather just pay the fixed fee the bank charges per transaction, since that’s cheaper for large transfers. That fee can be $10-$20, especially on international transfers, which is why Steam waits until the balance is above $100. Using a third party to handle those small payouts wouldn’t be worth the hassle either; the percentage fee would be high anyway because of the low volume.
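    The trade-off above can be sketched with a quick calculation. All the rates here are illustrative assumptions (a hypothetical 2.9% + $0.30 processor fee and a $15 flat wire fee), not Steam’s or any bank’s actual pricing:

    ```python
    # Hypothetical fee models -- the rates are illustrative assumptions only.

    def processor_fee(amount, rate=0.029, per_txn=0.30):
        """Third-party processor: small percentage fee plus a tiny fixed cut."""
        return amount * rate + per_txn

    def bank_wire_fee(amount, flat=15.00):
        """Direct bank transfer: flat fee regardless of amount."""
        return flat

    for amount in (5, 100, 1_000, 250_000):
        p, b = processor_fee(amount), bank_wire_fee(amount)
        cheaper = "processor" if p < b else "bank wire"
        print(f"${amount:>9,}: processor ${p:,.2f} vs wire ${b:,.2f} -> {cheaper}")
    ```

    With these made-up numbers, the percentage fee wins on small amounts and the flat wire fee wins once the payout is large, which is the economics the comment describes.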

  • It’s not a bug, just a negative side effect of the algorithm. This is what happens when the LLM doesn’t have enough data points to answer the prompt correctly.

    It can’t be programmed out like a bug; instead, either a human needs to intervene and flag the answer as false, or the LLM needs more training data. The dozens of articles this guy wrote aren’t enough for the LLM to work out that he’s just a reporter. The LLM needs data that explicitly says he is a reporter who reported on those trials, and since no reporter starts their articles with “Hi, I’m John Smith the reporter, and today I’m reporting on…”, that data is missing. LLMs can’t draw that kind of conclusion from context.