  • The UDR and UDM lines are so horrifically poorly designed it’s frankly astonishing they ever left the drawing board. Not only is the shape unwieldy and awkward enough on its own, the thermals on those things are terrible. I used to wake up to no internet every few weeks because my UDM would overheat and be unable to boot until it cooled down. I learned to just stick it in the fridge for a bit so it could boot again.

    The UniFi UCG Ultra and UCG Max are far better machines, capable of doing just as much if not more than the pill-shaped routers, while staying cool and not sounding like a jet engine every time you send a text over wifi. The only thing you even lose is the integrated access point, which sucks compared to the discrete ones anyway.


  • You call it “tedium”; the developers and many classic FO enjoyers called it “immersion”, “living world”, and “fun”.

    Wasteland 2 came out 16 years after Fallout, so naturally they aren’t really peers; its design philosophy is a lot closer to that of modern games, in that it’s more forgiving.

    Wasteland was more of a predecessor to Fallout 1; the developers were big fans of it and thought of Fallout as a spiritual successor to Wasteland. Fallout was also designed to be far more punishing in its early game, with a steeper power curve, and put a heavier focus on the player being a singular, fish-out-of-water character rather than a capable party like in Wasteland. They also wanted to put more pressure on the player, hence other mechanics like the time limit.

    I also faintly recall the creative director of Fallout 1 talking about replaying Wasteland more recently and mentioning that he needed some kind of limiter to play it, because movement and other calculations were tied to CPU speed and/or FPS. So it’s possible Wasteland has a similar issue, though I wouldn’t know as I’ve never played it.
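
    For anyone curious what “tied to CPU and/or FPS” breaks and how it’s usually fixed: the classic remedy is a fixed-timestep loop that decouples simulation from render speed. This is a generic sketch, not Wasteland’s or Fallout’s actual code; every name in it is invented for illustration.

    ```python
    import time

    SIM_DT = 1.0 / 30.0  # simulate at a fixed 30 steps/sec, however fast the CPU is

    def update_world(dt: float) -> None:
        pass  # hypothetical: move parties, advance clocks, roll encounters, scaled by dt

    def render() -> None:
        pass  # hypothetical: draw current state; free to run as fast as the machine allows

    def game_loop() -> None:
        previous = time.perf_counter()
        accumulator = 0.0
        while True:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now
            # Run however many fixed-size simulation steps real time calls for.
            # A faster CPU renders more frames but simulates at the same rate --
            # which is exactly what CPU-bound engines of that era didn't do.
            while accumulator >= SIM_DT:
                update_world(SIM_DT)
                accumulator -= SIM_DT
            render()
    ```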


  • They didn’t intend for it to be based on clock speeds; they were bound by it. Your subjective opinion and personal taste are what made you feel like you got the correct encounter rate, not developer intention, which, as we’ve discussed, would be impossible.

    Like, what I think you don’t get is that it’s ok to prefer an encounter rate lower than what the devs intended. They wanted the world to feel dangerous and hostile, and gave you options to alleviate the encounters through acquirable items and skill point allocation. You prefer the scripted content and want the random encounters to stay out of your way for the most part.

    The old Fallout games were meant to feel punishing, sometimes to an unfair degree. That was the style at the time, and you’d be surprised just how many games were like that. It was a different time. To circle back, that’s why there is in fact so much debate over all these games: people like different things, and the Bethesda games are far, far more forgiving than the originals. That’s why some people like you play the classic games and enjoy the lower encounter rate, while others install restoration mods to restore the higher one.


  • If random encounters go down as CPUs get faster, my CPU is so much faster than one from the 90s that my random encounters should approach zero, but I had plenty.

    I mean, some napkin math and averages would tell me that your base clock speed is roughly 8 times faster than the fastest computers they would have tested on. Is 8 times faster truly enough to bring the random event rate to “near zero”? Probably not. And with an old game like this it’s not as easy as just comparing clock speeds, because it depends on which CPU you have: do you have E-cores? If so, is your computer scheduling the game on those or on your P-cores? And in either case, is it using base clock speed or boost clock speed? How do your drivers fit into all this?
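
    To make the napkin math concrete, here’s the rough arithmetic; both clock figures are my own assumptions, not measurements.

    ```python
    # Napkin math only; both figures are assumed, not measured.
    fastest_90s_clock_mhz = 500    # top of the range the devs plausibly tested on
    modern_base_clock_mhz = 4000   # a typical modern desktop base clock

    speedup = modern_base_clock_mhz / fastest_90s_clock_mhz
    print(f"speedup: ~{speedup:.0f}x")  # ~8x

    # Even if the encounter rate scaled inversely with clock speed (it doesn't --
    # see the next paragraph), 8x faster would mean roughly 1/8th the encounters:
    # noticeably fewer, but nowhere near zero.
    print(f"naive relative rate: {1 / speedup:.3f}")  # 0.125
    ```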

    There’s also the fact that while the encounter rate is tied to CPU speed, it’s not a 1:1 relationship either. The encounter system also factors in tiles crossed and in-game days.
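
    As a purely hypothetical sketch of how those inputs could interact (this is not Fallout’s real encounter code; every number and name is made up):

    ```python
    import random

    def travel(tiles_moved: int, in_game_day: int, checks_per_tile: int) -> int:
        """Hypothetical worldmap travel, NOT Fallout's actual implementation.

        The per-roll chance is fixed, but the number of rolls depends on tiles
        crossed and on how many engine ticks elapse per tile -- and in a
        CPU-bound engine that tick count silently varies with clock speed.
        """
        base_chance = 0.02
        chance = base_chance * (1 + 0.01 * in_game_day)  # later days a bit more dangerous
        encounters = 0
        for _ in range(tiles_moved * checks_per_tile):
            if random.random() < chance:
                encounters += 1
        return encounters

    # Same journey, different tick rates: more ticks per tile means more rolls,
    # so the encounter count moves with CPU behavior, not 1:1 with clock speed.
    print(travel(tiles_moved=20, in_game_day=10, checks_per_tile=1))
    print(travel(tiles_moved=20, in_game_day=10, checks_per_tile=8))
    ```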

    that they built and tested the game on higher end machines than many of their customers had, and that faster CPUs resulted in the correct encounter rate while slower CPUs resulted in dozens.

    Like I’ve already said, they accounted for lower CPU clocks at the time. They designed the encounter rate for clock speeds between 200 MHz and 450–500 MHz, the whole range for the era. You’re also acting like Fallout 1 wasn’t a cheap side project, half made for free by people working outside company hours. It wasn’t some big-budget release. Or as if Fallout 2 wasn’t an incredibly rushed game shoved out the door by a financially failing company.

    I’d sooner believe that the game working differently at different clock rates was an oversight rather than how they intended for it to work.

    It was neither. It was simply an engine limitation they had to account for as best they could, because the first two games were functionally just tabletop RPGs under the hood: they ran on a modified version of GURPS and relied on dice rolls for practically everything. As with anything else in life, they designed around the problems they encountered at the time, not some hypothetical distant-future scenario they’d have no way to predict.
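
    To illustrate the “tabletop RPG under the hood” point, here’s a toy percentile check in that style; the names and thresholds are invented, not taken from the games’ real tables.

    ```python
    import random

    def skill_check(effective_skill: int) -> bool:
        # Tabletop-style resolution: roll 1d100 and compare against a derived stat.
        # Invented for illustration; not the real GURPS/SPECIAL tables.
        return random.randint(1, 100) <= effective_skill

    def avoid_random_encounter(outdoorsman_skill: int) -> bool:
        # In this kind of engine even "does the player get ambushed?" is a dice
        # roll, so anything that changes how often the engine asks the question
        # (like tick rate) changes how the game feels.
        return skill_check(outdoorsman_skill)
    ```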


  • If we ignore the part where that person had so many encounters that they came to the conclusion that something was wrong

    I wouldn’t ignore it at all; in fact, what they might be experiencing is the game’s intended encounter rate, which, as I told you, is much higher than you think it is. A lot of modern Intel CPUs, especially in laptops, have efficiency cores alongside their performance cores, and sometimes have insanely low base clock speeds, as low as 200 MHz. Given the game’s age, it’s very possible it was scheduled on an E-core and also never boosted the clock speed, resulting in the behavior they describe.
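
    If someone wanted to test that theory, one way to rule out E-core scheduling is to pin the game’s process to performance cores. A sketch with psutil below; which logical core indices map to P-cores varies per machine, so the example indices are an assumption.

    ```python
    import psutil

    def pin_to_p_cores(pid: int, p_core_indices: list[int]) -> None:
        """Restrict a process to the given logical cores (Windows/Linux).

        Which indices map to P-cores differs per CPU; check your core layout
        first. The example indices below are an assumption, not a rule.
        """
        psutil.Process(pid).cpu_affinity(p_core_indices)

    # e.g. on many hybrid Intel chips the first logical cores are the P-cores:
    # pin_to_p_cores(game_pid, list(range(8)))
    ```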

    if we ignore the distinct possibility that people remembering a higher encounter rate could have been experiencing that due to their CPU spec not being what the developer intended even in the 90s

    That’s not a possibility. The developers specifically designed the system with the lower-spec systems of the time in mind. They designed it in such a way that the encounter rate would stay reasonable, relative to their ideal rate, on systems with clock speeds as low as 200 MHz (just like our friend above).

    Now, that user will be experiencing more encounters than even the average player in the ’90s, but it still wouldn’t be outside the realm of what the devs intended.