  • I’ve spent some time with the first three, so I can give my opinion on those.

    The FF1 remake is a very different experience than the NES original. That version had a ton of minor bugs that gave it a unique balance. Every subsequent remake, including the pixel remaster, has been an attempt to fix those bugs, add modern QoL features, and then rebalance the game to try to keep the same feel. I think the pixel remaster is a good game, and comes closer to the feel of the original than some other remakes, but it is still a distinctly different experience. I’d characterize it as a different game wearing the same clothes.

    The FF2 remaster, on the other hand, is probably the best way to experience that game. The Famicom original is notoriously unbalanced and player-hostile, but those problems are effectively bypassed by the simple inclusion of two QoL features: a map, and a one-button autobattle. It took decades, but FF2 is finally worth recommending to more than hardcore fans.

    The FF3 remaster is in an odd situation, in that this is the first time a close approximation of the Famicom original is officially available outside of Japan. The DS remake from 2006 is a significantly different game, especially in the first couple of hours. I didn’t play as much of this one as the other two, but I can’t imagine it deviates too much in the later parts of the game. I would guess, though, that the more flexible save mechanics make the notoriously difficult final three dungeons much more manageable, if perhaps more prone to soft-locking.




  • My wife and I had this conversation the other day. Our kid is only two right now, but as we’ve learned, these milestones sneak up on you.

    I used my own life as a guide, and so landed on age eight or so. That’s around the age I remember being able to go to the park or to a friend’s house within the neighbourhood on my own.

    Other questions about how much functionality the phone would have and how much access they would have to it at home are still to be determined.



  • At least last time I donated blood in my country (Canada), you could discreetly indicate “do not use” by applying a different sticker to the bag. This was done in case someone got peer-pressured into donating but didn’t want to reveal something private that would have disqualified them otherwise.




  • You are welcome.

    Pointers do make more sense to me now than two decades ago, mostly owing to being married to a computer scientist. But I always go back to the fact that, for the purposes of my first year programming course, pointers were (probably) unnecessary and thus confusing. I have a hard time understanding things if not given an immediate and tangible use case, and pointers didn’t really help me when most of my programs used a bare few functions and some globally defined variables to solve simple physics problems.

    EDIT: I’ll also say that pointers alone weren’t what sank my interest in programming; they’re just an easily identifiable concept that sticks out as “not making sense.” At around the same time we had the lesson on pointers, our programs were also starting to reach a critical mass of complexity, and the amount of mental work I had to do to follow along became more than I was willing to put into it - it wasn’t “fun” anymore. I only did well on my final project because a friend patiently sat in my dorm room for a few hours and talked me through each step of the program, and then fed me enough vocabulary to convince the TA that I knew what I was doing.



  • I am but one man whose only education in programming was a first year university course in C from almost two decades ago (and thus I am liable to completely botch any explanation of CS concepts and/or may just have faulty memories), but I can offer my own opinion.

    Most basic programming concepts I was taught had easily understood use cases and produced observable effects. There were a lot of concepts analogous to algebra, and functions like printf did things that were concrete and could be immediately evaluated visually.

    Pointers, on the other hand, felt designed purely of and for programming. Instead of directly defining a variable by some real-world concept I was already familiar with, it was a variable defined by a property of another variable, and it took some thinking to even comprehend what that meant. Even reading the Wikipedia page today I’m not sure if I completely understand.
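
    To put that in concrete terms, here is a minimal C sketch of how I would describe it today (hedged, since my C is two decades rusty): a pointer is just a variable whose stored value is the address of another variable, and the * operator reads or writes whatever lives at that address.

    ```c
    #include <stdio.h>

    int main(void)
    {
        double mass = 2.5;   /* an ordinary variable: holds a value          */
        double *p = &mass;   /* a pointer: holds the address of mass         */

        printf("value of mass: %f\n", mass);
        printf("address of mass: %p\n", (void *)p);
        printf("value read through the pointer: %f\n", *p);

        *p = 3.0;            /* writing through the pointer changes mass too */
        printf("mass is now: %f\n", mass);
        return 0;
    }
    ```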

    Pointers also didn’t appear to have an immediate use case. We had been primarily concerned with using the value of a variable to perform basic tasks, but none of those tasks ever required the location of a variable to complete the calculations. We were never offered any functions that used pointers for anything, either before or after, so including them felt like busywork.
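
    In hindsight, the missing use case was something like the sketch below (my own illustration, not anything from that course): a function can only modify its caller’s variable if it is handed that variable’s location, which is exactly what scanf quietly relies on.

    ```c
    #include <stdio.h>

    /* Receives a copy of the value; the caller's variable is untouched. */
    void halve_copy(double x) { x = x / 2.0; }

    /* Receives the variable's location, so it can modify it in place.   */
    void halve(double *x) { *x = *x / 2.0; }

    int main(void)
    {
        double velocity = 10.0;

        halve_copy(velocity);
        printf("after halve_copy: %f\n", velocity); /* still 10.000000 */

        halve(&velocity);
        printf("after halve:      %f\n", velocity); /* now 5.000000    */
        return 0;
    }
    ```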

    It also didn’t help that my professor basically refused to offer any explanation beyond a basic definition. We were just told to arbitrarily include pointers in our work even though they didn’t seem to contribute to anything, and I really resented that fact. We were assured that we would eventually understand if we continued to take programming courses, but that wasn’t much comfort to first year students who just wanted to pass the introductory class they were already in.

    And if what you said is true, that later courses are built on the assumption that one understands the function and usefulness of pointers despite the poor explanations, then it’s no wonder so many people bounce off of computer science at such a low level.


  • I definitely feel this. I had to take a programming course in university and I was easily able to follow along up until the lesson on pointers, whereupon I completely lost the thread and never recovered.

    I’ve known a good number of computer scientists over the years, and the general consensus I got from them is that my story is neither unique nor uncommon.


  • I do not care how local you think the myth of Noah’s Flood was supposed to be, as that fact is immaterial to the point you continue to miss. That flood still would have killed innocent people, and the story frames this as a morally just action. No amount of quibbling over linguistics will change that.

    The amount of excuses needed to ignore the plain implications of a passage is really telling. One could take the Old Testament as it appears: a series of books written and edited (and redacted, and co-opted, and edited again) as the religious and cultural canon in the Iron Age for an otherwise obscure Levantine tribe, with morals from a different time and place unsuited to our modern sensibilities. There are many such books and traditions from all over the world that contain tales just as horrifying as any in the Old Testament, so it would not be without company.

    But the apologist wants us to believe that their ancient stories are actually true, and so they have to invent all these insane reasons why clearly immoral actions by their book’s main character are totally justified. This is the sort of position that can only come about when someone decides what they believe first and then looks for rationale afterwards.


  • You can’t even keep your own stories straight. The Great Flood myth in the Bible is very explicit that all life on earth will be destroyed, except that aboard Noah’s Ark. Genesis 7:23 (NIV):

    “Every living thing on the face of the earth was wiped out; people and animals and the creatures that move along the ground and the birds were wiped from the earth. Only Noah was left, and those with him in the ark.”


  • There is no reason to believe that Noah’s family were the only innocents in the Flood story. I do not know how one can pin the supposed hedonism of the world on all those young children who would have drowned.

    There is also no way to excuse killing the children of thousands of people because of the actions of one man. Blaming that one man for “forcing” a supposedly omnipotent being to act in that way is also unjustifiable.

    And there is no way to shift blame for genocide by simply saying, “the underlings took it too far.” This excuse rings especially hollow when Jehovah asks for a cut of the spoils afterward (Numbers 31:25-31).