• 0 Posts
  • 100 Comments
Joined 2 years ago
Cake day: July 5th, 2023

  • I’ve been trying to research the various glitches and variations between versions because I’m working on something that uses some undocumented features and precise timing. Unfortunately, I don’t have one good link that explains it well.

    The issue stems from how player objects (the 2600 equivalent of sprites) are placed horizontally. For good and interesting reasons that are also quite technically involved, programmers can’t just send an X value to the graphics chip. Instead there’s a two-step process. First, the program sends a signal to the graphics chip when the TV raster is at approximately the desired horizontal position on the screen. Then, because it’s often not possible to nail the timing of that signal to the exact pixel position, the graphics chip has a facility to “jog” the various graphical objects left or right by a very small amount at a time.

    According to the official programmers’ documentation, this final “jog” should only be done at one specific time during each video scanline. If we only do it this way, it works correctly on pretty much every version of the console. However, doing it “correctly” also introduces a short black line at the left side of that scanline. If we instead send the “jog” signal at certain other times, no black line appears. Additionally, the exact distances moved change depending on when we send the signal, which can be worked around and is sometimes even beneficial.
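
    If it helps to picture the two-step process, here’s a toy sketch (TypeScript, my own simplification; it assumes the well-known ratio of three pixels per CPU cycle and ignores the hardware’s fixed offsets):

        // Step 1: the "coarse" strobe can only happen on a CPU-cycle boundary,
        // so on its own it can only place an object on every third pixel.
        // Step 2: the "jog" then shifts the object by -8..+7 pixels.
        function finalPosition(strobeCycle: number, jog: number): number {
          if (jog < -8 || jog > 7) throw new RangeError("jog must be -8..+7");
          const coarse = strobeCycle * 3; // pixel reached by the strobe alone
          return coarse + jog;
        }

        // Example: we want the object at pixel 100. The nearest strobe gives 99,
        // so a one-pixel jog finishes the job.
        console.log(finalPosition(33, 1)); // 100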

    Kool-Aid Man uses these undocumented “jog” timings, as several games did. But it also displays a score counter at the top of the screen by placing the player objects very close together. It seems that the console versions in question (later 2600 Juniors and some 7800s) are more sensitive to the exact timing used, as you can sometimes see parts of the score flickering left or right by one pixel.

    The Atari 2600 also has a hardware collision detection system, which reports when any two moving screen objects overlap with each other or the background. Once a collision occurs, the relevant flag will stay set until the program clears it. Kool-Aid Man uses this system to detect when the player character touches enemies. But the program only clears the collision flags once, at the bottom of each frame, and the same player objects are used to draw the score. So when the two parts of the score flicker into each other, it registers as a collision between player objects, which the game interprets as a collision between Kool-Aid Man and a Thirsty.
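
    Here’s a toy model of that interaction (TypeScript, my own simplification, nothing like the game’s actual code), just to show why a “set until cleared” flag plus a single clear per frame produces the false hit:

        // The hardware sets a latched flag on any overlap and never clears it
        // by itself; the program has to clear it explicitly.
        class CollisionLatch {
          private playersTouched = false;
          report(overlapping: boolean): void {
            if (overlapping) this.playersTouched = true;
          }
          read(): boolean { return this.playersTouched; }
          clear(): void { this.playersTouched = false; }
        }

        const tia = new CollisionLatch();

        // Top of frame: the score digits (drawn with the player objects)
        // occasionally jitter into each other on the affected consoles.
        tia.report(true);

        // Gameplay rows: Kool-Aid Man never actually touches a Thirsty...
        tia.report(false);

        // Bottom of frame: ...but the one-and-only check still sees the stale flag.
        if (tia.read()) console.log("Registered as a hit (false positive)");
        tia.clear(); // flags only cleared here, once per frame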

    As you mentioned, I’ve read that setting the console switches a certain way can prevent this issue, but I’m not sure why. My guess is that setting some switches one way rather than another causes a conditional branch instruction that checks the switches to branch rather than fall through (or vice versa), which takes one extra CPU cycle, and that’s enough to stabilize the score display and stop the parts from colliding.


  • There’s a… not exactly a bug, but an unannounced change, in the graphics chip in some later versions of the Atari 2600, which has been named after this game by the fan/homebrew community. On most 2600 console versions, it’s possible for a game to perform a particular graphics operation at an unintended time and get an undocumented but consistent and useful result.

    On the differing consoles, the result is slightly different, and because of the way this game is written, it often causes a chain of actions that end up making Kool-Aid Man bounce around continuously as if being hit by enemies, even though nothing is touching him.



  • I once saw a documentary about Bedouin tribes that were dying out. The problem was very simple, from the outside; they were killing virtually all of their female children.

    The team interviewed an elder of one tribe, asking him about this practice. As expected, the elder said that parents wanted sons to continue their family names.

    “If no-one in the tribe has any daughters, where will these sons find wives?” asked the interviewer. The elder confidently replied without hesitation, “They will get wives from other tribes.” “But what if the other tribes kill their female babies just like your tribe does?” the interviewer persisted (In fact, they had met people from several tribes, and indeed they all followed this terrifying practice). The elder looked at the interviewer like he was a slow child. “They will get wives from other tribes.”


    An “NPM install” error isn’t going to be the direct result of a race condition in JavaScript. And while I’m not familiar with Python, I’d guess that an “Indentation error” wouldn’t be one either. A missing library or syntax error that’s only discovered by executing a particular branch is still just a missing library or syntax error, not a race condition.
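
    For reference, an actual race condition in JavaScript looks more like this (a contrived TypeScript sketch of my own):

        // Two async callbacks write to the same variable; which one "wins"
        // depends on timing, not on the order they appear in the source.
        let status = "unknown";

        function setAfter(ms: number, value: string): void {
          setTimeout(() => { status = value; }, ms);
        }

        setAfter(Math.random() * 10, "ready");
        setAfter(Math.random() * 10, "failed");

        // Sometimes prints "ready", sometimes "failed" -- run it a few times.
        setTimeout(() => console.log(status), 50);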

    Also, while Node.js is popular, it isn’t an integral part of JavaScript in the way that the other errors are integral to their respective languages.


  • For 2, one of the few pieces of Windows software that I haven’t been able to replace in Linux is GetRight. Many HTTP servers support downloads starting at an offset from the beginning of the file, and GetRight uses that to allow download pausing and resumption.
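
    Under the hood it’s just the HTTP “Range” header. A minimal sketch of the idea in TypeScript for Node 18+ (my own example; it has nothing to do with how GetRight itself is implemented):

        import { appendFileSync, existsSync, statSync } from "node:fs";

        // Resume an interrupted download by asking the server only for the
        // bytes we don't already have (the server must support Range requests).
        async function resumeDownload(url: string, path: string): Promise<void> {
          const offset = existsSync(path) ? statSync(path).size : 0;
          const res = await fetch(url, { headers: { Range: `bytes=${offset}-` } });

          // 206 Partial Content means the server honoured the Range header;
          // a plain 200 means it started over, so appending would corrupt the file.
          if (offset > 0 && res.status !== 206) {
            throw new Error("Server doesn't support resuming this download");
          }

          // Fine for a sketch; a real tool would stream instead of buffering it all.
          appendFileSync(path, Buffer.from(await res.arrayBuffer()));
        }

        // Hypothetical usage:
        // await resumeDownload("https://example.com/big.iso", "big.iso");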

    It was a real life saver back when I had an extremely flaky Internet connection.

    EDIT: Thanks for all the suggestions, I’ll definitely take a look at them. Simply resuming downloads is why I initially started using GetRight, but it also came with a bunch of other useful tools that I came to rely on. While I’ve been able to replicate some of the basic functionality with individual browser plugins or programs, I haven’t seen anything that integrates it all so well, with such a smooth interface. I haven’t looked for a long time, though, so maybe one of your suggestions will be the one!




  • Before everyone had Internet at home? Well, there were bulletin boards, but even without those? Yeah, swapping floppies was how they got around. I got hit a few times as a teen, but the worst one actually came from a legitimate copy of a game I bought secondhand. It got into the boot sector and I had to nuke the HDD from orbit to get rid of that one. I’m just glad that software BIOS updates weren’t a thing yet.




  • I recently wasted multiple evenings going through this with my partner’s photos on both OneDrive and Google. It was a nightmare, trying to disentangle their systems from the cloud, and delete stuff from the cloud (they were hitting the free quotas, which was causing problems) without also deleting that content locally.

    I ended up doing a full backup from the cloud to an external drive and unplugging it just to be sure, then carefully using the awful web interfaces to delete a bunch of photos and videos from the cloud after deactivating all the auto-backup “options”, which is apparently the only way to do it without also wiping your local media. There doesn’t seem to be any way to do it while using the “service” normally on the device; any attempt to delete from the cloud will also delete your local copy.

    People have called me paranoid for seeking out and removing/deactivating these “services” with extreme prejudice on my own devices, but this experience was even worse than I’d imagined.


  • Assembler, BASIC, Old C code, Cobol…

    …Pascal, Fortran, Prolog, Lisp, Modern C code, PHP, Java, Python, C++, Lua, JavaScript, C#, Rust…

    The list is infinite.

    Show me a language in which it is impossible to write spaghetti code, and I’ll show you someone who can’t recognize spaghetti code when it’s written in one of their favourite languages.




  • “In order to make the game small enough to fit on a cassette tape they had to ditch BASIC and program the entire game world in assembly.”

    Putting aside the fact that the majority of commercial games of the time were written in assembly (or other low-level languages) just as a matter of course, I strongly suspect that programming the game in assembly was an execution speed issue, and not a cassette space issue. Regular audio cassettes easily held enough data to fill an average 8-bit home computer’s memory many times over, whether that data was machine code or BASIC instruction codes.
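
    A quick back-of-envelope supports this (my numbers, so treat them as ballpark only):

        // Rough cassette capacity, assuming ~1500 baud (roughly the ZX
        // Spectrum's standard tape rate) and ~10 tape bits per stored byte.
        // Ignores pilot tones, block headers and gaps, so it's optimistic.
        const baud = 1500;
        const bytesPerSecond = baud / 10;   // ~150 B/s
        const secondsPerSide = 30 * 60;     // one side of a C60 cassette
        const kbPerSide = (bytesPerSecond * secondsPerSide) / 1024;
        console.log(`${kbPerSide.toFixed(0)} KB per side`); // ~264 KB vs. 48-64 KB of RAM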


  • MDN is great, especially for finding current best practice, but I’ve always found their material much more useful for reference once I’m already familiar with the general usage of whatever I’m trying to use. I often find it difficult to get to grips with something new just with MDN.

    I usually go read W3Schools first. Much of it is a bit out of date, but not so much that it’s useless, and I find the tutorials much easier to digest. Once I’m comfortable with the basics, I switch to MDN to get up to speed on current best practice.

    And OP, it sounds like you’re already wary of this, but don’t let yourself be tricked into using a hodge-podge of libraries for every little thing. A lot of JS programmers will tell you that you “need” some library or other to provide a function that you can often replicate with just two or three lines of raw JS, if you know what you’re doing.
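
    To give one concrete (made-up) example of what I mean, the sort of helper people often install a whole utility library for, like debounce, is only a few lines of plain JS/TS:

        // Call fn only after the caller has gone quiet for `ms` milliseconds.
        function debounce<T extends unknown[]>(fn: (...args: T) => void, ms: number) {
          let timer: ReturnType<typeof setTimeout> | undefined;
          return (...args: T): void => {
            clearTimeout(timer);
            timer = setTimeout(() => fn(...args), ms);
          };
        }

        // Usage: only run the search once the user has stopped typing for 300 ms.
        const search = debounce((query: string) => console.log("searching for", query), 300);
        search("ko"); search("kool"); search("kool-aid"); // only the last one fires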

    I think the JS library addiction stems from the bad old days of browser incompatibility, when almost everything had to be wrapped in layers of complex compatibility shims.



  • “Maybe instead of usernames, the instances could store/trade… salted hashes of the usernames where the salt is the title or unique identifier of the post/comment being voted on?”

    I didn’t have time to reply earlier, but I was thinking the same thing, except with the extra step of replacing the username with a unique user identifier randomly generated at signup by the user’s instance and kept secret.
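
    Roughly what I had in mind, as a TypeScript sketch (all the identifiers here are made up):

        import { createHash, randomBytes } from "node:crypto";

        // Generated once at signup by the user's home instance and never shared.
        const secretUserId = randomBytes(32).toString("hex");

        // What instances could exchange instead of a username: a token that's
        // stable for one (user, post) pair but can't be traced back to the account.
        function voteToken(postId: string): string {
          return createHash("sha256")
            .update(postId)        // the "salt": the post/comment being voted on
            .update(secretUserId)  // the secret ID standing in for the username
            .digest("hex");
        }

        console.log(voteToken("post-12345")); // hypothetical post identifier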

    I wonder if there’s a way to prevent people from even knowing that two different votes came from the same user.



    I think it depends a lot on a person’s individual knowledge. If you keep studying further and further beyond your main area of expertise, there’ll still be some point where you stop and have to blindly accept that something “just works”, but it will no longer feel like that’s what your main field is based upon.

    Imagine a chef. You can be an OK chef just by memorizing facts and getting a “feel” for how recipes work. Many chefs study chemistry to better understand how various cooking/baking processes work. A few might even get into the physics underlying the chemical reactions just to satisfy their curiosity. But you don’t need to go all the way down to subatomic particles before cooking stops feeling like it’s based on mysterious unknowns.

    For my personal interest, I’ve learned about compilers, machine code, microcode and CPU design, down to transistor-based logic. Most of this isn’t directly applicable to modern programming, and my knowledge still ends at a certain point, but programming itself no longer feels like it’s built on a mystery.

    I don’t recommend that every programmer go to this extreme, but we don’t have to feel that our work is based on “magic smoke” if we really don’t want to.

    ADDED: If anyone’s curious, I highly recommend Ben Eater’s YouTube videos about “Building an 8-bit breadboard computer!” It’s a playlist/course that covers pretty much everything starting from an overview of oscillators and logic gates, and ending with a simple but functional computer, including a CPU core built out of discrete components. He uses a lot of ICs, but he usually explains what circuits they contain, in isolation, before he adds them to the CPU. He does a great job of covering the important points, and tying them together well.