In a computer science curriculum, when do you learn the fundamentals of high-level languages and the methods a compiler uses to translate them into assembly instructions? What is the primary book used for that course? For example, if you’re using IDA or Ghidra and trying to piece together what is happening during binary execution, I want to know what structures to look for at this basic level.

I’m asking about the simple stuff you find in Arduino sketches: variables, type declarations, branching, looping, booleans, flags, interrupts, and so on. I’d also like to know how these might differ across architectures, e.g. CISC vs. RISC, Harvard vs. von Neumann, and platform specifics like unique instruction set architecture implementations.
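To show the level I mean, here’s a toy C++ function (my own sketch, not from any textbook) with comments on the assembly patterns these constructs usually turn into; the exact instructions depend on the compiler, the optimization level, and the ISA:

```cpp
// Toy example: the constructs named above, annotated with the patterns
// a compiler typically emits for them. Details vary widely by compiler,
// optimization level, and instruction set.
int classify(int x) {
    int count = 0;          // local variable: usually a register or stack slot
    if (x < 0) {            // branch: a compare followed by a conditional jump,
        return -1;          //   e.g. cmp + jl on x86, or a branch-if-less on RISC
    }
    while (x > 0) {         // loop: look for a backward jump up to a label;
        x >>= 1;            //   that back-edge is the telltale sign of a loop
        ++count;            //   in a disassembler's control-flow graph
    }
    return count;           // return value placed in the ABI's return register
}
```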

I have several microcontrollers running FlashForth’s threaded interpreter. I never learned to branch and loop in FlashForth the way I can in Bash, Arduino, or Python, and I hope exploring this topic will help me fill that gap using the good ol’ hacker’s chainsaw. If any of you can read between the lines of this inquiry and make inferences that might be helpful, please show me the shortcuts. I am a deeply intuitive learner who needs to build from a foundation of application rather than memorization or theory. TIA

  • solrize@lemmy.world · 8 months ago

    For reverse engineering you’ll probably have to study some compiler assembly output, since the methods for implementing things like C++ vtables can be a bit intricate. There are also some books you can read; this bundle has one that I haven’t looked at:

    https://www.fanatical.com/en/bundle/effective-cybersecurity-prevention-bundle
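    As a rough sketch of why vtables get intricate, here’s a toy C++ hierarchy (my own example, not from that book) with comments on what you’d typically see in the disassembly; the exact layout is ABI- and compiler-specific:

    ```cpp
    // Toy example of what generates vtable machinery. In the disassembly
    // you'd typically see the constructor store a vtable address into the
    // object's first word, and the call in measure() become an indirect
    // call: load the vptr from the object, load a function pointer at a
    // fixed offset in the table, then call through that register.
    struct Shape {
        virtual ~Shape() {}
        virtual int area() const = 0;
    };

    struct Square : Shape {
        int side;
        explicit Square(int s) : side(s) {}
        int area() const override { return side * side; }  // a slot in Square's vtable
    };

    int measure(const Shape* s) {
        return s->area();   // indirect call through the vtable, not a direct call
    }
    ```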

    There are decompilation tools that can recognize some of that stuff automatically too (idk if Ghidra does that).

    Handwritten asm code will generally not look like compiled code. The Forth interpreters you are looking at are probably a good place to start.
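    And since FlashForth came up: here’s a minimal toy model in C++ (my own sketch, not FlashForth’s actual implementation) of a threaded-code inner loop. Real Forths do this fetch-and-dispatch step in a few assembly instructions, conventionally called NEXT, with the Forth instruction pointer held in a dedicated register; spotting that pattern in the disassembly is usually the key to reading a Forth system.

    ```cpp
    #include <cstdio>

    // Toy threaded-code interpreter: each "word" is a pointer to a C++
    // function, and a compiled thread is an array of such pointers.
    using Word = void (*)();

    static const Word* ip;      // Forth's instruction pointer (not the CPU's)
    static int stack[16];
    static int sp = -1;
    static bool running = true;

    static void lit5() { stack[++sp] = 5; }                  // push the literal 5
    static void dup_() { stack[sp + 1] = stack[sp]; ++sp; }  // DUP
    static void add_() { stack[sp - 1] += stack[sp]; --sp; } // +
    static void dot()  { std::printf("%d\n", stack[sp--]); } // . (print top)
    static void bye()  { running = false; }                  // BYE

    int main() {
        // The "compiled" thread for: 5 DUP + .   (prints 10)
        static const Word thread[] = { lit5, dup_, add_, dot, bye };
        ip = thread;
        while (running) {
            Word w = *ip++;   // fetch the next word's address and advance ip --
            w();              //   this fetch-and-execute pair is NEXT
        }
    }
    ```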