compiler – How do binary numbers interact with the CPU and cause some action to take place?

EDIT: Perhaps what I am misunderstanding is what is meant when it is said that the code we type gets turned into machine code of 0s and 1s. If these 0s and 1s are an abstracted representation of electrical states, then I actually don’t have a question (it is just amazing that a compiler can take English-like text and turn it into a form that can drive the processor). But if the compiler turns the code into a file that contains literal 0s and 1s, then I don’t understand the transformation steps that occur between this file of literal 0s and 1s and the execution of the program (I understand how programs already in RAM are executed).
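For what it’s worth, here is a hedged sketch of what “a file of 0s and 1s” actually means (the six instruction bytes below are real x86-64 encodings of `mov eax, 42; ret`, but the point holds for any architecture): the compiler’s output is a file of bytes, each byte being eight bits, and those bits are the abstraction over the storage hardware’s physical states:

```python
# A compiled program is a file of bytes, not the text characters '0' and '1'.
# These six bytes are the x86-64 machine-code encoding of "mov eax, 42; ret".
machine_code = bytes([0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3])

# Each byte is eight bits; printing them as bit strings shows the
# "0s and 1s" the compiler actually produced. On disk or in RAM, each
# of these bits is represented by a physical state (charge, voltage).
for byte in machine_code:
    print(format(byte, "08b"))
```

So there is never a step where textual zeros and ones have to be “converted” into electricity; the bytes the compiler writes are already stored as physical states, and loading the program just copies those states into RAM where the CPU fetches them.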

I have been researching an answer to this question all day. I have read a lot about the computer’s hardware and understand concepts like the clock, the different states the machine “steps” through to execute a program, and (at a basic level) the compiler. I also know that the bits of each part of an instruction are wired in a certain way so that the instruction does what it is supposed to, and that the compiler takes the code we write and converts it to binary. But my question is this: how does the processor “understand” this binary (machine-readable) code?
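To make the “wired in a certain way” part concrete, here is a toy sketch of a hypothetical 8-bit instruction format (invented for illustration, not any real ISA): fixed bit fields in the instruction word select the operation. In a real CPU this selection is done by decode circuitry (gates fed by those bit lines), which the dictionary lookup below merely stands in for:

```python
# Toy decoder for a hypothetical 8-bit instruction word:
#   top 2 bits  = opcode (which operation to perform)
#   low 6 bits  = operand (e.g. a register number or small constant)
def decode(instr):
    opcode = (instr >> 6) & 0b11      # extract the top 2 bits
    operand = instr & 0b111111        # extract the low 6 bits
    # In hardware, each opcode pattern activates different circuitry;
    # here we model that wiring as a simple table.
    ops = {0b00: "LOAD", 0b01: "ADD", 0b10: "STORE", 0b11: "JUMP"}
    return ops[opcode], operand

print(decode(0b01000101))  # bits 01|000101 -> ('ADD', 5)
```

The point of the sketch: the processor never “interprets” the bits in any intelligent sense. The bit pattern itself, as voltages on wires, switches transistors that route signals to the adder, the memory unit, and so on.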

Obviously there are no literal 0s and 1s inside the computer; 0 and 1 are themselves an abstraction representing the presence or absence of electricity. So how does a 0 go in one end of the computer and a bit being off come out the other? I can see how this works for input from the keyboard, because I understand to some degree how peripherals work and how their input gets stored (the wiring underneath the keyboard accomplishes this as an extension of the wiring that accomplishes everything else the computer does). But the compiler, and the other (seemingly missing) piece of the puzzle that moves the computer to action, still seem like a black box to me. Thanks so much in advance; I always appreciate any responses.

One last time, just to be super clear: I understand that the compiler takes our code and converts it to 0s and 1s. That’s great, but how does a 0 get interpreted, and by what component and/or process, so that this abstraction (the symbol 0) causes one of the computer’s bits to end up in a different state?

Thanks so much once again. I have searched for hours, on this forum and elsewhere, and I can’t seem to find an answer that moves past “yeah, the compiler converts to 0s and 1s and those do some stuff”. The how is what I am after.