algorithms – What are the limits of Boolean Algebra?

Any algorithm for a decision problem on fixed-size inputs can be represented as a boolean expression over the input bits. The rules of boolean algebra (De Morgan’s laws, distributivity, etc.) can then be used to manipulate and simplify that expression, much like in ordinary algebra.
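As a concrete illustration of such a manipulation, De Morgan’s law can be checked exhaustively over all truth assignments (a small sketch; `equivalent` is just a helper name I made up):

```python
from itertools import product

def equivalent(f, g, nvars):
    """Check two boolean functions agree on every truth assignment."""
    return all(f(*vals) == g(*vals)
               for vals in product([False, True], repeat=nvars))

# De Morgan's law: NOT (a AND b) == (NOT a) OR (NOT b)
lhs = lambda a, b: not (a and b)
rhs = lambda a, b: (not a) or (not b)
print(equivalent(lhs, rhs, 2))  # True
```

This kind of exhaustive check is what justifies treating the two forms as interchangeable during simplification.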

  1. Can any boolean expression be reduced to its simplest form (i.e., the form with the fewest NOT, OR, and AND symbols) solely through these manipulations?

  2. If the answer is yes, why wouldn’t that be a foolproof way to find the most efficient form of any decision problem algorithm? For example, take some naive, slow algorithm for deciding primality of 64-bit numbers. Why couldn’t you just keep applying these manipulations to its boolean expression until you reach what you know is the most efficient form?
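To make item 2 concrete, here is the kind of naive algorithm I have in mind (plain trial division; `is_prime_naive` is just an illustrative name). In principle, for a fixed input width like 64 bits, this whole computation could be unrolled into one enormous boolean expression over the input bits:

```python
def is_prime_naive(n):
    """Deliberately naive primality test by trial division."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

print(is_prime_naive(97))  # True
print(is_prime_naive(91))  # False (91 = 7 * 13)
```

The question is whether repeatedly simplifying the corresponding boolean expression would necessarily lead to the most efficient possible primality circuit.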

I apologize if my question is confusing or any of my premises are wrong; I’m just an outsider to the field.