I am currently working on a constraint programming problem that I find difficult to model.
Here is the definition of the problem:
Given a specification of a Boolean function f(x1, …, xn) in the form of a truth table, find a NOR circuit (i.e., a circuit using only NOR gates) that meets the specification and minimizes the depth (and, in the case of a tie on depth, has minimal size).
To simplify the problem, various assumptions are made:
• Only NOR gates with 2 inputs and 1 output can be used: more general NOR gates with more inputs are not allowed (i.e., the fan-in of NOR gates is always 2).
• The output of a NOR gate can only be the input of a single gate: outputs cannot serve as inputs to more than one gate (i.e., the fan-out of NOR gates is always 1).
• In addition to the input signals of the Boolean function to be implemented, constant 0 signals can also be used as inputs to NOR gates. Constant 0 signals, as well as input signals, can be used as often as needed as inputs to NOR gates. On the other hand, the circuit does not need to use all input signals; similarly, the constant 0 signal does not need to be used if it is not required.
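To make the objective concrete, here is a small sketch in Python of how I currently think about a candidate solution (the representation is my own, not part of the problem statement): since the fan-out of every gate is 1, a circuit is a binary tree, and one can evaluate it against the truth table and compute its depth and size directly.

```python
from itertools import product

# A candidate circuit is a binary tree (fan-out 1 means every gate feeds
# at most one other gate). Leaves are either an input index (int) or the
# constant 0 (the string "ZERO"); internal nodes are ("NOR", left, right).
# These names are illustrative, not part of the original problem.

def evaluate(node, assignment):
    """Evaluate the circuit under a 0/1 assignment to the inputs."""
    if node == "ZERO":
        return 0
    if isinstance(node, int):
        return assignment[node]
    _, left, right = node
    return 1 - (evaluate(left, assignment) | evaluate(right, assignment))

def depth(node):
    """Depth = longest path of NOR gates from an input to the output."""
    if node == "ZERO" or isinstance(node, int):
        return 0
    return 1 + max(depth(node[1]), depth(node[2]))

def size(node):
    """Size = total number of NOR gates in the circuit."""
    if node == "ZERO" or isinstance(node, int):
        return 0
    return 1 + size(node[1]) + size(node[2])

def matches(node, truth_table, n):
    """Check the circuit against a truth table given as a dict mapping
    input tuples to 0/1."""
    return all(evaluate(node, bits) == truth_table[bits]
               for bits in product((0, 1), repeat=n))

# Example: OR(x0, x1) built from NOR gates as NOR(NOR(x0, x1), 0)
or_circuit = ("NOR", ("NOR", 0, 1), "ZERO")
or_table = {(a, b): a | b for a in (0, 1) for b in (0, 1)}
print(matches(or_circuit, or_table, 2), depth(or_circuit), size(or_circuit))
# → True 2 2
```

This is only a checker, of course; the difficulty I have is turning the tree structure into decision variables rather than verifying a given tree.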
Since this is the first time I have encountered such a problem, I find it difficult to come up with "good" decision variables. Maybe I am missing something trivial …
Do you have any ideas or hints on how to model this problem?