unity – How to evaluate a binary expression tree in hlsl without recursion or a stack

I’m currently working on a dual contouring implementation for which I want to create procedural terrain based on layers of noise. Both the terrain generation and the mesh creation via dual contouring run on the GPU in compute shaders.

To configure the terrain generation, I previously edited the specific compute shader’s source code itself to generate different layers of FBM noise and combine them with various CSG operations (e.g. union, intersection, difference) to arrive at the final terrain. But this offers very little flexibility; for example, I cannot change the terrain generation at runtime.

So, to get more flexibility in configuring the terrain generation, I’ve started implementing a graph tool (similar to ShaderLab) using XNode:
[Screenshot of the noise graph editor]
Red (leaf) nodes are operands, grey (internal) nodes are operators (either binary or unary), and the green (root) node is simply the output node. On the GPU side, each operator (internal node) corresponds to a function, e.g. a function that computes the output of the noise node.

The idea is that I can visually create a binary expression tree (with additional unary operators) and upload it to the GPU using a StructuredBuffer&lt;NoiseGraphNode&gt;, where NoiseGraphNode is

struct NoiseGraphNode
{
    uint leftNodeIndex;
    uint rightNodeIndex;
    uint nodeType;
    uint dataIndex;         // Index used in conjunction with nodeType 
                            // to access a node's data located in a 
                            // nodeType-specific StructuredBuffer, e.g. 
                            // the noise parameters in case of a noise node.
};

and then have the compute shader evaluate that graph to generate the noise that represents the final terrain. Normally such a tree would be evaluated recursively, something like:

evaluate(node) {
     if(node has children){
          left_val = evaluate(node->left);
          right_val = evaluate(node->right);

          // find operation symbol for node and use it as
          // val = left_val operation right_val

          return val;
     }
     else {
          return node_value;
     }
}

Pseudo-code taken from https://stackoverflow.com/questions/10769174/evaluating-expression-trees.
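For reference, here is a minimal runnable CPU-side version of that recursive scheme in C++ (a sketch for illustration only; the `Node`, `Op::Add`/`Op::Mul` names are placeholders of my own, not the HLSL `NoiseGraphNode`):

```cpp
#include <cassert>
#include <memory>

// Hypothetical operator tags; a real graph would have noise, CSG ops, etc.
enum class Op { Value, Add, Mul };

struct Node {
    Op op = Op::Value;
    float value = 0.0f;            // payload, only meaningful for leaf nodes
    std::unique_ptr<Node> left;    // null for leaves
    std::unique_ptr<Node> right;
};

float evaluate(const Node& node) {
    if (node.left && node.right) {             // internal node: recurse into children first
        float l = evaluate(*node.left);
        float r = evaluate(*node.right);
        switch (node.op) {                     // combine child results with the node's operator
            case Op::Add: return l + r;
            case Op::Mul: return l * r;
            default:      return 0.0f;
        }
    }
    return node.value;                         // leaf node: just return its value
}
```

This is exactly the pattern HLSL rules out: `evaluate` calls itself, so the call depth (and therefore the stack usage) depends on runtime data.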

HLSL doesn’t support recursion, though! Another option would be to emulate the recursive implementation with a stack and a while loop (after all, recursion leverages the call stack). But creating a stack structure in HLSL like this

struct NoiseGraphStack
{
    uint4 buffer[20];
    uint count;

    void Push(uint number)
    {
        buffer[count >> 2][count & 3] = number;     // Doesn't work because an array reference 
                                                    // cannot be used as an l-value.
        count++;
    }

    uint Pop()
    {
        count--;

        return buffer[count >> 2][count & 3];
    }

    static NoiseGraphStack Create()
    {
        NoiseGraphStack stack;
        stack.count = 0;

        return stack;
    }
};

doesn’t work either, because it forces the compiler to unroll the while loop, which isn’t possible when the loop’s trip count depends on runtime data.

Note: The exact error message referred to in the code above is “array reference cannot be used as an l-value; not natively addressable, forcing loop to unroll.”

So, is it possible to evaluate a binary expression tree without recursion or a stack? Could I perhaps preprocess the necessary evaluation steps on the CPU (where recursion works just fine) and linearize them before sending them to the GPU?
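To make the idea concrete, here is one possible sketch of such a preprocessing step, written in C++ for illustration (not Unity C# or HLSL; the `FlatNode`, `Op` and `kInvalid` names are my own placeholders mirroring the `NoiseGraphNode` layout): the CPU flattens the tree into post order once, so children always precede their parent, and evaluation then becomes a plain loop that writes each node’s result into a per-node results array instead of a stack.

```cpp
#include <cassert>
#include <vector>

// Sentinel marking a missing child (i.e. a leaf node).
constexpr unsigned kInvalid = 0xFFFFFFFFu;

// Hypothetical operator tags; a real graph would have noise, CSG ops, etc.
enum class Op { Value, Add, Mul };

// Flat node mirroring the NoiseGraphNode idea: child fields are indices
// into the same array; 'value' stands in for the dataIndex payload.
struct FlatNode {
    unsigned left = kInvalid;
    unsigned right = kInvalid;
    Op op = Op::Value;
    float value = 0.0f;
};

// CPU-side preprocessing: recursion is fine here. Emits node indices in
// post order, so every child appears before its parent.
void linearize(const std::vector<FlatNode>& nodes, unsigned index,
               std::vector<unsigned>& order) {
    const FlatNode& n = nodes[index];
    if (n.left != kInvalid)  linearize(nodes, n.left, order);
    if (n.right != kInvalid) linearize(nodes, n.right, order);
    order.push_back(index);
}

// Stack-free, recursion-free evaluation of the linearized order:
// results[i] holds node i's value, and because of the post order,
// both children are already computed when their parent comes up.
float evaluateFlat(const std::vector<FlatNode>& nodes,
                   const std::vector<unsigned>& order) {
    std::vector<float> results(nodes.size(), 0.0f);
    for (unsigned i : order) {
        const FlatNode& n = nodes[i];
        if (n.left == kInvalid) { results[i] = n.value; continue; }
        float l = results[n.left];
        float r = results[n.right];
        results[i] = (n.op == Op::Add) ? l + r : l * r;
    }
    return results[order.back()];   // the last post-order entry is the root
}
```

The loop in `evaluateFlat` should map to a plain HLSL `for` loop over a StructuredBuffer plus a fixed-size local results array, since it only ever reads slots that were already written, though whether the driver handles the dynamically indexed local array gracefully would still need to be verified.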