Every program in a high-level ("industrial") language can be viewed as a kind of Turing machine. I suspect there is a universal algorithm for this conversion (for example, one could take the Cartesian product of the domains of all the variables, and the resulting space could serve as the state space of the Turing machine, though handling machine-representable floats may be tricky). Is there such a general algorithm or system? https://github.com/Meyermagic/Turing-Machine-Compiler is an example of a programming language for Turing machines together with a transpiler that translates C programs into it, and a kind of Turing-machine assembly language can be found at https://web.stanford.edu/class/archive/cs/cs103/cs103.1132/lectures/19/Small19.pdf.

But what about the other direction? Can a Turing machine be rewritten as a program in a high-level programming language that uses functions, function composition, and higher-order functions?
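To make the "other direction" concrete, here is a minimal sketch (all names are my own, not from any of the cited projects) of a Turing machine expressed in a high-level language using only data, functions, and a higher-order driver loop:

```python
def make_tm(delta, accept):
    """Build a step function (a closure) from a transition table.

    delta: dict mapping (state, symbol) -> (state, symbol, move),
    where move is -1 (left) or +1 (right); '_' is the blank symbol.
    """
    def step(config):
        state, tape, head = config
        symbol = tape.get(head, '_')
        if state in accept or (state, symbol) not in delta:
            return None  # halt
        new_state, write, move = delta[(state, symbol)]
        new_tape = dict(tape)  # functional update: copy, don't mutate
        new_tape[head] = write
        return (new_state, new_tape, head + move)
    return step

def run(step, config, max_steps=1000):
    """Higher-order driver: iterate a step function until it halts."""
    for _ in range(max_steps):
        nxt = step(config)
        if nxt is None:
            return config
        config = nxt
    raise RuntimeError("did not halt within max_steps")

# Example: a machine that flips 0s and 1s until it reads a blank.
delta = {
    ('q0', '0'): ('q0', '1', +1),
    ('q0', '1'): ('q0', '0', +1),
    ('q0', '_'): ('halt', '_', 0),
}
tm = make_tm(delta, {'halt'})
state, tape, _ = run(tm, ('q0', {0: '1', 1: '0', 2: '1'}, 0))
print(''.join(tape.get(i, '_') for i in range(3)))  # -> 010
```

Of course, this is just a direct simulation: the question is about going further and recovering *meaningful* function decompositions from the transition table, not merely embedding it.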
Of course, this conversion can have infinitely many results – from the naming of the variables and functions to the choice of data structures, the contents of the functions, and so on. But there are metrics for software code quality, and maximizing such metrics yields a more or less well-defined answer to this problem.
Such a transformation is very relevant in the current context of reward machines for reinforcement learning (e.g., https://arxiv.org/pdf/1909.05912.pdf) – a symbolic representation of the reward function (as opposed to a tabular or deep neural representation). Such a symbolic representation considerably facilitates the transfer of competence between different tasks and introduces inference into the learning process. In this way, reward machines reduce both the amount of data and the learning time needed.
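For readers unfamiliar with the idea, a reward machine is essentially a finite-state machine over abstract event labels whose transitions emit rewards. A rough sketch (my own naming, not the paper's formalism or API):

```python
class RewardMachine:
    """A finite-state machine that maps observed events to rewards,
    giving a symbolic (automaton-based) reward representation."""

    def __init__(self, transitions, initial):
        # transitions: dict (rm_state, event) -> (next_rm_state, reward)
        self.transitions = transitions
        self.state = initial

    def step(self, event):
        """Advance on an observed event; undefined events keep the
        current state and give reward 0."""
        nxt, reward = self.transitions.get(
            (self.state, event), (self.state, 0.0))
        self.state = nxt
        return reward

# Example task: reward 1.0 only if 'key' is observed before 'door'.
rm = RewardMachine({
    ('u0', 'key'): ('u1', 0.0),
    ('u1', 'door'): ('u2', 1.0),
}, initial='u0')
print([rm.step(e) for e in ['door', 'key', 'door']])  # -> [0.0, 0.0, 1.0]
```

The connection to the question: such an automaton is exactly the kind of low-level machine one would like to lift into a readable high-level program, and conversely.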
It could be said that extracting first- and higher-order functions is a fairly difficult task, but this task is addressed by higher-order meta-interpretive learning, e.g. https://www.ijcai.org/Proceedings/13/Papers/231.pdf.
So – are there any research trends, works, results, frameworks, ideas, or algorithms for converting a Turing machine into a program in a high-level programming language (and possibly back)? I'm interested in any answer – whether through functional, logical, or imperative programming.