User:Salpynx/2-symbol-tm-conversion

From Esolang

2-symbol Grill tag

This page is intended as a straight conversion of User:ais523's Grill Tag, a 2-state 14-symbol strongly universal Turing machine, to a corresponding 2-symbol machine.

Expectation

We can convert an n-state m-symbol machine (n, m) to (n', t) (where t < m, and each original symbol is encoded as a word of w = ceil(log_t(m)) new symbols). With h the number of halt states, the new state count is

   n' = m * n + ((n * m) - h) * (w - 1) + w * 2 * n

Reasoning

  • To recognise m read cases in n states, we still need m * n states
  • To write out the original output word for each case, we need w - 1 additional states per case per original state, excluding halt states
  • We also need to move the read head more steps at the end of each output write, to locate it at the start of the next original symbol's word; this adds w * 2 * n states

(assuming the maximal situation where for each state we need to recognise every possible symbol, and write a new symbol in every case)
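As a quick sanity check of the maximal formula, here is a sketch plugging in the Grill tag parameters (n = 2, m = 14, one halt state):

```python
from math import ceil, log

n, m, h, t = 2, 14, 1, 2        # Grill tag (2, 14), one halt state, binary target
w = ceil(log(m, t))             # w = 4: each original symbol becomes a 4-bit word
n_prime = m * n + ((n * m) - h) * (w - 1) + w * 2 * n
print(n_prime)                  # 125
```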

Optimise

We can optimise by reducing this maximal case by only converting the cases which are used in the target (n, m) UTM.

Let r be the number of unique read cases we need to distinguish in the target machine, hoping that r < n * m (strictly less than, rather than equal to, which would be the maximal case). Let x be the number of unique write cases, and h the number of halts. Then

   n' = r + (x - h) * (w - 1) + w * 2 * n

The Grill tag (2, 14) machine has 26 unique read cases and 22 unique write cases in its definition, and one halt state, so with r = 26, x = 22, h = 1, m = 14, and t = 2:

n' = 105 (this is out by a small amount; I had to add an extra rule for the online interpreter... whatever)
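The optimised count can be checked directly; a sketch using the case counts taken from the Grill tag definition above:

```python
from math import ceil, log

n, m, t = 2, 14, 2              # Grill tag (2, 14), binary target
r, x, h = 26, 22, 1             # unique read cases, unique write cases, halts
w = ceil(log(m, t))             # word width: 4 new symbols per original symbol
n_prime = r + (x - h) * (w - 1) + w * 2 * n
print(n_prime)                  # 105
```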

This is a considerably less exciting result than my original just-waking-up-and-missing-important-variables maths of (12, 2) (dodeca duae Grill tag), which inspired the attempt.

Conclusion

Even if I've messed up slightly, n' is relatively large, but the original specification of the Grill tag (2, 14) UTM could be fed through a pretty simplistic converter script to reduce the symbols to 2 and multiply out the states to read and write binary 4-bit words.

Symbol-reduction conversion state count

from math import ceil, log

def conv(n, m, h=1, t=2):
    # n states, m symbols; assumes a strongly universal TM with h halt states
    r = n * m  # assume the maximal case: every state distinguishes every symbol
    w = ceil(log(m, t))  # word width of original symbols in new symbols
    return r + ((n * m) - h) * (w - 1) + w * 2 * n


A Grill tag to 2-symbol conversion seems in line with the strongly universal 2-tag simulating (3, 9) machine from the Woods and Neary (2009) survey.

  • (2, 14) -> 125 (this is the maximal case; the real value will be a bit lower)
  • (3, 9) -> 129

Now that I have a rough but working converter (see link below), the (2, 14) UTM becomes a (107, 2) machine, which at least runs the Grill tag example program identically. Why does this have so few states compared to what I predicted? Either the calculation is wrong, or there should be a greater expectation that a specific (n, m) machine will not use all of the potential combinations it could.

Comparing to other strongly universal machines from the survey:

  • The (4, 6) 2-tag simulator converts to (94, 2)
  • (5, 5) -> (103, 2)
  • (7, 4) -> (83, 2)
  • (10, 3) -> (99, 2)

The best conversion is the (6, 4) bi-tag simulator, which converts to (71, 2).
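The survey comparisons above can be reproduced with the maximal-case formula, restated here so the snippet runs standalone:

```python
from math import ceil, log

def conv(n, m, h=1, t=2):
    # Maximal-case state count when reducing an (n, m) machine to t symbols
    r = n * m                # every state distinguishes every symbol
    w = ceil(log(m, t))      # word width of original symbols in new symbols
    return r + ((n * m) - h) * (w - 1) + w * 2 * n

for n, m in [(2, 14), (3, 9), (4, 6), (5, 5), (7, 4), (10, 3), (6, 4)]:
    print((n, m), '->', (conv(n, m), 2))
```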

Bi-tag simulators seem to convert to fewer states than nearby 2-tag simulators, but are less time efficient, so we'd take a 2-tag simulator over a bi-tag if the state counts are almost the same.

None of these conversions even gets close to the (19, 2) 2-tag simulating UTM listed in the survey, which has only two symbols (who created it? where is it specified?). That is what we should look at if we are interested in small 2-symbol strongly universal Turing machines. (I have temporarily forgotten why I cared about 2 symbols in the first place. However, there does not appear to be a shortcut for generating interestingly small ones from other m-symbol machines -- as expected, and as pointed out in IRC.)

Figuring out what is meaningful about the resulting state count when reducing to the same symbol base might be an interesting next step. The extra head-readjustment move states are an obvious inefficiency suffered when reducing the symbol count and thereby increasing the meaningful word size.

External links