Esoteric units of information

Usually, when measuring amounts of data, we use the bit or its quasi-SI orders of magnitude (though they're usually based on 1024 instead of 1000, and we generally only use the positive orders). Of course, there are also the proper binary equivalents (e.g. the kibibit), but we won't go into those here. Alternatively, there is the byte/octet and its orders. However, over the years, many different alternatives to the bit and byte have been proposed, some seriously and others facetiously. This article catalogues them, along with their conversions to other units.

Background: the bit
The bit is the basic unit for encoding information in computers. It is based on base 2, or binary, as reflected in its name: "binary digit". A single bit holds 2 possibilities, typically written as 1 and 0, though it can really represent any 2-choice scenario- true/false, on/off, cats/dogs, nuclear annihilation/world peace, etc. A bit is represented in various ways by computers, most typically as magnetization or the presence of a voltage, but it could be represented by nearly anything- even billiard balls.

Bits are often grouped together, often in groups whose size is a power of 2 (1, 2, 4, 8, 16, 32, 64, 128, 256). A group of n bits can encode 2^n possibilities- usually, these are numbers in the unsigned range 0..2^n-1 or the two's-complement signed range -2^(n-1)..2^(n-1)-1, but pretty much any meaning is possible.
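The two usual ranges can be sketched in a few lines of Python (the function names here are my own, for illustration):

```python
# n bits encode 2**n possibilities; the two most common interpretations
# are the unsigned range 0 .. 2**n - 1 and the two's-complement signed
# range -2**(n-1) .. 2**(n-1) - 1.
def unsigned_range(n):
    return (0, 2**n - 1)

def signed_range(n):
    return (-2**(n - 1), 2**(n - 1) - 1)

print(unsigned_range(8))  # (0, 255)
print(signed_range(8))    # (-128, 127)
```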

Binary (boolean) logic and gates
Bits alone are practically- nay, literally- useless. In fact, bits with nothing to interpret them can encode literally anything, with no way to figure out what they mean. To solve this problem, we have computers that actually interpret bits. Computers interpret bits using transistors, which implement things called "gates", which represent Boolean logic.

A Boolean logic gate accepts a collection of bits as "inputs" and returns another collection of bits as "outputs". There is exactly 1 possible gate with 1 input and 0 outputs (or, more generally, with any number of inputs and 0 outputs). This is because, when you have 0 outputs, there is no way to encode any result.

For gates with 0 inputs and 1 output, there are 2 possibilities: the TRUE0 gate and the FALSE0 gate, which return 1 and 0 at all times, respectively. Usually, we don't use these.

The most typically used gates have i > 0 inputs and o > 0 (usually o = 1) outputs. For i = 1, o = 1, there are already 4 possible gates, only 1 of which is ever really used or is at all interesting:
 * The TRUE1 gate returns 1 regardless of its input.
 * The FALSE1 gate returns 0 regardless of its input.
 * The IS gate returns its input unchanged.
 * The NOT gate (the interesting one): Returns the logical negation of its input- 0 becomes 1 and 1 becomes 0. Technically, with nothing but NOT gates and OR gates, you can build any circuit (even better, NOR lets you do it alone).
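The NOT-plus-OR completeness claim follows from De Morgan's law: AND(a, b) = NOT(OR(NOT(a), NOT(b))). A minimal Python sketch (function names are my own):

```python
# Sketch: NOT and OR together are functionally complete.
# By De Morgan's law, AND can be built from them alone.
def NOT(a):
    return 1 - a

def OR(a, b):
    return 1 if a or b else 0

def AND(a, b):
    # AND(a, b) = NOT(OR(NOT(a), NOT(b)))
    return NOT(OR(NOT(a), NOT(b)))

# Check the derived AND against the usual truth table.
for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
```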

For larger numbers of inputs, we will only discuss the interesting gates.

For i=2, o=1, there are 6 commonly used gates. These are:
 * AND: Returns 1 iff both of its inputs are 1, else returns 0. This is similar to intersection of sets, or logical conjunction of logical formulae.
 * OR: Returns 1 if any of its inputs is 1, else returns 0.
 * XOR, Exclusive-Or, or NEQ: Returns 1 iff EXACTLY 1 of its inputs is 1. Sometimes called "NEQ" because it returns 1 iff its inputs are not equal.
 * NAND: Negated AND gate.
 * NOR: Negated OR gate. Interesting because pretty much any gate can be built from it.
 * XNOR or EQ: Negated Exclusive-Or gate.
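The six gates above can be sketched in Python, along with a demonstration of NOR's universality (all names here are my own, not standard library functions):

```python
# The six common 2-input gates, on bits represented as ints 0/1.
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b
def NAND(a, b): return 1 - (a & b)
def NOR(a, b):  return 1 - (a | b)
def XNOR(a, b): return 1 - (a ^ b)

# NOR alone can rebuild the others, which is why it is "universal":
def NOT_via_nor(a):    return NOR(a, a)
def OR_via_nor(a, b):  return NOT_via_nor(NOR(a, b))
def AND_via_nor(a, b): return NOR(NOT_via_nor(a), NOT_via_nor(b))

# Verify the NOR-built gates against the direct ones.
for a in (0, 1):
    for b in (0, 1):
        assert OR_via_nor(a, b) == OR(a, b)
        assert AND_via_nor(a, b) == AND(a, b)
```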

trit
A trit, standing for trinary digit (though the correct term is "ternary"), is a unit of data that can hold 3 possibilities- often {-1, 0, 1}, {0, 1, 2}, or {true, false, null}. A single trit is equal to approximately 1.58496250 bits (log2(3)), or roughly 1.09861229 nats (ln(3)). An example of a language that uses trits is TriINTERCAL.
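The conversion factors are just logarithms, as a quick Python check shows (constant names are my own):

```python
import math

# One trit holds 3 possibilities, so it carries log2(3) bits or ln(3) nats.
TRIT_IN_BITS = math.log2(3)  # ≈ 1.58496250
TRIT_IN_NATS = math.log(3)   # ≈ 1.09861229

print(round(TRIT_IN_BITS, 8))  # 1.5849625
print(round(TRIT_IN_NATS, 8))  # 1.09861229
```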

nat (or nit or nepit)
The nat is often used in physics for measuring information and entropy. It is, as the name implies, based on the base of the natural logarithm- that is, e ≈ 2.71828182.... Nats are useful because they form the natural unit for information entropy: physical systems that normalize Boltzmann's constant to 1 effectively measure entropy (the thermodynamic kind) directly in nats, whatever that means.

The nat is interesting because, as with many other esoteric units here, it uses a non-integer (in fact, an irrational number) as its base. As it uses base e, it can encode roughly 2.71828182 combinations for every nat, and e^n combinations for every group of n nats. Since data is based on logarithms and powers, you can use the natural logarithm to calculate exactly how many nats a given number of combinations requires. A simple table of nat information can be found at Esoteric units of information/nat table, because why not.
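Converting between nats and bits is just a change of logarithm base; a small Python sketch (function names are mine):

```python
import math

# A group of n nats encodes e**n combinations.
def combinations(n_nats):
    return math.e ** n_nats

# 1 nat = 1/ln(2) ≈ 1.442695 bits, since log2(x) = ln(x)/ln(2).
def nats_to_bits(n_nats):
    return n_nats / math.log(2)

# And the inverse: how many nats does a given number of combinations take?
def nats_needed(n_combinations):
    return math.log(n_combinations)

print(round(nats_to_bits(1), 6))  # 1.442695
```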

Natural logic and gates

 * section incomplete

circ
A circ is closer to a trit than to a bit. It is analogous to the nat, but based on pi instead of e. No real uses are known yet.
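Following the same logarithm pattern as the trit and nat, the circ's conversion factors would be (constant names are my own, and these values are implied by the definition rather than stated in the article):

```python
import math

# A circ holds pi possibilities, so:
CIRC_IN_BITS = math.log2(math.pi)  # ≈ 1.6514961 bits per circ
CIRC_IN_NATS = math.log(math.pi)   # ≈ 1.1447299 nats per circ

# Sanity check: pi sits between 3 and e... no wait, pi > e > 3 is false;
# e ≈ 2.718 < 3 < pi ≈ 3.142, so a circ is slightly bigger than a trit.
print(round(CIRC_IN_BITS, 7))  # 1.6514961
```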

TIB and STIB
The TIB is equal to 1/2 of a bit- that is, sqrt(2) possibilities.
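The sqrt(2) figure follows from the half-bit definition, since n bits encode 2^n possibilities (a quick check; the names are my own):

```python
import math

# A TIB is half a bit, so it encodes 2**0.5 = sqrt(2) possibilities.
TIB_IN_BITS = 0.5
TIB_POSSIBILITIES = 2 ** TIB_IN_BITS

print(round(TIB_POSSIBILITIES, 8))  # 1.41421356
```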