DeepLang
Paradigm(s) | Declarative
Designed by | User:Hakerh400 |
Appeared in | 2025 |
Computational class | Turing complete |
Major implementations | Implemented |
File extension(s) | .txt |
DeepLang is an esolang invented by User:Hakerh400 in 2025.
Overview
The DeepLang interpreter uses AI to deterministically generate a Python program that satisfies the user input (the source code), then executes that program.
In particular, it uses the model DeepSeek-R1-0528 via a free Hugging Face API token. The source code of a DeepLang program is passed as the user message. The parameters temperature and top_p are both set to 0 in order to achieve total determinism. When the AI response is received, the interpreter finds all Python code blocks in the response that lie outside of the think tags. If there are no Python blocks, or more than one block, it throws an error. Otherwise, it executes that block as Python code.
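For clarity, the extraction step can be sketched as follows. This is only an illustration of the rule described above, assuming the raw model reply is already available as the string reply; the full interpreter is given in the Implementation section.

import re

def extract_program(reply):
    # Remove everything inside <think>...</think> tags.
    visible = re.sub(r'<think>.*?</think>', '', reply, flags=re.DOTALL)
    # Collect the remaining ```python ... ``` code blocks.
    blocks = re.findall(r'```python\n(.*?)\n```', visible, re.DOTALL)
    # Exactly one block is required; anything else is an error.
    if len(blocks) != 1:
        raise ValueError(f"Expected exactly one python block, found {len(blocks)}")
    return blocks[0]

# The extracted block is then run with exec(extract_program(reply)).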
Examples
Hello, World!
Write hello world in python.
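With temperature and top_p set to 0, the synthesized program should be stable across runs. The model's exact output is not reproduced here, but a program satisfying this specification would simply be:

print("Hello, World!")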
Add two numbers
Write a python program that asks user for two numbers and outputs their sum.
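A plausible program satisfying this specification (not necessarily the model's literal output) is:

a = float(input("Enter the first number: "))
b = float(input("Enter the second number: "))
print("Sum:", a + b)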
Truth machine
Write a python program that asks user for input. If the input is 0, output 0 and terminate the program. Otherwise keep printing number 1 indefinitely.
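A program meeting this specification looks roughly like the following; the model's actual output may differ in details such as prompts and formatting.

s = input()
if s == "0":
    print(0)
else:
    while True:
        print(1)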
Self-interpreter
Write a python program that uses InferenceClient's chat_completion from huggingface_hub to remotely run model "deepseek-ai/DeepSeek-R1-0528" (don't change model name). Set temperature and top_p both to 0. Set max_tokens to 128 thousand. Do not add other parameters. The main function reads the user message from file "src.txt". When receive a reply from the server, do the following. First remove everything between "<think>" and "</think>" tags. Then locate python code that is surrounded by three backticks, then "python", then new line, then python code, then new line, then three backticks again. If the number of such blocks is not 1, throw an error. Otherwise, interpret that as python code. Do not add comments in the code.
Implementation
The following Python code was produced by the self-interpreter from the Examples section. In order to run it, add a valid API token as a parameter to the InferenceClient constructor.
from huggingface_hub import InferenceClient
import re

def main():
    with open("src.txt", "r") as file:
        user_message = file.read()
    client = InferenceClient(model="deepseek-ai/DeepSeek-R1-0528")
    response = client.chat_completion(
        messages=[{"role": "user", "content": user_message}],
        temperature=0,
        top_p=0,
        max_tokens=128000
    )
    content = response.choices[0].message.content
    content_clean = re.sub(r'<think>.*?</think>', '', content, flags=re.DOTALL)
    code_blocks = re.findall(r'```python\n(.*?)\n```', content_clean, re.DOTALL)
    if len(code_blocks) != 1:
        raise ValueError(f"Expected exactly one code block, found {len(code_blocks)}")
    code_to_run = code_blocks[0]
    exec(code_to_run)

if __name__ == "__main__":
    main()
Background
While some view AI-driven code generation as circumventing traditional programming practices, DeepLang repositions this approach as a deterministic programming paradigm: the AI model functions as an advanced program synthesizer, transforming high-level specifications into executable implementations. This parallels type inference systems—such as Hindley-Milner unification—where compilers deduce types without explicit annotations. Here, the AI acts analogously, inferring computational logic from abstract user input. Python was selected as the target language due to the observed proficiency of the underlying DeepSeek-R1 model in generating syntactically valid and functionally accurate Python code.
Computational class
DeepLang is apparently Turing complete. Here is a brainfuck interpreter written in DeepLang:
Write a brainfuck interpreter in python.
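The model's reply for this program is not reproduced here. For reference, a Python brainfuck interpreter satisfying the specification looks roughly like the sketch below; the demo program at the end is the classic brainfuck "Hello World!".

def run(program, data=""):
    # Precompute matching bracket positions.
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    tape = [0] * 30000
    ptr = pc = inp = 0
    out = []
    while pc < len(program):
        c = program[pc]
        if c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == ',':
            # Read one byte of input, or 0 when input is exhausted.
            tape[ptr] = ord(data[inp]) % 256 if inp < len(data) else 0
            inp += 1
        elif c == '[' and tape[ptr] == 0:
            pc = jumps[pc]
        elif c == ']' and tape[ptr] != 0:
            pc = jumps[pc]
        pc += 1
    return ''.join(out)

if __name__ == "__main__":
    hello = ("++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]"
             ">>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.")
    print(run(hello), end="")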