"Programming in Lua", taking the gas out of the current Large Language Model AI bubble without realizing it or even trying to.
10.2 – Markov Chain Algorithm
Our second example is an implementation of the Markov chain algorithm. The program generates random text, based on what words may follow a sequence of n previous words…
…After building the table, the program uses the table to generate random text, wherein each word follows two previous words with the same probability of the base text. As a result, we have text that is very, but not quite, random.
https://www.lua.org/pil/10.2.html
All LLMs are doing is this same trick from Chapter 10.2, scaled up massively. The output is still just randomly generated text; it simply has a high probability of looking like text the model has already seen. It's a regurgitating bullshit generator that often exceeds our ability to spot the randomness, which makes LLMs little more than convincing liars.
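For the curious, the trick from 10.2 fits in a few lines. This is a minimal sketch of my own (in Python rather than the book's Lua; the function names and sample text are mine, not PIL's): build a table mapping each two-word prefix to the words that followed it in the base text, then walk the table, picking successors at random.

```python
import random
from collections import defaultdict

def build_table(words, n=2):
    # Map each n-word prefix to the list of words that follow it in the text.
    # Repeated followers appear repeatedly, so random.choice reproduces
    # the base text's frequencies.
    table = defaultdict(list)
    for i in range(len(words) - n):
        prefix = tuple(words[i:i + n])
        table[prefix].append(words[i + n])
    return table

def generate(words, count, n=2, seed=None):
    rng = random.Random(seed)
    table = build_table(words, n)
    prefix = tuple(words[:n])       # start from the text's opening prefix
    out = list(prefix)
    for _ in range(count):
        followers = table.get(prefix)
        if not followers:           # hit a prefix with no recorded successor
            break
        nxt = rng.choice(followers)
        out.append(nxt)
        prefix = prefix[1:] + (nxt,)  # slide the window forward one word
    return " ".join(out)

base = "the quick brown fox jumps over the quick brown dog".split()
print(generate(base, 8, seed=1))
```

Every word the generator emits is a word that genuinely followed that two-word prefix somewhere in the base text; the "creativity" is only in which recorded successor the dice pick.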
And burning metric fucktons of fossil fuels. (The real point of AI in 2025.)