From AI Hype to Markov Chains: A Return to Basics

2025-09-24

The author recounts their journey through four stages of the AI hype cycle around large language models: initial amazement, subsequent frustration, persistent confusion, and ultimate boredom. Tired of the constant stream of new models, the author decided to return to fundamentals and explore Markov chains. The article details how to build text autocompletion with Markov chains, covering the construction of transition matrices, the probability calculations behind them, and their application to text generation. Beyond the mechanics of Markov chains, the piece reflects on the current state of AI development and the author's desire to explore more foundational technologies.
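
To make the technique concrete, here is a minimal sketch of word-level Markov-chain autocompletion of the kind the article describes. It is not the author's code; the tiny corpus, function names, and word-level tokenization are illustrative assumptions, and a real autocompleter would be trained on far more text.

```python
import random
from collections import defaultdict

def build_transitions(text):
    """Build a sparse transition matrix: P(next word | current word)."""
    words = text.split()
    counts = defaultdict(lambda: defaultdict(int))
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1
    # Normalize each row of counts into probabilities.
    transitions = {}
    for current, followers in counts.items():
        total = sum(followers.values())
        transitions[current] = {w: c / total for w, c in followers.items()}
    return transitions

def autocomplete(transitions, start, length=10):
    """Generate a continuation by repeatedly sampling the next word."""
    word, output = start, [start]
    for _ in range(length):
        followers = transitions.get(word)
        if not followers:
            break  # No observed transitions from this word.
        word = random.choices(list(followers), weights=list(followers.values()))[0]
        output.append(word)
    return " ".join(output)

# Toy corpus for illustration only.
corpus = "the cat sat on the mat and the dog sat on the couch"
chain = build_transitions(corpus)
print(autocomplete(chain, "the"))
```

Each generated word depends only on the current word, which is exactly the Markov property; richer completions come from conditioning on longer prefixes (higher-order chains) at the cost of a larger transition table.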

AI

Local-First Software: Scaling Without the Headache

2025-07-05

Harper, a local-first grammar checker, saw a massive surge in users after hitting the front page of Hacker News. Unlike server-dependent software, Harper runs entirely on the user's device, so there was no server load to worry about: even with the influx, there were no hiccups or latency issues. This highlights a key scalability advantage of local-first software; it avoids the high costs of server maintenance and complex cloud architectures.

Development, server load

Prompting LLMs in Bash Scripts: The ofc Tool

2025-03-02

A new tool, ofc, simplifies integrating Ollama LLMs into bash scripts. It allows system prompts to be swapped easily, enabling comparison of model behavior across different prompts. The author demonstrates its use in generating datasets for testing Harper, and even has the LLM generate its own prompts for deeper analysis. Installation is straightforward via cargo.

Development, Bash Scripting