Logs

"we can improve the accuracy of nearly any kind of machine learning algorithm by training it multiple times, each time on a different random subset of the data, and averaging its predictions" (Fastbook)
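This is the idea behind bagging. A toy sketch with NumPy (the data, model, and ensemble size are made up for illustration, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy linear data
x = np.linspace(0, 1, 200)
y = 3 * x + 1 + rng.normal(0, 0.5, size=x.shape)

n_models = 25
preds = []
for _ in range(n_models):
    # Bootstrap sample: a random subset drawn with replacement
    idx = rng.integers(0, len(x), size=len(x))
    # Fit a simple model (here, a line) to that subset
    coeffs = np.polyfit(x[idx], y[idx], deg=1)
    preds.append(np.polyval(coeffs, x))

# Average the predictions of all models
ensemble = np.mean(preds, axis=0)
```

By convexity of squared error, the averaged prediction's error is never worse than the average error of the individual models.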

I wanted to get more hands-on with the language model trained in chapter 12 of the FastAI course, so I got some Google Colab credits and actually ran the training on an A100. It cost about $2.50 and...

The following code allowed me to successfully download the IMDB dataset with fastai to a Modal volume:
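A minimal sketch of what that looks like; the app name, volume name, mount path, and use of `FASTAI_HOME` are my assumptions here, not necessarily the original code:

```python
import modal

# Hypothetical app/volume names for illustration
app = modal.App("fastai-imdb")
vol = modal.Volume.from_name("imdb-data", create_if_missing=True)
image = modal.Image.debian_slim().pip_install("fastai")

@app.function(image=image, volumes={"/data": vol})
def download_imdb():
    import os
    # Point fastai's download cache at the mounted volume (assumption:
    # FASTAI_HOME controls where untar_data stores its archives)
    os.environ["FASTAI_HOME"] = "/data"
    from fastai.text.all import URLs, untar_data

    path = untar_data(URLs.IMDB)  # download and extract the IMDB dataset
    vol.commit()  # persist the files to the Modal volume
    return str(path)
```

Run it with `modal run` against the script; the dataset then lives on the volume for later training runs.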

Jon built an interesting blog on top of Cloudflare Workers and KV.

Ran several experiments using local LLMs (~7B-parameter models) like llama3.2 and phi3 to generate a random number between 1 and 100. The exact prompt was

Reading a bunch. Also got inspired to play around with generating random numbers with language models across different temperatures to see their distributions.
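A toy illustration of why temperature changes those distributions (not the actual experiment; the logits below are made up): temperature divides the model's logits before the softmax, so low temperatures sharpen the distribution toward the argmax and high temperatures flatten it toward uniform.

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Made-up logits standing in for four candidate numbers
logits = np.array([2.0, 1.0, 0.5, 0.1])

cold = softmax_with_temperature(logits, 0.1)  # nearly all mass on the argmax
hot = softmax_with_temperature(logits, 10.0)  # close to uniform
```

Sampling at several temperatures and histogramming the outputs is one way to see the distributions the entry above describes.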

There is an enormous amount of jargon in deep learning, including terms like rectified linear unit. The vast majority of this jargon is no more complicated than can be implemented in a short...
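The rectified linear unit is a good example: the whole thing is `max(0, x)`.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: zero out negative values, pass positives through
    return np.maximum(0, x)

relu(np.array([-1.0, 0.0, 2.0]))  # → array([0., 0., 2.])
```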

Reading fastbook, I get the sense we could teach math more effectively if we did so through spreadsheets and Python code.

Current theory on why nbdev and notebooks in general make sense and can work: Writing code for most software is actually pretty similar to writing code for models, but usually you pay less of a cost...