
This is the story of how a single image defined the engineering constraints of the early internet and continues to haunt the ethics of dataset curation. At the University of Southern California’s Signal and Image Processing Institute (SIPI), assistant professor Alexander Sawchuk needed a high-contrast, high-detail image to scan for a colleague’s conference paper. The lab’s flatbed scanner (one of the first) was crude: 100 lines per inch, 6 bits per pixel.
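To get a feel for how coarse that scanner was: 6 bits per pixel allows only 64 gray levels, versus the 256 levels of today's standard 8-bit images. A minimal sketch of that quantization step (the helper name `quantize_6bit` is mine, for illustration):

```python
def quantize_6bit(sample: int) -> int:
    """Map an 8-bit gray sample (0-255) down to 6 bits (0-63),
    roughly what a 6-bit scanner ADC would have captured."""
    return sample >> 2  # drop the two least significant bits

# 6 bits per pixel -> only 2**6 = 64 distinguishable gray levels
levels = 2 ** 6
print(levels)                 # 64
print(quantize_6bit(255))     # brightest 8-bit value maps to 63
```

Banding from so few gray levels is one reason a glossy, high-detail photograph made an attractive test target.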

In 2018, Nature and the IEEE officially discouraged the use of Lena, and Computer Vision and Image Understanding banned new submissions using the image. Today's models (CLIP, DALL-E, MobileNet) are trained on billions of images from datasets such as LAION-5B or ImageNet-22k, so Lena is irrelevant for training. Yet she remains the unit test: the minimal reproducible example.

But how did a glossy magazine photograph become the benchmark for the algorithms that compress, stream, and recognize images on every modern website?
