2025

Analyzing 2,200 Mods to Design an Archive Format For Modding

As part of the Reloaded3 project, I am designing a new archive format to serve as a container for game mods, both for distribution and for loading assets.

As discussed in my previous article on texture compression, this format is built to satisfy four key requirements:

  • Read-Only Virtual Filesystem: Make games read files from the archive as if they were on disk.
  • Efficient Distribution: Minimizing size and supporting streaming downloads.
  • Legacy Replacement: Capable of replacing native game archives with superior performance.
  • High-Speed Archival: Decompression speeds matching modern NVMe drives (GB/s).

To do this, I sat down and wrote a quick tool to analyze the existing mods that are out there. I wanted to look at all Reloaded-II mods (or as close as possible). The easiest way to do that was the Reloaded-II.Index, the same index that the built-in mod browser pulls its data from.

This resulted in a dataset of 2,197 unique packages after excluding duplicates and the like.
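For flavour, the deduplication step boils down to keeping one entry per package ID. Below is a minimal sketch with a hypothetical `Package` record; the real index entries and version ordering logic differ.

```rust
use std::collections::hash_map::Entry;
use std::collections::HashMap;

/// Hypothetical package record; real Reloaded-II.Index entries differ.
struct Package {
    id: String,      // unique package identifier
    version: String, // version string
}

/// Keeps one entry per package ID, preferring the later version.
/// Plain string comparison stands in for proper semver ordering.
fn dedupe(packages: Vec<Package>) -> Vec<Package> {
    let mut latest: HashMap<String, Package> = HashMap::new();
    for pkg in packages {
        match latest.entry(pkg.id.clone()) {
            Entry::Occupied(mut slot) => {
                if pkg.version > slot.get().version {
                    slot.insert(pkg); // replace with the newer version
                }
            }
            Entry::Vacant(slot) => {
                slot.insert(pkg);
            }
        }
    }
    latest.into_values().collect()
}
```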

Batching Tool Calls: Racing Against Cerebras Rate Limits & Too Fast Text Generation

This post talks about LLMs

I don't usually talk about AI or LLMs. Too much stigma, too many grifters, too much slop.

Let's be clear: LLMs cannot and should not replace you today - an LLM is a tool (like any other) that can only aid you. The quality of your code is only as good as you are; LLMs lack good 'taste' and thus will produce slop by default. That goes for writing code just as much as for other uses. I say this as someone who has to clean up that slop daily, both my own and what arrives in PRs.

It's your responsibility to judge and produce good quality code at the end of the day. That includes ensuring that a dumb silicon-made machine doesn't output crap.

That said, AI can be a powerful tool to speed up work - if you know what you're doing.
This post is about documenting an optimization experiment, that's all.

Aside from the graph (38 lines), everything was (of course) written by me, Sewer, by hand.

Two months ago I ran an experiment to optimize LLM-based coding workflows at extreme speeds.

The findings were originally shared as a Discord post, but I never got around to posting them here. Today, with some free time, I've reformatted it as a more proper blog post with additional context.

A Program for Helping Create Lossless Transforms

Making the whole process easier with a tool.

Part 3 of Texture Compression in Nx2.0 series.

This one is a bit of a detour; the previous parts covered making BC1-BC3 texture data more compressible (Part 1) and estimating whether a transform actually helps (Part 2).

While the BC1-BC3 transforms were fairly straightforward, formats like BC6H and BC7 massively increase complexity. Experimenting with different ways to transform them takes a lot of time.

To help with those formats, and many others, I built a tool for defining and comparing transforms.

Estimating Compressibility of Data & BC7

Will my transform make data more compressible?

Part 2 of Texture Compression in Nx2.0 series.

In my previous post in the series, I demonstrated a recipe for making BC1-BC3 texture data more compressible: a ~10% saving at a blazing ~60GB/s on a single thread.

That transform is usually beneficial, but there will be rare cases where it isn't.

With more complex files, such as files with multiple distinct sections, you may want to apply transforms on a per-section basis, or even skip individual steps of a transform.

But how do we know if a transform is beneficial or not?
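The crude baseline is to simply compress both versions with a fast compressor and compare output sizes. Real compression is expensive next to a ~60GB/s transform though, which is exactly why cheaper estimation is worth exploring. Below is a minimal sketch of that baseline using the `zstd` crate; the function name and approach are mine, not the estimator this post builds.

```rust
/// Returns true if `transformed` compresses smaller than `original`.
/// `level` is the zstd compression level; level 1 keeps the check fast.
fn transform_helped(original: &[u8], transformed: &[u8], level: i32) -> std::io::Result<bool> {
    // Compress both buffers and compare the output sizes. Fine for
    // offline spot checks; far too slow for a per-file hot path.
    let before = zstd::bulk::compress(original, level)?.len();
    let after = zstd::bulk::compress(transformed, level)?.len();
    Ok(after < before)
}
```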

Shrinking Old Game Texture Sizes by 10%, at ~60GB/s

On a single core, and a 4-year-old machine...

Without a loss in quality.

Part 1 of Texture Compression in Nx2.0 series.

So I'm in the process of building an archive format (👈 WIP) suitable for game modding as part of the Reloaded3 project.

I aim for the following primary use cases:

  1. Read-Only Filesystem: Something suitable for a reloaded redirector / usvfs style project.
    • Hooking NT / POSIX API calls to trick processes into reading files from another place (see the sketch after this list).
    • Except we're reading from an archive.
    • If we load/decompress on all threads, we load data faster, using less disk space.
  2. File Downloads: Mods need fast file downloads.
    • Support streaming/partial download of archive contents.
    • Minimize file size.
    • User downloads less, mod site needs less traffic.
    • Everyone is happy.
  3. As a game archive format: Replace old games' native archives with the new format.
    • Through hooking, we can replace a native archive format with our own.
    • For size, performance, and just because you can.
  4. Medium Term Archival: I want to save some mods on disk for future use.
    • Basically as a general purpose archive format.
    • For non-Reloaded3 software, to archive currently disabled mods to save disk space.
    • The archive format decompresses so fast that extraction speed is limited by your storage drive.
    • So like 8GiB/s on a modern NVMe.

The last one is mainly to seek adoption.
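To make item 1 concrete: the decision inside a hooked file-open call boils down to a path lookup against the files we virtualize. Here's a minimal sketch with hypothetical types; the real hooks, archive layout, and decompression machinery are far more involved.

```rust
use std::collections::HashMap;

/// Hypothetical record of where a file lives inside the mod archive.
struct ArchiveEntry {
    offset: u64, // byte offset of the compressed data within the archive
    size: u64,   // decompressed size of the file
}

/// Outcome of intercepting a file-open call.
enum Resolution<'a> {
    /// One of ours: serve the read from the archive instead of disk.
    FromArchive(&'a ArchiveEntry),
    /// Not ours: let the original NT / POSIX call proceed untouched.
    PassThrough,
}

/// The core decision made inside a hooked NtCreateFile/open call.
fn resolve<'a>(path: &str, index: &'a HashMap<String, ArchiveEntry>) -> Resolution<'a> {
    match index.get(path) {
        Some(entry) => Resolution::FromArchive(entry),
        None => Resolution::PassThrough,
    }
}
```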

Today, we're going to be unraveling one of the tricks used to achieve these goals: faster texture loads and smaller sizes via texture transformation.

I'll speedrun you through the basics of the simplest (BC1 / DXT1) transform, and how you can write similar transforms to improve compression ratios of known data.

I'll keep it simple, no prior compression theory/experience required.
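To give you a taste before the demo: each 8-byte BC1 block stores two 16-bit colour endpoints followed by 4 bytes of 2-bit texel indices, and the heart of the transform is splitting those into separate streams so similar data sits together. A minimal scalar sketch follows; the real implementation is considerably more optimized.

```rust
/// Splits BC1 data (8-byte blocks: two 16-bit colour endpoints, then
/// 4 bytes of 2-bit texel indices) into separate colour and index
/// streams. Grouping similar bytes together helps a general purpose
/// compressor find more matches. The transform is lossless: simply
/// interleave the two streams back to restore the original data.
fn split_bc1(blocks: &[u8]) -> (Vec<u8>, Vec<u8>) {
    assert!(blocks.len() % 8 == 0, "BC1 data is a multiple of 8 bytes");
    let mut colours = Vec::with_capacity(blocks.len() / 2);
    let mut indices = Vec::with_capacity(blocks.len() / 2);
    for block in blocks.chunks_exact(8) {
        colours.extend_from_slice(&block[0..4]); // color0 + color1
        indices.extend_from_slice(&block[4..8]); // 16 x 2-bit indices
    }
    (colours, indices)
}
```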

As an appetizer, a quick demo follows below.