PNG 5th Edition Roadmap

TPAC 2025

These slides can be found at programmax.net/talks/png-5th-edition-roadmap

Thank you

  • W3C
  • Hakim El Hattab, for reveal.js (slide software)
  • Sandflow Consulting LLC, for the Pareto front image
  • Brooke Vibber, for parallel PNG diagrams

Copyright notices are below.

Goal?

  • What is PNG's goal?
  • When is it used?

Remember your answer for later.

General themes:

  • 3rd Edition—Bring it up to date
    • HDR, animation, …
  • 4th Edition—Compatibility
    • HDR image on SDR display
  • 5th Edition—Improve compression

A timeline of PNG spec development

Work on 4th and 5th Editions is being done in parallel.

Developers & users want better compression

Prior to 3rd Edition's launch, several people filed issues with ways to improve compression.

After 3rd Edition launched, we received lots of community feedback wanting improved compression.

Not only file size

(De)compression speed is also important, creating a Pareto front between file size and speed.

A graph of various image decoders showing file size compared to decode speed

Parallelization

Background

  • PNG uses the DEFLATE compression algorithm (often zlib implementation).
  • DEFLATE builds a history as it decodes
    • Cannot jump to the middle
    • Inherently serial
  • DEFLATE supports restart markers, which abandon history.
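The restart-marker behavior above can be sketched with zlib's `Z_FULL_FLUSH`, which byte-aligns the output and abandons the match history, so everything after the flush decompresses on its own. A minimal sketch (raw deflate via `wbits=-15` is used here for simplicity; PNG's actual IDAT stream uses the zlib wrapper):

```python
import zlib

def compress_with_restarts(chunks):
    """Compress chunks into one deflate stream with a restart after each."""
    co = zlib.compressobj(9, zlib.DEFLATED, -15)  # raw deflate, no header
    segments = []
    for chunk in chunks:
        # Z_FULL_FLUSH byte-aligns the stream and resets the history.
        segments.append(co.compress(chunk) + co.flush(zlib.Z_FULL_FLUSH))
    segments[-1] += co.flush(zlib.Z_FINISH)  # terminate the stream
    return segments

def decompress_segment(segment):
    """Decode one segment with no knowledge of the segments before it."""
    return zlib.decompressobj(-15).decompress(segment)
```

Because each segment starts with an empty history, a fresh decompressor can begin at any restart point, which is exactly what makes parallel decoding possible.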

pigz

pigz (pronounced "pig-zee") was written by the original zlib author to add parallelization.

pigz header

mtpng uses the pigz approach for PNGs
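The pigz/mtpng idea can be sketched as compressing blocks independently on worker threads and joining them at full-flush boundaries. This is a simplified sketch: pigz additionally primes each block's dictionary with the previous 32 KiB of input to limit the compression loss, which is omitted here.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def _compress_block(args):
    block, is_last = args
    co = zlib.compressobj(9, zlib.DEFLATED, -15)  # raw deflate, no header
    out = co.compress(block)
    # Non-final blocks end with a full flush (byte-aligned, history reset),
    # so the concatenation of all worker outputs is one valid deflate stream.
    out += co.flush(zlib.Z_FINISH if is_last else zlib.Z_FULL_FLUSH)
    return out

def parallel_deflate(data, block_size=128 * 1024):
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    args = [(b, i == len(blocks) - 1) for i, b in enumerate(blocks)]
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(_compress_block, args))
```

The result decompresses with any ordinary DEFLATE decoder, which is why this approach stays backwards compatible.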

Parallel PNG writing diagram

Notice: only 2 threads for decoding

Parallel PNG reading diagram

Small spec addition—N-thread decoding

New PNG chunk

A new PNG chunk tracks DEFLATE restart marker positions.

Threads start decoding at the restart markers.

  • Cheap
  • Easy
  • Backwards compatible
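As a reading-side sketch: the chunk name and field layout are not yet specified, so the layout below (a payload of big-endian 64-bit byte offsets into the concatenated IDAT data) is purely hypothetical.

```python
import struct

def parse_restart_offsets(payload):
    """Hypothetical layout: big-endian u64 byte offsets, one per restart."""
    count = len(payload) // 8
    return list(struct.unpack(f">{count}Q", payload))

def split_at_restarts(idat_data, offsets):
    """Slice the compressed data so each slice can go to its own thread."""
    bounds = offsets + [len(idat_data)]
    return [idat_data[a:b] for a, b in zip(bounds, bounds[1:])]
```

A decoder that does not recognize the chunk simply ignores it and decodes serially, which is what keeps the addition backwards compatible.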

Area/Region of Interest

Large images (think satellites) benefit from this, too.

Viewing a 1Kx1K region of a 10Mx10M image is faster with carefully placed restart markers.
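One way to see the benefit: if each restart marker is recorded along with the first image row its segment covers, a reader can fetch and decode only the segments overlapping the region of interest. A sketch (the row bookkeeping is an assumption, not spec text, and PNG row filtering across segment boundaries is a separate concern ignored here):

```python
import bisect

def segments_for_rows(segment_first_rows, y0, y1):
    """Indices of restart segments whose rows overlap [y0, y1].

    segment_first_rows[i] is the first image row decoded by segment i
    (assumed sorted ascending, starting at 0).
    """
    first = bisect.bisect_right(segment_first_rows, y0) - 1
    last = bisect.bisect_right(segment_first_rows, y1) - 1
    return list(range(first, last + 1))
```

For a small region of a huge image, this touches a handful of segments instead of the whole stream.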

Open questions

  • How small / large should a piece be?
    • Too small = compression impacted
  • How fast are decodes?
    • Fast decoding = fewer threads in flight
    • Slow network = threads unutilized

Use cases

  • Browsers process packets upon arrival.
  • Non-browser programs ~always load the entire file before processing.

Research to answer the questions

A GitHub page for FileReadSpeedTest

Research to answer the questions

A screenshot of ImageInternals

Existing carve-outs

Add other compression methods

From the PNG spec: "Other values of compression method are reserved for future standardization."

Research needed

  • How do those compressors compare?
  • Is the implementation burden worth it?

Radical changes

Recall users want better compression

Everything so far is good. But it might still underwhelm.

These would likely put us in close contention with WebP.

To win hearts & minds of end users, I want to do better.

Compression is really 3 things

  • Compression algorithm
  • Tuning the data to the compressor
  • Tuning the data to the end user

PNG is well known for being lossless

  • Only two of those three benefits
    • Cannot tune to end user
  • File size will never compete
  • Web is often final presentation (end user)

PNG's goal?

Remember earlier, when I asked you what is PNG's goal and when is it used?

Is PNG an archival format? Lossless makes sense.

Is PNG an end user web format? Lossy makes sense.

It can be both

WebP and JPEG XL support both lossy and lossless.

The way they do lossless is nearly identical to how PNG works.

PNG's uses?

For PNG's uses, did you think of logos and icons?

Do those really need to be 100% lossless?

Many lossy formats target photos. There is a gap.

Too different? New format?

If PNG is known as lossless, perhaps this fits into PNG2.

Q&A