https://github.com/ezyang/tlparse – or pip install tlparse
Ed Yang’s Torch Logs Parser is used heavily within Meta, where it has a bunch of extra integrations that make it even more helpful. It’s still useful everywhere else when working with torch.compile, particularly when approaching a more complex model that generates a lot of log output, or when trying to get a feel for performance issues.
Basic usage is just:
TORCH_TRACE=/tmp/my_traced_log python module.py
tlparse /tmp/my_traced_log/filename.log -o tl_out/ --overwrite
The result breaks the log down into a number of easier-to-consume sections for each time the analysis restarted, graph breaks, etc., and gives you Chromium perf trace files for looking at performance.
