Description
Graph pooling is an active research area, but comparing pooling operators remains difficult when each method comes with its own interface, tensor conventions, losses, and batching assumptions. Torch Geometric Pool (tgp) is an open-source library built on PyTorch Geometric to make these comparisons easier, more reproducible, and less dependent on one-off experimental code. It provides a shared interface for 20 hierarchical pooling operators, standardized output objects, reusable readout modules, and workflows for dense execution, caching, and pre-coarsening.
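To make the idea of a shared interface with standardized output objects concrete, here is a minimal, self-contained sketch. The class and field names (`PoolingOutput`, `Pooler`, `loss_terms`) are illustrative assumptions for this poster, not tgp's actual API; tensors are stood in for by plain Python lists.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical standardized output object: every pooling operator returns the
# same fields, so downstream code never branches on the method used.
@dataclass
class PoolingOutput:
    x: list                      # pooled node features (tensor placeholder)
    edge_index: list             # coarsened connectivity
    batch: Optional[list] = None # graph-membership vector for batched input
    loss_terms: dict = field(default_factory=dict)  # auxiliary pooling losses

class Pooler:
    """Shared interface: every operator maps a graph to a PoolingOutput."""
    def __call__(self, x, edge_index, batch=None) -> PoolingOutput:
        raise NotImplementedError

class IdentityPooler(Pooler):
    """Trivial operator that keeps the graph unchanged, used here only to
    exercise the common interface."""
    def __call__(self, x, edge_index, batch=None) -> PoolingOutput:
        return PoolingOutput(x=x, edge_index=edge_index, batch=batch)

out = IdentityPooler()([1.0, 2.0], [(0, 1)])
print(type(out).__name__, out.loss_terms)
```

Because every operator emits the same object, evaluation code can, for example, sum whatever entries appear in `loss_terms` without knowing which method produced them.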
This poster will focus on tgp as research software infrastructure for open benchmarking. By reducing the choice of pooling method to a configuration-level switch, tgp lets researchers compare alternatives under the same model, training loop, logging, and evaluation pipeline. The library is distributed through GitHub and PyPI, documented with tutorials and examples, and designed so new operators can be added without rewriting the surrounding ecosystem. The poster will discuss how open code, common APIs, and reusable benchmarks help make graph neural network research easier to reproduce, extend, and review.
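One common way to turn a method choice into a configuration-level switch is a registry that maps config strings to operator classes. The sketch below shows that pattern in stdlib Python; the registry keys ("topk", "half") and class names are hypothetical examples, not tgp's registry or operator names.

```python
# Hypothetical operator registry: swapping the pooling method then means
# editing one string in a config dict, not rewriting model code.
POOLERS = {}

def register(name):
    """Class decorator that records an operator under a config key."""
    def deco(cls):
        POOLERS[name] = cls
        return cls
    return deco

@register("topk")
class TopKLikePooler:
    """Toy stand-in that keeps a fixed fraction of nodes."""
    def __init__(self, ratio=0.5):
        self.ratio = ratio
    def num_kept(self, num_nodes):
        return max(1, int(num_nodes * self.ratio))

@register("half")
class HalfPooler:
    """Toy stand-in that always halves the graph."""
    def num_kept(self, num_nodes):
        return num_nodes // 2

# The method is selected by a config entry; remaining keys become kwargs.
config = {"pooler": "topk", "ratio": 0.25}
kwargs = {k: v for k, v in config.items() if k != "pooler"}
pooler = POOLERS[config["pooler"]](**kwargs)
print(pooler.num_kept(100))  # 25
```

With this structure, a new operator is added by defining one decorated class; the model, training loop, and evaluation pipeline stay untouched, which is the property the poster argues matters for reproducible benchmarking.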