torch.autograd.executeBackward
Execute the backward pass using proper topological ordering.
This is the core autograd engine: it guarantees that all gradient contributions flowing into a node are accumulated before that node's gradient is propagated further. It follows the same algorithm as PyTorch's autograd engine:
- Build the graph and compute dependency counts
- Use a ready queue to process nodes in topological order
- Accumulate incoming gradients in per-node buffers until all of a node's dependencies are satisfied
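The steps above can be sketched with a toy scalar engine. This is an illustrative sketch, not the library's actual implementation: every name here (`Node`, `execute_backward`, `next_edges`, `backward`) is hypothetical, and real tensors are replaced by plain floats.

```python
from collections import deque

class Node:
    """Hypothetical graph node: `backward` maps the incoming gradient to one
    gradient per input edge; `next_edges` lists the producer nodes."""
    def __init__(self, name, backward, next_edges=()):
        self.name = name
        self.backward = backward
        self.next_edges = list(next_edges)

def execute_backward(root, grad):
    # Pass 1: count how many consumers feed gradient into each node.
    deps = {}
    stack, seen = [root], {root}
    while stack:
        node = stack.pop()
        for nxt in node.next_edges:
            deps[nxt] = deps.get(nxt, 0) + 1
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    # Pass 2: process nodes in topological order via a ready queue.
    buffers = {root: grad}     # input buffers accumulating gradients
    ready = deque([root])
    leaf_grads = {}
    while ready:
        node = ready.popleft()
        out = node.backward(buffers.pop(node))
        if not node.next_edges:            # leaf: record its final gradient
            leaf_grads[node.name] = out
            continue
        for nxt, g in zip(node.next_edges, out):
            buffers[nxt] = buffers.get(nxt, 0.0) + g
            deps[nxt] -= 1
            if deps[nxt] == 0:             # all contributions received
                ready.append(nxt)
    return leaf_grads

# Diamond graph for y = x + x: the add node feeds gradient into x twice,
# so x must wait until both contributions are buffered (dy/dx = 2).
x = Node("x", backward=lambda g: g)                     # leaf accumulator
add = Node("add", backward=lambda g: (g, g), next_edges=(x, x))
grads = execute_backward(add, 1.0)                      # {"x": 2.0}
```

The dependency counts are what make the diamond case correct: without them, `x` would be processed after the first contribution arrived and its gradient would be propagated twice with partial values.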