Graphs - Gomory Hu tree - Memory blow up and max recursion depth [closed]

asked 2024-10-08 16:39:46 +0100

Telijas

Hello,

I am using the graph theory part of Sage and have run into a problem with the Gomory-Hu tree function when using the algorithm "FF".
A graph with more than 1000 vertices triggers Python's maximum recursion depth (which I can increase), but reaching the default recursion limit already consumes 17GB of memory. Scaling this linearly to the example graph of ~30k vertices would require ~500GB of memory, which feels unreasonable, given that Gomory-Hu is "just" an iterated min-cut algorithm. Can someone help me? Am I using the function wrong?
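(For reference, raising the limit is a one-liner with the standard library, but in my tests it only postpones the failure at the cost of even more memory:)

```python
import sys

# The default limit is 1000 frames; raising it lets the recursion go
# deeper, but every extra frame also keeps its intermediate subgraphs
# alive, so memory grows with the depth.
sys.setrecursionlimit(100_000)
```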

from sage.all import *
from datetime import datetime
import os
import psutil

# Initialize graph and get starting memory/time
s3 = graphs.SierpinskiGasketGraph(Integer(10))
start_time = datetime.now()
process = psutil.Process(os.getpid())
mem = process.memory_info()[0] / float(2 ** 20)
print("Mem usage at start:", mem, "MiB")
try:
    # Compute
    print("Vertices found:", s3.order(), "and edges:", s3.size())
    # s3.edge_cut(Integer(1), Integer(1000), value_only=True)  # exchange this line for memory comparison
    s3.gomory_hu_tree(algorithm="FF")
except Exception as error:
    print("Error detected:", error)
finally:
    end_time = datetime.now()
    print("Runtime =", end_time - start_time)
    mem = process.memory_info()[0] / float(2 ** 20)
    print("Mem usage at end:", mem, "MiB")

prints:

Mem usage at start: 258.6015625 MiB
Vertices found: 29526 and edges: 59049
Error detected: maximum recursion depth exceeded while calling a Python object
Runtime = 0:05:13.276101
Mem usage at end: 17661.28125 MiB

The max recursion depth error looks like this:

Traceback (most recent call last):
  File "/home/telijas/Documents/python/projects/treewidth-playground/main.py", line 22, in <module>
    s3.gomory_hu_tree(algorithm="FF")
  File "/home/telijas/anaconda3/envs/pythonProject/lib/python3.11/site-packages/sage/graphs/graph.py", line 8592, in gomory_hu_tree
    g = self._gomory_hu_tree(frozenset(self.vertex_iterator()), algorithm=algorithm)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/telijas/anaconda3/envs/pythonProject/lib/python3.11/site-packages/sage/graphs/graph.py", line 8493, in _gomory_hu_tree
    gV_tree = gV._gomory_hu_tree(vertices & frozenset(gV), algorithm=algorithm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/telijas/anaconda3/envs/pythonProject/lib/python3.11/site-packages/sage/graphs/graph.py", line 8493, in _gomory_hu_tree
    gV_tree = gV._gomory_hu_tree(vertices & frozenset(gV), algorithm=algorithm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/telijas/anaconda3/envs/pythonProject/lib/python3.11/site-packages/sage/graphs/graph.py", line 8493, in _gomory_hu_tree
    gV_tree = gV._gomory_hu_tree(vertices & frozenset(gV), algorithm=algorithm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

OS: Ubuntu 22.04.5 LTS
Sage: 10.4
CPU: 3.6 GHz single core
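To make precise what I mean by "just an iterated min-cut algorithm": Gusfield's non-recursive variant of Gomory-Hu builds a flow-equivalent tree with n-1 min-cut calls and a plain loop, no recursion at all. Below is a rough sketch in pure Python with a toy Edmonds-Karp min cut for illustration; this is my own sketch, not Sage's implementation:

```python
from collections import defaultdict, deque

def min_cut(adj, cap, s, t):
    """Edmonds-Karp max flow; returns (cut value, vertices on s's side)."""
    flow = defaultdict(int)
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[u, v] - flow[u, v] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            # residual-reachable vertices form the s-side of a minimum cut
            return total, set(parent)
        # collect the path and augment by its bottleneck capacity
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u, v] - flow[u, v] for u, v in path)
        for u, v in path:
            flow[u, v] += aug
            flow[v, u] -= aug
        total += aug

def gusfield_tree(vertices, weighted_edges):
    """Flow-equivalent tree via Gusfield's algorithm: for every pair s, t
    the minimum s-t cut in the graph equals the smallest weight on the
    s-t path in the returned tree."""
    adj = defaultdict(set)
    cap = defaultdict(int)
    for u, v, c in weighted_edges:
        adj[u].add(v); adj[v].add(u)
        cap[u, v] += c; cap[v, u] += c
    root = vertices[0]
    parent = {v: root for v in vertices}
    tree_edges = []
    for i, s in enumerate(vertices[1:], start=1):
        t = parent[s]
        value, s_side = min_cut(adj, cap, s, t)
        # re-hang not-yet-processed vertices that fell on s's side of the cut
        for v in vertices[i + 1:]:
            if parent[v] == t and v in s_side:
                parent[v] = s
        tree_edges.append((s, t, value))
    return tree_edges
```

Only one min-cut computation is alive at a time, so the memory footprint stays close to that of a single max-flow call, which is why the observed blow-up in the recursive version surprised me.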


Closed for the following reason: the question is answered, right answer was accepted by Telijas
close date 2024-10-13 21:18:28

Comments

We can certainly try to make a non-recursive version of the method to avoid the max recursion depth issue. However, the memory consumption will remain high and I'm not sure we can reduce it.

David Coudert (2024-10-09 11:04:15 +0100)

I have pushed a non-recursive version in https://github.com/sagemath/sage/pull.... I'm currently trying to run it on graphs.SierpinskiGasketGraph(10) to check the memory consumption. It will take some time.

David Coudert (2024-10-09 15:45:16 +0100)

I tried on an i9-10900K @ 3.70GHz, 64GB RAM, Fedora 39, SageMath 10.5.beta6. I get the Gomory-Hu tree for graphs.SierpinskiGasketGraph(10) in 5 hours using less than 400MB RAM. This is much better than the previous version.

David Coudert (2024-10-10 19:15:35 +0100)

Thanks a lot, this is awesome news! You don't see a memory improvement by a factor of 1000 every day :)

Telijas (2024-10-11 00:21:24 +0100)

David has also fixed the issue with algorithm='LP' - you can give it a try as well.

Max Alekseyev (2024-10-12 15:25:08 +0100)