2021-08-11 13:02:35 +0200 | received badge | ● Popular Question (source) |

2021-07-26 01:56:33 +0200 | received badge | ● Notable Question (source) |

2020-12-02 18:30:51 +0200 | received badge | ● Popular Question (source) |

2020-06-04 03:23:32 +0200 | received badge | ● Popular Question (source) |

2019-06-06 20:36:08 +0200 | commented question | Explicitly clean all memory usage I think my Sagemath 8.4 is still in Python 2.7 |

2019-06-05 17:45:28 +0200 | asked a question | Explicitly clean all memory usage class State(): def __init__(self): self.value = [] At this point, a huge amount of memory is allocated. After processing state0.value, I set it back to empty to continue with another process. However, the memory is not fully released. Therefore, I cannot continue with another process due to the memory limit on my computer, and I have to close Sagemath 8.4 to get the memory back. It would be better to iterate instead of using memory like this; however, I hope that an explicit memory clean-up exists in Sagemath. Besides it, I use |
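A minimal sketch of the usual clean-up pattern for this question, assuming a `State` class like the one quoted above (the large payload here is a stand-in, since the real data is not shown). Rebinding the attribute drops the references, and `gc.collect()` forces a collection pass; note that CPython may keep freed memory in its own allocator, so the OS-level footprint does not always shrink immediately.

```python
import gc

class State:
    def __init__(self):
        self.value = []

state0 = State()
# Stand-in for the real large data:
state0.value = [list(range(100)) for _ in range(1000)]

# ... process state0.value ...

# Drop all references to the old list, then force a garbage collection.
state0.value = []
gc.collect()
```

The same calls work inside a Sage session, since Sage runs on the CPython runtime.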

2019-03-06 13:34:16 +0200 | commented question | Save/load huge dictionary Thank you @vdelecroix. Side information: my dictionary has keys as tuples (int, int) and values as lists [int]*2110. |

2019-03-04 18:42:32 +0200 | commented question | Save/load huge dictionary @vdelecroix Thank you for your response. Could you give me an example? Or something that best fits Sagemath, based on your experience? I see this one supported in Sagemath, but I am not really sure about its quality: http://doc.sagemath.org/html/en/refer... |

2019-03-04 17:08:10 +0200 | asked a question | Save/load huge dictionary I have a huge dictionary of about 50 GB. After generating this dictionary, I do not have any memory left. I still run Sagemath standard What should I do? Storing it in multiple files is also fine with me. I really need to be able to load it again in the future. Maybe another approach would be helpful. Besides the above dictionary, I have another huge redundant dictionary |
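One standard-library approach for a dictionary that does not fit in RAM is a disk-backed mapping such as `shelve`: entries are written and read one at a time, so the whole 50 GB never has to be in memory at once. A minimal sketch, with a hypothetical `generate_items()` standing in for the real data (keys as `(int, int)` tuples and values as lists of ints, per the comment above; `shelve` keys must be strings, so tuples are encoded with `repr`):

```python
import os
import shelve
import tempfile

# Hypothetical stand-in for the real 50 GB of data.
def generate_items():
    for i in range(3):
        for j in range(3):
            yield (i, j), [i * j] * 5

# Write entries to disk one at a time instead of building a dict in RAM.
path = os.path.join(tempfile.mkdtemp(), "huge_dict.db")
with shelve.open(path) as db:
    for key, value in generate_items():
        db[repr(key)] = value

# Later (even in a fresh Sagemath session) entries load back one by one:
with shelve.open(path) as db:
    v = db[repr((1, 2))]
print(v)  # → [2, 2, 2, 2, 2]
```

Sage's own `save`/`load` pickles the whole object at once, which is exactly what runs out of memory here; a per-entry store avoids that.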

2019-03-03 01:25:24 +0200 | asked a question | Divide Combinations(n,k) into multiple parts (n > 1110) I asked a similar question here: https://ask.sagemath.org/question/453.... I closed it, because the answer matched the question there. However, now I have an extra question related to it: Because C_cardinality is too big, I would like to divide C into multiple parts, which are then used on multiple computers for separate processing. To be sure that these multiple parts are completely distinct but together cover the whole of C, I use C's list: However, my computer's RAM (memory 128 GB) is not enough to store the Unfortunately, this CSV write is too slow, i.e. ~6000 writes/min, which means I would have to wait about 3800 hours for the whole You might wonder why I need to divide Specifically, I got all Combinations of 3 numbers within I hope the problem is described clearly and someone can help. Thanks a lot!!! |
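Since the combinations are generated in lexicographic order, the set can be split by *rank* rather than by materializing a list: each machine gets a disjoint rank interval, and an unranking function jumps straight to its starting combination. A sketch using the combinatorial number system (the function name and the small `n, k, M` are illustrative; Sage's `Combinations(...).unrank(r)` offers the same operation):

```python
from itertools import combinations
from math import comb

def unrank_combination(rank, n, k):
    """Return the k-combination of range(n) at lexicographic position
    `rank`, without enumerating any earlier combinations."""
    combo, x = [], 0
    for i in range(k, 0, -1):
        # Skip whole blocks of combinations whose next element is x.
        while comb(n - x - 1, i - 1) <= rank:
            rank -= comb(n - x - 1, i - 1)
            x += 1
        combo.append(x)
        x += 1
    return tuple(combo)

# Split ranks 0 .. comb(n, k) - 1 into M disjoint, covering intervals;
# machine m starts iterating from unrank_combination(starts[m], n, k).
n, k, M = 6, 3, 4
total = comb(n, k)
starts = [m * total // M for m in range(M)] + [total]

# Sanity check against itertools' lexicographic order:
assert [unrank_combination(r, n, k) for r in range(total)] == \
       list(combinations(range(n), k))
```

No CSV file is needed: the rank intervals are disjoint and cover all of C by construction, which is exactly the property the question asks to verify.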

2019-03-01 12:38:22 +0200 | commented question | problem loading large file Is there any update on this matter? I am having the same problem here (the generator was suggested by @slelievre for further processing) After 7 days, I got the 1.84 GB file (renamed and uploaded by me, no virus, only a Sagemath-format file): https://ufile.io/kakhi After downloading the file, we load it: However, it is too large for my RAM. Is there any solution? We might have to split it before "load". Besides that, a better solution for saving the data would be appreciated. |

2019-02-25 03:15:10 +0200 | asked a question | Count number of ones in a set of matrices I have the following code: My goal is to count how many 1s appear in each column across the set of matrices. For example, with the first 4 matrices in the set: I expect to receive [2, 1] as a result, i.e. two 1s appear in the 1st column and one 1 appears in the 2nd column. However, I got [0, 1], because the matrices are over a binary base. Thank you for reading my problem and for your support :) |
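The `[0, 1]` result suggests the matrices live over GF(2), where addition is mod 2, so summing them reduces the counts. The fix is to lift the entries to the integers before summing; in Sage that would be `M.change_ring(ZZ)` or `int(entry)`. A plain-Python sketch with hypothetical 1x2 matrices chosen to reproduce the expected `[2, 1]` (the original code is not shown):

```python
# Matrices as lists of rows with 0/1 entries. Summing over the
# integers (not GF(2)) counts the ones instead of reducing mod 2.
matrices = [
    [[1, 0]],
    [[0, 1]],
    [[1, 0]],
    [[0, 0]],
]

ncols = len(matrices[0][0])
counts = [0] * ncols
for M in matrices:
    for row in M:
        for j, entry in enumerate(row):
            counts[j] += int(entry)  # int() lifts a GF(2) element in Sage

print(counts)  # → [2, 1]
```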

2019-02-13 08:44:34 +0200 | commented question | Mapping 2-dimensional matrix index to a list More examples: (E1) The 1st and 3rd rows could be chosen as parts. (E2) 3 parts: the 1st column, the top-right cell, and the bottom-right cell. The rest is not chosen. Note: a) The parts need not cover the whole matrix, as in Example (E2). b) Parts' shapes are only squares or rectangles, e.g. no L-shape, /-shape, \-shape, #-shape, or +-shape and so on. |

2019-02-13 01:54:13 +0200 | asked a question | Mapping 2-dimensional matrix index to a list Hi all, I have the following matrix: I take 3 parts from the matrix: To check that these 3 parts do not overlap each other, my idea is to map the original matrix to a 1-dimensional array of IDs, one per cell: Then I go over all the coordinates that I took for A, B and C and collect the mapped IDs. Regarding A, Regarding B: Regarding C, similarly we have: If the size of the union of these 3 ID sets equals the total size of A, B, and C, then I conclude there is no overlap among them. However, this way requires lots of work and |
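The mapping described above is just `id = row * ncols + col`, and the union-size test becomes a few lines with Python sets. A sketch with hypothetical parts of a 3x3 matrix (the original A, B, C are not shown; `cell_ids` is an illustrative helper):

```python
def cell_ids(cells, ncols):
    """Map (row, col) coordinates to unique 1-D cell IDs: r * ncols + c."""
    return {r * ncols + c for (r, c) in cells}

# Hypothetical parts of a 3x3 matrix:
ncols = 3
A = [(0, 0), (0, 1)]
B = [(1, 0), (1, 1)]
C = [(2, 2)]

ids = [cell_ids(part, ncols) for part in (A, B, C)]
total = sum(len(part) for part in (A, B, C))
no_overlap = len(set().union(*ids)) == total
print(no_overlap)  # → True
```

Since coordinate tuples are hashable, the `(r, c)` pairs could also be put into sets directly, skipping the ID mapping entirely; the disjointness test is the same.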

2019-02-12 00:14:29 +0200 | received badge | ● Scholar (source) |

2019-02-09 09:48:34 +0200 | received badge | ● Supporter (source) |

2019-02-08 23:51:52 +0200 | commented question | Sagemath heap size limit I edited my post adding v: @dan_fulea, I would like to share with you further process. Example: 1) 3) Consider (1,2,3), we know (1,2) can go with 4, or 5 besides 3. If all (1 ... (more) |

2019-02-08 23:09:19 +0200 | received badge | ● Editor (source) |

2019-02-07 21:26:55 +0200 | received badge | ● Student (source) |

2019-02-07 16:02:42 +0200 | asked a question | Sagemath heap size limit Hi all, I am not new to Python, but new to Sagemath. My code: The variable "g3" grows to about 20 GB in total. Sagemath raises a "MemoryError" at some point; therefore, I divide "g3" into 1920 different parts to save. Now I need to process "g3" further, i.e. I need to assign the "g3" parts to variables again in order to use them. The 1st solution I can think of is to create 1920 different variables to hold the whole of "g3" in my code; however, this is rather inconvenient. Is there any better solution? For example, increasing the size limit of a Python list, which might let me store up to 11,444,858,880 lists in the list "g3" (about 11 billion, the value of binomial(4096, 3)). I have a computer with 128 GB of RAM, and it would be very nice if I could utilize its full strength. There is an old ticket on this: trac.sagemath.org/ticket/6772. However, I do not really get the idea there. I hope someone can help :-) Thanks a lot!!! |
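Python lists have no configurable size limit; the 20 GB comes from materializing all the combinations at once. The usual fix is to stream them lazily and work in fixed-size chunks, so neither 1920 variables nor the full list is ever needed. A sketch under that assumption (the `chunks` helper is illustrative, not part of "g3"'s original code):

```python
from itertools import combinations, islice
from math import comb

n, k = 4096, 3
total = comb(n, k)  # 11,444,858,880 combinations; far too many to store

# Stream the combinations lazily and process them in fixed-size blocks
# instead of building the giant list "g3".
def chunks(iterable, size):
    it = iter(iterable)
    while True:
        block = list(islice(it, size))
        if not block:
            return
        yield block

first_block = next(chunks(combinations(range(n), k), 5))
print(first_block[0])  # → (0, 1, 2)
```

Each block can be processed (or saved) and discarded before the next is produced, so peak memory stays at the chunk size regardless of `total`.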

Copyright Sage, 2010. Some rights reserved under a Creative Commons license. Content on this site is licensed under a Creative Commons Attribution-ShareAlike 3.0 license.