How to reduce "hg convert" memory use for big repos?

Juan Francisco Cantero Hurtado iam at juanfra.info
Wed Nov 14 15:21:53 UTC 2018


On 13/11/18 3:10, Gregory Szorc wrote:
> On Sun, Nov 11, 2018 at 9:30 AM Juan Francisco Cantero Hurtado <
> iam at juanfra.info> wrote:
> 
>> Hi, I have a big git repo [1] with one branch and a linear history. When
>> I convert the repo to Mercurial, it uses several GB of RAM during the
>> conversion. I was going to open a new ticket, but maybe there is an
>> option to reduce the memory use. I can sacrifice disk space and
>> conversion time if needed. Any ideas?
>>
> 
> Battling memory leaks in `hg convert` has been a common theme over the
> years. You can try converting batches of fewer commits to mitigate the
> leaks. I believe it is also safe to gracefully kill `hg convert` at any
> time without losing data. So you can run it for a few minutes, kill, and
> restart.
> 
> When converting from Git, you may also want to look into hg-git. Its Git
> <-> Mercurial conversion code has been optimized more and is generally
> faster than `hg convert`. It may still have some leaks though. But because
> of the way it "batches" inserts into Mercurial, the leaks are probably less
> severe.
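
The batch-and-restart approach suggested above could be sketched roughly as follows. This is an untested sketch with placeholder paths and batch size: `hg convert --rev` imports only up to the given source revision, and `hg convert` records its progress in the destination's revision map, so each invocation resumes where the previous one stopped and a periodic restart keeps leaked memory bounded.

```shell
# Rough sketch of batched conversion (paths and BATCH are placeholders).
SRC=/path/to/git-repo
DST=/path/to/hg-repo
BATCH=1000   # commits per `hg convert` invocation

# Walk the branch oldest-first and stop the conversion at every
# BATCH-th commit; each loop iteration is a fresh process, so memory
# leaked inside one `hg convert` run dies with that process.
git -C "$SRC" rev-list --reverse HEAD |
  awk -v n="$BATCH" 'NR % n == 0 { print }' |
  while read -r rev; do
    hg convert --rev "$rev" "$SRC" "$DST"
  done

# Final pass picks up the commits after the last batch boundary.
hg convert "$SRC" "$DST"
```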

I had similar problems when I converted the same repo with hg-git. It 
also used too much RAM.


-- 
Juan Francisco Cantero Hurtado http://juanfra.info


