How to reduce "hg convert" memory use for big repos?

Gregory Szorc gregory.szorc at gmail.com
Tue Nov 13 02:10:56 UTC 2018


On Sun, Nov 11, 2018 at 9:30 AM Juan Francisco Cantero Hurtado <iam at juanfra.info> wrote:

> Hi, I have a big Git repo [1] with one branch and a linear history. When I
> convert the repo to Mercurial, it uses several GB of RAM during the
> conversion. I was going to open a new ticket, but maybe there is an
> option to reduce the memory use. I can sacrifice disk space and
> conversion time if needed. Any ideas?
>

Battling memory leaks in `hg convert` has been a common theme over the
years. You can mitigate the leaks by converting in batches of fewer
commits. I believe it is also safe to gracefully kill `hg convert` at any
time without losing data, so you can run it for a few minutes, kill it,
and restart.
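
For example, here is a minimal batching sketch, assuming a POSIX shell, a
Git source branch named "master", and placeholder paths src-repo and
dst-repo. It relies on `hg convert` being resumable: the revision map it
stores in the destination repository lets each run continue where the
previous one stopped.

    # Enable the convert extension (ships with Mercurial).
    cat >> ~/.hgrc <<'EOF'
    [extensions]
    convert =
    EOF

    # Convert in batches of ~1000 commits: list commits oldest-first,
    # take every 1000th one as a checkpoint, and convert up to it.
    for rev in $(git -C src-repo rev-list --reverse master | awk 'NR % 1000 == 0'); do
        hg convert --rev "$rev" src-repo dst-repo
    done

    # Final pass to pick up the commits past the last checkpoint.
    hg convert src-repo dst-repo

Because each `hg convert` invocation is a fresh process, any memory leaked
while converting one batch is released before the next batch starts.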

When converting from Git, you may also want to look into hg-git. Its Git
<-> Mercurial conversion code has been optimized more heavily and is
generally faster than `hg convert`. It may still have some leaks, but
because of the way it "batches" inserts into Mercurial, they are probably
less severe.
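
As a sketch, assuming hg-git is installed (e.g. via `pip install hg-git`)
and the source repository is reachable over HTTPS (the URL below is a
placeholder):

    # Enable the hg-git extension.
    cat >> ~/.hgrc <<'EOF'
    [extensions]
    hggit =
    EOF

    # Clone directly from Git; hg-git converts the history as it pulls.
    hg clone git+https://example.com/big-repo.git big-repo-hg

    # Later runs of `hg pull` inside big-repo-hg convert only new commits.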