Tracking and managing history of a large directory of binary files

Paul Moore p.f.moore at gmail.com
Tue Feb 18 10:04:13 UTC 2014


First of all, it's entirely possible that Mercurial is completely the
wrong tool for this, but I don't know of any better one. If what I am
doing is totally misguided, can anyone suggest a better solution[1]?

I have a large directory (about 3GB) of binary files (mostly small,
but some up to 50M in size) that I need to manage across multiple PCs.
By "manage" I basically mean that I need to be able to make changes on
any of the PCs and replicate them backwards and forwards. I need to be
able to track history to a limited extent (what were the last couple
of changes made on this PC, and have they been pushed to that one,
merging in changes made on PC 1 and PC 2 into the master copy without
losing information, that type of thing).
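Concretely, the day-to-day cycle I have in mind is something like the
following (the share path and repository names are made up purely for
illustration; "master" would just be an ordinary repository on a file
server):

    hg clone //server/share/master work    # one-off setup on each PC
    cd work
    # ...add, change or delete binary files...
    hg addremove                           # record added/removed files
    hg commit -m "update data files"
    hg pull //server/share/master          # fetch changes from other PCs
    hg merge                               # only needed if both sides changed
    hg commit -m "merge changes from PC 2"
    hg push //server/share/master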

If this were source code, I'd use Mercurial as the obvious solution.
But for binaries, I'm concerned about manageability of the
repositories involved. They could get very large over time. All of the
relevant repos are "internal", so rewriting history is not completely
out of the question (although it's obviously a management issue I'd
have to consider). So trimming old changesets could be an answer, if
there's a reasonably efficient way of doing it. There may also be
other issues I haven't considered.
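For example, if trimming turned out to be viable, I gather the bundled
convert extension (which has to be enabled in hgrc) can produce a copy
of a repository that keeps only a given revision and its descendants.
I haven't tried this, and the revision number below is made up:

    hg convert --config convert.hg.startrev=1500 big-repo trimmed-repo

I realise the converted repo would have entirely new changeset IDs, so
every PC would need a fresh clone afterwards; that's part of the
management issue I mentioned above.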

Can anyone suggest a suitable workflow for this scenario? Or confirm
that Mercurial is not my best answer and suggest any alternatives?

Thanks,
Paul

[1] I am on Windows, so Unix-based solutions are unfortunately not
relevant here...


