Resolving the git error “pack exceeds maximum allowed size” during push

I have a large repository that takes up a modest number of gigabytes. When I attempted to push it to a new remote repository, the push failed, complaining that the pack size exceeds the maximum allowed.

First of all, let’s get out of the way the fact that repacking the local repository or fiddling with the pack.packSizeLimit configuration setting won’t fix the problem. That will simply tidy up your local machine.
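
For reference, this is the setting I mean (the 2g value below is just an example). As far as I can tell, it only governs how git repack splits pack files on your local disk, not the single pack git builds on the fly for a push:

    # Cap locally repacked pack files at 2 GB -- this affects git repack/gc,
    # not the pack that git streams to the remote during a push
    git config pack.packSizeLimit 2g
    git repack -a -d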

As I understand it (corrections welcome), the problem is a collision of several things. When performing this massive beginning-to-end push, git creates one enormous pack on the fly and pipes it across the network to the remote machine. The remote machine needs to be able to memory-map this huge wad of data, and the file system, CPU architecture, and available memory all have to cooperate for that to work. Otherwise, the pack size error is reported and the push fails. Annoyingly, this can happen after you’ve already transferred gigabytes of data across a bottlenecked network connection, completely wasting a lot of time.

Fortunately the work-around is simple. Push the repository in chunks, working your way up the tree.

If your repository has a lot of branching, you may be able to push one branch at a time, since each generated pack will cover only that branch.
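
For example, assuming the remote is called origin and the branch names below are placeholders for your own, that looks like:

    # Push each branch separately so that each push builds a smaller pack
    git push origin feature-a
    git push origin feature-b
    git push origin master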

This repo of mine has a very linear history, and feeling a little lazy I used my git GUI (SourceTree) to make a temporary branch about a third of the way up the tree, and pushed that. I moved the temporary branch another third of the way up the tree, and pushed that. Finally I could push master and remove the temporary branch.
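
If you’d rather stay on the command line, a rough sketch of the same dance with plain git might look like this, where the commit hashes are placeholders for commits roughly one third and two thirds of the way up the history:

    # Create a temporary branch part-way up the history and push it
    git branch temp-push <commit-one-third>
    git push origin temp-push

    # Move the temporary branch further up the history and push again
    # (this is a fast-forward on the remote, so no force is needed)
    git branch -f temp-push <commit-two-thirds>
    git push origin temp-push

    # Finally push master and clean up the temporary branch
    git push origin master
    git branch -d temp-push
    git push origin --delete temp-push

As an aside, git can also push an arbitrary commit straight to a remote branch, e.g. git push origin <commit>:refs/heads/master, which avoids the temporary branch entirely.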

If the repository were big and hairy enough, one could write a script to traverse the tree and programmatically push at appropriate commit points, but for me it’s an exceptional situation that doesn’t warrant that type of effort.
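
For the curious, here is a minimal sketch of that idea. It assumes a linear history on master, a remote called origin, and a chunk size of 500 commits; all of those are placeholders you would tune for your own repository:

    #!/bin/sh
    # Push the master history to origin in chunks of roughly CHUNK commits,
    # oldest first, so that each push builds a pack of manageable size.
    CHUNK=500

    # Walk the commits from oldest to newest and push every CHUNK-th one,
    # fast-forwarding the remote master ref each time
    for commit in $(git rev-list --reverse master | awk -v n=$CHUNK 'NR % n == 0'); do
        git push origin "$commit:refs/heads/master"
    done

    # Push whatever remains at the tip
    git push origin master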

