I was pushing a commit last week. Nothing crazy, just some screenshots I'd added to a project repo. Maybe 1.4 MB of images total. And I got this:

error: RPC failed; HTTP 400 curl 22 The requested URL returned error: 400
send-pack: unexpected disconnect while reading sideband packet
fatal: the remote end hung up unexpectedly

Okay, fine, push failed. Annoying but things happen. So I ran git push again and got:

Everything up-to-date

Wait, what? It literally just told me the remote hung up. Now it's saying everything is fine?

I stared at this for longer than I'd like to admit before I figured out what was going on.

What's actually happening

Git doesn't push your files individually. When you run git push, it packs everything into a single compressed packfile and sends it as one HTTP POST. Think of it like zipping a folder and uploading the zip. The buffer size matters because of how Git frames that upload: if the payload fits in the buffer, Git sends it as a normal request with a Content-Length header; if it doesn't, Git streams it with chunked transfer encoding instead, and some servers and proxies reject chunked requests, which is where the HTTP 400 comes from.

The default buffer is 1 MiB. My images were about 1.4 MB. You'd think that's close enough, but Git adds pack headers and its own metadata on top of the raw file sizes, so the actual HTTP payload was bigger than what shows up in git status.
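A quick way to see where you stand. An empty result from the first command means you're still on the 1 MiB default; the second gives a rough sense of how much object data Git has to send (on-disk sizes, not the exact wire payload):

```shell
# Empty output here means http.postBuffer is unset, i.e. the 1 MiB default
# (the command also exits non-zero when the key is unset):
git config --get http.postBuffer

# Rough size of the loose objects and packs in this repo:
git count-objects -vH
```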

The "Everything up-to-date" message is the really confusing part. Git doesn't guess from local state here: at the start of every push it asks the server which refs it already has. So when the retry says everything is up to date, the remote genuinely matches your branch, and the most likely explanation is that the first push actually landed: the objects and ref updates made it to the server, and the connection died before the final acknowledgment came back. The error and the success message are both true; they just describe different halves of the same transfer.
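If you hit this, don't trust either message — ask the server directly. Something like the following (assuming your branch is main and your remote is origin) shows whether any local commits are still missing from the remote:

```shell
# Refresh your picture of the remote's refs:
git fetch origin

# List local commits that origin/main doesn't have yet.
# No output means the remote really is up to date.
git log --oneline origin/main..HEAD
```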

The one-line fix

git config --global http.postBuffer 524288000

That bumps the HTTP post buffer to 500 MB. After running that, the push went through instantly on the next try.

I set it globally because I don't want to think about this ever again. If you'd rather scope it to a single repo, drop the --global flag and run it from inside the repo directory.
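For reference, the per-repo version looks like this (run from inside the repository; --show-origin is a handy way to confirm which config file a value came from):

```shell
# Set the buffer for just this repository (no --global):
git config http.postBuffer 524288000

# Confirm the value and which config file it lives in:
git config --show-origin --get http.postBuffer
```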

Why the default is so low

I honestly don't know why Git ships with a 1 MiB buffer in 2026. It made sense maybe 15 years ago, when repos were mostly source code and bandwidth was scarce. These days a single screenshot can be 1 MB; a couple of PNGs and you're already over the limit.

My guess is it's a conservative default that nobody on the Git team has had reason to revisit, and SSH users (who don't hit this issue at all) probably outnumber HTTPS users in the contributor base. To be fair, exceeding the buffer isn't supposed to be fatal: Git just falls back to chunked encoding, and a well-behaved server handles that fine. The 400 is really the fault of the server, or a proxy in front of it.

Situations where this will bite you

The obvious one is images. I was adding project screenshots and that was enough. But I've also seen this happen with:

Build artifacts that accidentally got committed. If your .gitignore is missing an entry for .next/ or dist/ and someone commits a production build, that push is going to fail.

First commits on a new repo where you're pushing everything at once. Even if individual files are small, the combined pack can be big.

Large dependency-related files. I once had a teammate commit a package-lock.json that was somehow 4 MB (monorepo, lots of packages). Same error.

The common thread is HTTPS remotes. SSH doesn't have this buffer limitation, so if you're using SSH URLs for your remotes you'll probably never see this error.
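When one of these bites you and you're not sure which file is to blame, you can list the biggest blobs anywhere in the repository's history. This pipeline is a sketch built from Git plumbing commands (sizes in bytes, largest first):

```shell
# List every object reachable from any ref, keep only blobs,
# and print the ten largest with their paths:
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | awk '$1 == "blob" {print $3, $4}' \
  | sort -rn \
  | head -10
```

Note this walks all of history, so a file that was committed and later deleted still shows up — which is exactly what you want when hunting down a pack that's mysteriously large.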

Things I'd actually recommend

Increasing the buffer fixes the immediate problem, but there are a few things worth doing if you keep running into this:

Switch to SSH. It's better for pushing in general. No buffer issues, no credential prompts if you have your key set up. I should have switched years ago.

git remote set-url origin git@github.com:your-username/your-repo.git

Use Git LFS for media. If your repo genuinely needs to store images or videos (not just a few screenshots but actual assets), LFS is designed for exactly this. It stores the large files separately and keeps your Git history clean.

git lfs track "*.png"
git lfs track "*.mp4"
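One step the track commands don't do for you: git lfs track records those patterns in .gitattributes, and that file has to be committed like anything else, or the next person to clone the repo won't get the same LFS behavior:

```shell
# The tracking rules live in .gitattributes; commit them:
git add .gitattributes
git commit -m "Track images and video with Git LFS"
```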

Check your .gitignore. Seriously. Run git status before you commit and actually look at what's being staged. I've caught build folders, .env files, and entire node_modules directories that were about to get pushed because someone forgot to update .gitignore when they added a new tool.
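Two commands that make that review quick: git status --short for a compact view of what's about to go in, and git check-ignore -v to ask Git which rule (if any) covers a specific path — dist/main.js below is just a placeholder path:

```shell
# Compact view of staged and unstaged changes:
git status --short

# Show which .gitignore rule matches a path
# (exits non-zero if the path isn't ignored at all):
git check-ignore -v dist/main.js
```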

Push smaller commits more often. I'm bad at this. I'll work on three things and then push one big commit at the end of the day. Breaking that up into smaller pushes reduces the pack size per push and makes this kind of thing less likely.

The thing that surprised me

My total push was about 1.4 MB of images. That's not a lot, and I expected Git to handle it without issue. But screenshots are already compressed, so Git's own compression pass barely shrinks them, and pack headers, ref metadata, and HTTP framing all add overhead on top. The actual wire payload ends up noticeably bigger than the raw file sizes suggest.

So if you're close to the limit, you're probably over it. Just bump the buffer and move on. Life's too short to debug HTTP 400s from Git.

