
Hi, I'm curious: would git-lfs (Large File Storage) work together with rewriting history to remove the larger files and artifacts? Something like:

    git filter-branch --index-filter 'git rm -r --cached --ignore-unmatch path/to/file' HEAD

https://stackoverflow.com/questions/43762338/how-to-remove-f...

Or is git-lfs generally avoided for good reason? Or is it all the changes, and not necessarily large files, that cause this repo to exceed the limit?



Then you're rewriting history, and that should be avoided at all costs, because git hashes are part of the cryptographic history of a repo.
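A minimal sketch of why a rewrite cascades: a commit hash is the SHA-1 of the commit object, and that object embeds the parent commit's hash, so changing any ancestor changes every descendant's hash. The tree hashes and messages below are made up for illustration, and real commit objects also carry author/committer lines that are omitted here:

```python
import hashlib

def commit_hash(tree: str, parent: str, message: str) -> str:
    """Hash a simplified git-style commit object: sha1('commit <len>\\0' + body)."""
    body = f"tree {tree}\nparent {parent}\n\n{message}\n".encode()
    header = f"commit {len(body)}\x00".encode()
    return hashlib.sha1(header + body).hexdigest()

# Original chain: A <- B
a = commit_hash("a1" * 20, "0" * 40, "add big file")
b = commit_hash("b2" * 20, a, "unrelated change")

# Rewrite A (say, drop the big file from its tree). B's hash changes too,
# even though B's own tree and message are untouched.
a2 = commit_hash("c3" * 20, "0" * 40, "add big file")
b2 = commit_hash("b2" * 20, a2, "unrelated change")

print(b != b2)  # → True: the rewrite cascades to every descendant
```

That cascade is exactly why a rewrite forces a force-push and breaks every clone and signature downstream.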

Who are you going to trust to do this? Someone will need to execute this and then do a force push. Are you going to compare it? Are you going through all the history of files removed to see if there is no change to the source code? And what about the files you're actually going to move to git-lfs? How can you prove that they haven't changed in the migration?
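On the "how can you prove the contents haven't changed" question, one concrete check: git identifies file contents by blob hash (SHA-1 of `blob <len>\0` plus the bytes), so you can compare that hash per file before and after a rewrite. A sketch of computing it yourself:

```python
import hashlib

def git_blob_sha1(data: bytes) -> str:
    """Compute the hash git assigns to file contents (a blob object)."""
    return hashlib.sha1(b"blob %d\x00" % len(data) + data).hexdigest()

# The empty file has the same well-known blob hash in every git repo:
print(git_blob_sha1(b""))  # e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```

Matching blob hashes across the rewrite shows the contents survived even though every commit hash changed. Caveat for the lfs case: after a migration the in-repo blob is only a small pointer file containing a SHA-256 OID, so there you'd compare against the actual lfs object contents instead.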

Provenance is a thing. https://slsa.dev/provenance/v0.2


I'll admit I've done this early on in the history of a pet project repo, scrubbing a credential :/ (not to implement git-lfs, though).

I knew that rewrites compromise the history. It was low stakes and I didn't want to allocate a new repo to start anew; I just learned my lesson there.

I'm mostly curious whether git-lfs is viable for large files, or whether it should be avoided and large build artifacts hosted elsewhere instead.
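For what it's worth, adopting git-lfs going forward (no history rewrite) is just a tracking rule in .gitattributes, as written by `git lfs track "*.bin"` (the `*.bin` pattern here is only an example):

```
*.bin filter=lfs diff=lfs merge=lfs -text
```

That only affects files added after the rule exists; moving files already in history into lfs is what requires the rewrite discussed above.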





