Thursday, March 29, 2018

[ Git FAQ ] Git lfs - "this exceeds GitHub's file size limit of 100.00 MB"

Source From Here 
Question 
I have some CSV files that are larger than GitHub's file size limit of 100.00 MB, so I have been trying to use the Git Large File Storage (LFS) extension. From the LFS site: "Large file versioning- Version large files—even those as large as a couple GB in size—with Git." I have applied the following in the folders of concern: 
# git lfs track "*.csv"

However, when I push: 
remote: error: File Time-Delay-ftn/Raw-count-data-minor-roads1.csv is 445.93 MB; this exceeds GitHub's file size limit of 100.00 MB 
remote: error: File Time-Delay-ftn/Raw-count-data-major-roads.csv is 295.42 MB; this exceeds GitHub's file size limit of 100.00 MB

When I look at the folder in question: 
  -rw-r-----   1 user  staff    42B 23 Oct 12:34 .gitattributes
  -rw-r--r--   1 user  staff   1.3K 19 Oct 14:32 DfT_raw_major_manipulation.py
  -rw-r--r--   1 user  staff   1.2K 16 Oct 15:08 DfT_raw_minor_manipulation.py
  drwxr-xr-x  21 user  staff   714B 22 Oct 11:35 Driving/
  -rwxr-xr-x@  1 user  staff   295M 19 Oct 14:47 Raw-count-data-major-roads1.csv*
  -rwxr-xr-x@  1 user  staff   446M 16 Oct 14:52 Raw-count-data-minor-roads1.csv*
When I open the .gitattributes file in vim, I can see the LFS setup: 
  *.csv filter=lfs diff=lfs merge=lfs -text
What am I doing wrong? When I query 
# git lfs ls-files

I get nothing returned. This indicates that, despite the *.csv filter being successfully applied in the .gitattributes file, the CSV files are not being picked up by LFS. 
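A quick way to double-check whether the attribute actually matches a path is git check-attr (a minimal sanity check, assuming the commands are run from the repository root; the file name is taken from the listing above): 
$ git check-attr filter Raw-count-data-major-roads1.csv
$ git lfs ls-files

If the first command prints "filter: lfs" but the second still returns nothing, the file was committed before LFS tracking was set up, which is exactly the situation addressed below.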

How-To 
Simply adding a git-lfs configuration to an existing repository will not retroactively convert your large files to LFS. Those large files remain in your history, and GitHub will refuse your pushes. You need to rewrite your history to introduce git-lfs into your existing commits. I recommend the BFG repo cleaner tool, which added LFS support recently. 
You should be able to convert historical usage of your CSV files by: 
$ java -jar ~/bfg-1.12.5.jar --convert-to-git-lfs '*.csv' --no-blob-protection
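After BFG finishes, the rewritten history still has to be garbage-collected locally and force-pushed; a minimal follow-up sketch (the branch name master is only an example): 
$ git reflog expire --expire=now --all && git gc --prune=now --aggressive
$ git push --force origin master

Newer versions of git-lfs (2.2+) also provide a built-in alternative, git lfs migrate import --include="*.csv" --everything, which rewrites existing commits to use LFS without an external tool.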


Supplement 
* Git-lfs installation 
# curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.rpm.sh | sudo bash 
# yum install git-lfs 
# git lfs install
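Once git-lfs is installed, a minimal workflow for tracking new large files looks like the sketch below (the file name is only an example; note that .gitattributes itself must be committed): 
$ git lfs track "*.csv"
$ git add .gitattributes
$ git add Raw-count-data-major-roads1.csv
$ git commit -m "Track CSV files with Git LFS"
$ git push origin master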

* Storing Large Binary Files in Git Repositories 
Storing large binary files in Git repositories seems to be a bottleneck for many Git users. Because of the decentralized nature of Git, every developer has the full change history on his or her computer, so every committed change to a large binary file grows the repository by roughly the size of that file. That growth directly affects the amount of data end users need to retrieve when they clone the repository. For example, storing a snapshot of a virtual machine image, changing its state, and committing the new state grows the repository by approximately the size of each snapshot. If this is a day-to-day operation in your team, you may already be feeling the pain of an overly swollen Git repository.
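A quick way to see how much the object database is actually weighing a clone down is git count-objects (a simple check, run from inside the repository): 
$ git count-objects -vH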

* Git keeps prompting me for a password 
Open .git/config and find the [remote "origin"] section. Make sure the url uses the SSH form rather than HTTPS: 
  ssh://git@github.com/username/repo.git
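If the remote currently points at an HTTPS URL, it can be switched to SSH without editing the config file by hand (username/repo below is a placeholder): 
$ git remote set-url origin ssh://git@github.com/username/repo.git
$ git remote -v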


