Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share

performance - How to speed up / parallelize downloads of git submodules using git clone --recursive?

Cloning git repositories that have a lot of submodules takes a really long time. The following example has ~100 submodules:

git clone --recursive https://github.com/Whonix/Whonix

Git clones them one by one, which takes much longer than necessary. Let's make the (probable) assumption that both the client and the server have sufficient resources to answer multiple (parallel) requests at the same time.

How to speed up / parallelize downloads of git submodules using git clone --recursive?



1 Reply


With git 2.8 (Q1 2016), you will be able to initiate the fetch of submodules... in parallel!

See commit fbf7164 (16 Dec 2015) by Jonathan Nieder (artagnon).
See commit 62104ba, commit fe85ee6, commit c553c72, commit bfb6b53, commit b4e04fb, commit 1079c4b (16 Dec 2015) by Stefan Beller (stefanbeller).
(Merged by Junio C Hamano -- gitster -- in commit 187c0d3, 12 Jan 2016)

Add a framework to spawn a group of processes in parallel, and use it to run "git fetch --recurse-submodules" in parallel.

For that, git fetch has the new option:

-j, --jobs=<n>

Number of parallel children to be used for fetching submodules.
Each will fetch from different submodules, such that fetching many submodules will be faster.
By default submodules will be fetched one at a time.

Example:

git fetch --recurse-submodules -j2
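The same parallelism applies at clone time as well. Here is a self-contained local sketch, assuming a reasonably recent Git (the `--jobs` option on `git clone` and the `protocol.file.allow` override both postdate 2.8); the repository names `libA`, `libB`, and `super` are made up for the demo:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# Build two small "remote" repositories to serve as submodules.
for name in libA libB; do
  git init -q --bare "$name.git"
  git clone -q "$name.git" "$name-work"
  ( cd "$name-work" \
    && echo "$name" > README \
    && git add README \
    && git -c user.email=demo@example.com -c user.name=demo commit -qm init \
    && git push -q origin HEAD )
done

# Superproject referencing both submodules.
git init -q super
( cd super \
  && git -c protocol.file.allow=always submodule --quiet add "$tmp/libA.git" libA \
  && git -c protocol.file.allow=always submodule --quiet add "$tmp/libB.git" libB \
  && git -c user.email=demo@example.com -c user.name=demo commit -qm "add submodules" )

# Recursive clone with submodules fetched two at a time.
git -c protocol.file.allow=always clone -q --recursive --jobs 2 super clone-test
```

With ~100 submodules, as in the question, raising the job count is where the time savings come from; with only two submodules the effect is of course negligible.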

The bulk of this new feature is in commit c553c72 (16 Dec 2015) by Stefan Beller (stefanbeller).

run-command: add an asynchronous parallel child processor

This allows running external commands in parallel with ordered output on stderr.

If we run external commands in parallel, we cannot pipe their output directly to our stdout/stderr, as it would get interleaved. So each process's output flows through a pipe, which we buffer. One subprocess can be piped directly to our stdout/stderr for low-latency feedback to the user.
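The buffering idea can be sketched outside git: run jobs concurrently, send each job's output to its own buffer, then replay the buffers one at a time so lines never interleave. A plain-shell illustration (not git's actual C run-command code):

```shell
# Run three fake "fetch" jobs in parallel; redirect each job's output
# to its own buffer file so lines from different jobs cannot mix.
buf=$(mktemp -d)
for job in alpha beta gamma; do
  {
    printf 'start %s\n' "$job"
    sleep 0.1            # simulate network latency
    printf 'done %s\n' "$job"
  } > "$buf/$job.out" &
done
wait

# Replay each buffer whole, in a fixed order: ordered, unmixed output.
for job in alpha beta gamma; do
  cat "$buf/$job.out"
done
```

Without the per-job buffer files, the three background jobs would write to the terminal at the same time and their lines could arrive interleaved.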


Note that, before Git 2.24 (Q4 2019), "git fetch --jobs=<n>" allowed <n> parallel jobs when fetching submodules, but this did not apply to "git fetch --multiple", which fetches from multiple remote repositories.
It now does.

See commit d54dea7 (05 Oct 2019) by Johannes Schindelin (dscho).
(Merged by Junio C Hamano -- gitster -- in commit d96e31e, 15 Oct 2019)

fetch: let --jobs=<n> parallelize --multiple, too

Signed-off-by: Johannes Schindelin

So far, --jobs=<n> only parallelizes submodule fetches/clones, not --multiple fetches, which is unintuitive, given that the option's name does not say anything about submodules in particular.

Let's change that.
With this patch, fetches from multiple remotes are parallelized, too.

For backwards-compatibility (and to prepare for a use case where submodule and multiple-remote fetches may need different parallelization limits):

  • the config setting submodule.fetchJobs still only controls the submodule part of git fetch,
  • while the newly-introduced setting fetch.parallel controls both (but can be overridden for submodules with submodule.fetchJobs).
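Under those rules, a setup like the following (the values 8 and 4 are arbitrary examples) gives multi-remote fetches 8 jobs while capping submodule fetches at 4:

```shell
git init -q fetch-demo && cd fetch-demo

# Default parallelism for both multi-remote and submodule fetches (Git 2.24+).
git config fetch.parallel 8
# Submodule-specific override; wins over fetch.parallel for submodules.
git config submodule.fetchJobs 4

git config fetch.parallel        # -> 8
git config submodule.fetchJobs   # -> 4
```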

