I'm using pip with virtualenv to package and install some Python libraries.
I'd imagine what I'm doing is a pretty common scenario. I'm the maintainer of several libraries for which I can specify dependencies explicitly, and a few of my libraries depend on third-party libraries that have transitive dependencies over which I have no control.
What I'm trying to achieve is for a pip install on one of my libraries to download/install all of its upstream dependencies. What I'm struggling with in the pip documentation is if/how requirements files can do this on their own, or if they're really just a supplement to using install_requires.
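For concreteness, here's the sort of thing I have in each library's setup.py (the package name and dependencies are made up):

```python
# setup.py -- a minimal sketch; "mylib" and its dependencies are hypothetical
from setuptools import setup, find_packages

setup(
    name="mylib",
    version="1.0.0",
    packages=find_packages(),
    # Abstract dependencies with version ranges; pip resolves these
    # (and their transitive dependencies) when someone runs
    # `pip install mylib`.
    install_requires=[
        "requests>=2.0,<3.0",
        "six>=1.9",
    ],
)
```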
Would I use install_requires in all of my libraries to specify dependencies and version ranges, and then only use a requirements file to resolve a conflict and/or freeze them for a production build?
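If I understand the freeze workflow, that would mean keeping the ranges loose in install_requires and pinning exact versions in a requirements.txt for deployment, e.g. the output of pip freeze (these pins are illustrative):

```
# requirements.txt -- pinned snapshot for a production build,
# e.g. produced by `pip freeze > requirements.txt`
requests==2.5.1
six==1.9.0
```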
Let's pretend I live in an imaginary world (I know, I know) and my upstream dependencies are straightforward and guaranteed to never conflict or break backward compatibility. Would I be compelled to use a pip requirements file at all, or just let pip/setuptools/distribute install everything based on install_requires?
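In that ideal world, I'd expect the whole workflow to be just this (again, mylib is hypothetical):

```
# Installing the library pulls in its whole install_requires chain
$ pip install mylib

# Optionally snapshot the resolved environment for reproducibility...
$ pip freeze > requirements.txt

# ...and reproduce it elsewhere
$ pip install -r requirements.txt
```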
There are a lot of similar questions on here, but I couldn't find any that addressed something as basic as when to use one or the other, or how to use them both together harmoniously.