
Limiting memory usage in R under Linux

We are running R in a Linux cluster environment. The head node has hung a few times when a user inadvertently consumed all of its memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not resort to global ulimits, but that may be the only way forward.



1 Reply


There's unix::rlimit_as(), which allows setting memory limits for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1.2e10, 1.2e10)

to limit memory usage to ~12 GB (the limits are given in bytes).
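The same call also works in an interactive session. A minimal sketch (assuming the unix package has been installed from CRAN; calling rlimit_as() with no arguments should simply report the current limits):

install.packages("unix")        # one-time install from CRAN
unix::rlimit_as()               # report the current soft/hard limits, in bytes
unix::rlimit_as(4e9, 4e9)       # cap this process at ~4 GB
try(x <- numeric(1e9))          # ~8 GB of doubles, now fails with an allocation error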

Before that, I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
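If the goal is to protect the head node from every user rather than a single session, one option is to set the limit in the site-wide startup profile so each R session starts capped. A sketch (the exact path depends on the installation, typically $R_HOME/etc/Rprofile.site):

# Sketch of a site-wide Rprofile.site entry; adjust the limit to your node
if (.Platform$OS.type == "unix" && requireNamespace("unix", quietly = TRUE)) {
  unix::rlimit_as(1.2e10, 1.2e10)   # cap each R process at ~12 GB of address space
}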
