When I want to debug C or C++ programs, I've been taught to use `-O0` to turn optimization OFF, and `-ggdb` to insert symbols into the executable that are tailored to the GNU gdb debugger, which is what I use (or you can use `-glldb` for LLVM/clang's lldb debugger, or just `-g` for general debugging symbols, though apparently that won't be as good as `-ggdb`). However, I recently stumbled upon someone saying to use `-Og` (instead of `-O0`), and it caught me off guard. Sure enough, it's in `man gcc`:
> -Og
>     Optimize debugging experience. -Og enables optimizations that do not interfere with debugging. It should be the optimization level of choice for the standard edit-compile-debug cycle, offering a reasonable level of optimization while maintaining fast compilation and a good debugging experience.
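For context, here is roughly how I've been invoking the two builds on a throwaway test file (the file and output names below are placeholders I made up for this question; the build lines are in the leading comment):

```c
/* debug_demo.c -- a made-up file name for this question.
 * Build it both ways and compare the gdb experience:
 *   gcc -O0 -ggdb -o demo_o0 debug_demo.c
 *   gcc -Og -ggdb -o demo_og debug_demo.c
 */
#include <stdio.h>

int main(void)
{
    int sum = 0;
    for (int i = 0; i < 10; i++) {
        sum += i;   /* breakpoint here; try 'print i' and 'print sum' in gdb */
    }
    printf("sum = %d\n", sum);
    return 0;
}
```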
So, what's the difference? Here's the `-O0` description from `man gcc`:
> -O0
>     Reduce compilation time and make debugging produce the expected results. This is the default.
`man gcc` clearly says `-Og` "should be the optimization level of choice for the standard edit-compile-debug cycle", though.
This makes it sound like `-O0` is truly "no optimizations", whereas `-Og` is "some optimizations enabled, but only those that don't interfere with debugging." Is that correct? Which should I use, and why?
Related:
- Related, but NOT a duplicate (read it closely; it's not at all a duplicate): What is the difference between -O0, -O1 and -g
- My answer on debugging `--copt=` settings to use with Bazel: gdb: No symbol "i" in current context