Yes, optimized code is less debuggable. Not only is some information missing, but some of what remains is actively misleading.
The biggest issue, in my opinion, is local variables. The compiler may reuse the same stack slot or register for several variables over the course of a function, so once a variable's lifetime ends the debugger can show a value that actually belongs to something else. As other posters mentioned, sometimes even figuring out what the "this" pointer is can take a bit of time. When debugging optimized code you may also see the current line jump around as you single-step, because the compiler has reordered the generated code. If you use PGO, this jumping around will probably only get worse.
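As a minimal sketch of the register-reuse problem (the function and variable names here are made up for illustration): build something like this with /O2 and step through it, and the watch window will often show a stale or nonsensical value for whichever variable is dead at the current instruction, because the compiler is free to keep both locals in the same register.

    #include <cstdio>

    int process(const int* data, int count)
    {
        int sum = 0;                  // may live only in a register
        for (int i = 0; i < count; ++i)
            sum += data[i];

        int scaled = sum * 2;         // the optimizer may reuse sum's register here,
        return scaled + 1;            // so "sum" in the debugger is no longer meaningful
    }

    int main()
    {
        int values[] = { 1, 2, 3, 4 };
        std::printf("%d\n", process(values, 4));
        return 0;
    }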
FPO (frame pointer omission) shouldn't hurt debuggability much provided you have a PDB, since the PDB contains all the information needed to unwind the stack across FPO frames. FPO is mainly a problem for tools that need to take stack traces without symbols. For many projects the perf benefit of FPO nowadays doesn't outweigh the hit to diagnosability; that's why MS decided not to build Windows Vista with the FPO optimization (http://blogs.msdn.com/larryosterman/archive/2007/03/12/fpo.aspx).
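If you control the build, FPO is governed by the /Oy family of flags on the x86 MSVC compiler (it doesn't apply to the x64 compiler). Something along these lines keeps frame pointers while still optimizing; the exact invocation will depend on your build system:

    cl /Zi /O2 /Oy- myapp.cpp /link /DEBUG

/Zi emits debug info, /O2 turns on optimization (which implies /Oy on x86), /Oy- then suppresses frame pointer omission, and /DEBUG makes the linker produce the PDB.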
I prefer to debug unoptimized code, but that isn't always possible: some problems only repro with optimized code, customer crash dumps come from the released build, and getting a debug private deployed sometimes isn't an option. When I do debug optimized code, I often work from the disassembly view - disassembly never lies.
This all applies to windbg since I do all native code debugging with it. Visual Studio's debugger might handle some of these cases better.
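When working from the disassembly view in windbg, a handful of commands do most of the work (myapp!process below is just a placeholder for your own module!function):

    kn               - list stack frames with frame numbers
    .frame /r 2      - switch the local context to frame 2 and dump its registers
    dv /V            - show locals and where the debugger thinks they live
                       (take the values with a grain of salt in optimized code)
    uf myapp!process - disassemble the whole function
    r                - dump the current registers, which is usually where the real values are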