I'm currently testing a hardware-based compression algorithm, but I'm having a hard time getting an accurate snapshot of the physically allocated RAM that will be compressed.
Currently I load data (from a CSV) into RAM using Apache Arrow and then take a core dump with gcore to capture the address space. The problem is that the resulting file is massive (around 30 GB) and mostly filled with junk data. Essentially, I'm looking for a way to dump only the physical memory a process is actually using, rather than its entire virtual address space.
Are there any distros out there that allow physical memory dumping for a given process?
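For context on the gap between the two: on Linux, `/proc/<pid>/smaps` reports per-mapping `Rss` (resident set) values, which is roughly the "physically backed" portion of each mapping, while a gcore dump covers the whole virtual range. A minimal sketch (assuming Linux and the standard smaps format) that sums the resident kilobytes of a process:

```python
# Sketch: sum the resident (physically backed) pages of a process by
# parsing /proc/<pid>/smaps on Linux. Mappings with "Rss: 0 kB" are the
# untouched virtual space that inflates a gcore dump.
def resident_kb(pid="self"):
    total = 0
    with open(f"/proc/{pid}/smaps") as f:
        for line in f:
            if line.startswith("Rss:"):
                total += int(line.split()[1])  # field is in kB
    return total

if __name__ == "__main__":
    print(f"resident: {resident_kb()} kB")
```

Comparing this number to the gcore file size shows how much of the dump is unbacked address space.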
question from:
https://stackoverflow.com/questions/65868833/physically-allocated-memory-from-core-dump