I'm trying to find a way, on macOS with Swift, to take an NSImage that comes from an NSImageView and convert it to base64 so I can send it as a string to another app.
So far I found this code, apparently for Swift 4:
import AppKit

extension NSImage {
    var base64String: String? {
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(size.width),
            pixelsHigh: Int(size.height),
            bitsPerSample: 8,
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: .calibratedRGB,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ) else {
            print("Couldn't create bitmap representation")
            return nil
        }

        // Render the image into the bitmap rep
        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: rep)
        draw(at: NSZeroPoint, from: NSZeroRect, operation: .sourceOver, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        // Encode the rendered bitmap as PNG data
        guard let data = rep.representation(using: .png, properties: [.compressionFactor: 1.0]) else {
            print("Couldn't create PNG")
            return nil
        }

        // With prefix
        // return "data:image/png;base64,\(data.base64EncodedString(options: []))"
        // Without prefix
        return data.base64EncodedString(options: [])
    }
}
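For context, this is roughly how I use it; the class and outlet names below are placeholders for my real ones:

import AppKit

final class SenderViewController: NSViewController {   // hypothetical class
    @IBOutlet weak var photoImageView: NSImageView!    // hypothetical outlet

    func sendCurrentImage() {
        guard let encoded = photoImageView.image?.base64String else { return }
        // `encoded` is the plain string I hand to the other app;
        // printing a prefix here just to sanity-check it
        print(encoded.prefix(60))
    }
}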
But the problem is that while I do get the image, it is super small, and enlarging it in the other app produces a super pixelated image.
In my source app the image, based on the representation debug info, is a 150 x 200 picture, but via the code above it comes out as a 30 x 40 image, I guess. Any idea how to get this working?
Based on the debug data I have the following:
Optional("<NSImage 0x600003301180 Size={33.600000000000001, 48} Reps=(
"NSBitmapImageRep 0x600002606940 Size={33.600000000000001, 48} ColorSpace=Generic Gray Gamma 2.2 Profile colorspace BPS=8 BPP=8 Pixels=140x200 Alpha=NO Planar=NO Format=0 CurrentBacking=<CGImageRef: 0x10061d900> CGImageSource=0x60000025d6a0"
)>")
So what should I set in my case to get it right?
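In case it helps to see what I'm considering: my current guess is that size is in points while the rep reports real pixels, so the new rep should be sized from the existing rep's pixel dimensions instead. Here's an untested sketch of that idea (the property name is my own):

import AppKit

extension NSImage {
    // Untested guess: build the rep from the image's real pixel size, not
    // NSImage.size (which is in points, so a scaled image gets downsampled).
    // `base64StringFullRes` is a name I made up for this sketch.
    var base64StringFullRes: String? {
        // Take the pixel dimensions from the largest existing representation,
        // falling back to the point size if no rep reports them
        let pixelsWide = representations.map { $0.pixelsWide }.max() ?? Int(size.width)
        let pixelsHigh = representations.map { $0.pixelsHigh }.max() ?? Int(size.height)

        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: pixelsWide,
            pixelsHigh: pixelsHigh,
            bitsPerSample: 8,
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: .calibratedRGB,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ) else { return nil }

        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: rep)
        // Draw into the full pixel rect rather than at the point size
        draw(in: NSRect(x: 0, y: 0, width: pixelsWide, height: pixelsHigh),
             from: .zero, operation: .sourceOver, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        guard let data = rep.representation(using: .png, properties: [:]) else { return nil }
        return data.base64EncodedString()
    }
}

Would that be the right direction, or is there a simpler setting I'm missing?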
Thanks in advance
question from:
https://stackoverflow.com/questions/65882292/xcode-12-3-swift-5-3-nsimage-to-base64