A shader array variable is just a blob of data passed from the GL client to the GPU, with a size and layout agreed upon in the shader code. So, to do this in SceneKit, you first need to set up your blob, then wrap it in an `NSData` to pass to the shader. For example:
// warning: coded in Safari and untested
import simd

let points: [float2] = [float2(100.5, 50.5), float2(110.5, 60.5) /* ... */]
// make sure you have 100...
let data = NSData(bytes: points, length: points.count * MemoryLayout<float2>.stride)
material.setValue(data, forKey: "points")
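As a sanity check, the byte count of the wrapped blob should be exactly the element count times the stride of the element type. A minimal sketch, using a hypothetical 100-element placeholder array and the standard-library `SIMD2<Float>` (which `float2` aliases on Apple platforms):

```swift
import Foundation

// Hypothetical: 100 placeholder points, matching a shader-side array of 100 elements.
let points = [SIMD2<Float>](repeating: SIMD2<Float>(0, 0), count: 100)

// Each SIMD2<Float> is two 32-bit floats, so its stride is 8 bytes.
let data = NSData(bytes: points, length: points.count * MemoryLayout<SIMD2<Float>>.stride)

print(data.length) // 800 = 100 elements * 8 bytes
```

If this number doesn't match what the shader's array declaration implies, the GPU will read garbage past (or short of) your data.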
Also, when you're working with single points, SceneKit will handle the layout conversion from `CGPoint` to GPU `vec2`. But when you pass a blob of data, all SceneKit knows is that it's a blob of data: it can't look inside and adjust the layout for what the GPU expects, because it doesn't know what layout you provided.
So, you should use types that explicitly match the size and layout you want in the GPU shader. That means no `CGPoint`: its components are `CGFloat`, whose size changes between CPU architectures (4 bytes on 32-bit, 8 bytes on 64-bit). The `float2` type from the SIMD library is a close match for GL `vec2` and an exact match for Metal `float2`.
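To see the mismatch concretely, here is a small sketch comparing memory layouts (the 16-byte `CGPoint` figure assumes a 64-bit platform):

```swift
import Foundation

// CGPoint stores two CGFloats; on 64-bit platforms CGFloat is 8 bytes,
// so a CGPoint occupies 16 bytes where the GPU expects an 8-byte vec2.
print(MemoryLayout<CGPoint>.stride)        // 16 on 64-bit platforms
print(MemoryLayout<SIMD2<Float>>.stride)   // 8: two 32-bit floats
```

Pack an array of `CGPoint`s into the blob and every other `vec2` the shader reads will be misaligned garbage, which is why matching the element type exactly matters.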