Hello,
I want to create a painting app for iOS, and I've seen that many examples use a CAShapeLayer to draw a UIBezierPath.
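Roughly this kind of pattern (a minimal sketch I put together to show what I mean; the class and property names are just my own):

```swift
import UIKit

// Minimal sketch of the CAShapeLayer + UIBezierPath pattern I'm referring to.
class CanvasView: UIView {
    private let strokeLayer = CAShapeLayer()
    private let path = UIBezierPath()

    override init(frame: CGRect) {
        super.init(frame: frame)
        strokeLayer.strokeColor = UIColor.black.cgColor
        strokeLayer.fillColor = nil
        strokeLayer.lineWidth = 4
        strokeLayer.lineCap = .round
        layer.addSublayer(strokeLayer)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.move(to: point)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.addLine(to: point)
        strokeLayer.path = path.cgPath   // the layer re-renders the whole path on each update
    }
}
```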
As I understand it, Core Animation uses the GPU, so I was wondering: how is this implemented on the GPU? In other words, how would you do it with Metal or OpenGL?
I can only think of continuously updating a texture in response to the user's drawing, but that seems like a very resource-intensive operation...
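To be concrete, the only approach I can picture is something like the sketch below (my own rough code: redraw the whole accumulated path into a bitmap on every touch and push it to the layer):

```swift
import UIKit

// My rough idea: re-render the full path into an image on every touch move
// and hand it to the layer as its contents. Redrawing and re-uploading the
// whole bitmap each time is what worries me performance-wise.
class BitmapCanvasView: UIView {
    private let path = UIBezierPath()

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.move(to: point)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.addLine(to: point)

        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        let image = renderer.image { _ in
            UIColor.black.setStroke()
            path.lineWidth = 4
            path.stroke()
        }
        layer.contents = image.cgImage   // effectively re-uploading a full-size "texture"
    }
}
```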
Thanks
We can't validate assumptions about how Core Animation is implemented internally, particularly assumptions based on third-party sources, but we can offer sample code showing how you might implement your own solution on the GPU.
For examples of rendering parametric curves with Metal, you'll want to check out the following (a rough sketch of the general idea appears below the list):
- Build a spatial drawing app with RealityKit
- Creating a spatial drawing app with RealityKit
- Interacting with virtual content blended with passthrough
These samples run in the Simulator. You would need to adapt the input handling for iOS.
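To make the general idea concrete before you dig into those samples, here is a minimal sketch (our own illustration, not code taken from the samples above): flatten the cubic Bézier on the CPU into a triangle-strip ribbon and hand the vertex array to Metal. The function names and parameters (cubicBezierPoint, strokeVertices, segments) are ours, purely for illustration.

```swift
import simd

/// Point on a cubic Bézier at parameter t in [0, 1].
func cubicBezierPoint(_ p0: SIMD2<Float>, _ p1: SIMD2<Float>,
                      _ p2: SIMD2<Float>, _ p3: SIMD2<Float>, _ t: Float) -> SIMD2<Float> {
    let u = 1 - t
    return u * u * u * p0
         + 3 * u * u * t * p1
         + 3 * u * t * t * p2
         + t * t * t * p3
}

/// First derivative of the same curve, used as the tangent direction.
func cubicBezierTangent(_ p0: SIMD2<Float>, _ p1: SIMD2<Float>,
                        _ p2: SIMD2<Float>, _ p3: SIMD2<Float>, _ t: Float) -> SIMD2<Float> {
    let u = 1 - t
    return 3 * u * u * (p1 - p0)
         + 6 * u * t * (p2 - p1)
         + 3 * t * t * (p3 - p2)
}

/// Triangle-strip vertices for a stroked curve of the given width
/// (assumes non-degenerate control points).
func strokeVertices(_ p0: SIMD2<Float>, _ p1: SIMD2<Float>,
                    _ p2: SIMD2<Float>, _ p3: SIMD2<Float>,
                    width: Float, segments: Int = 64) -> [SIMD2<Float>] {
    var vertices: [SIMD2<Float>] = []
    vertices.reserveCapacity((segments + 1) * 2)
    let half = width / 2
    for i in 0...segments {
        let t = Float(i) / Float(segments)
        let point = cubicBezierPoint(p0, p1, p2, p3, t)
        let tangent = simd_normalize(cubicBezierTangent(p0, p1, p2, p3, t))
        let normal = SIMD2<Float>(-tangent.y, tangent.x)
        vertices.append(point + normal * half)   // left edge of the ribbon
        vertices.append(point - normal * half)   // right edge of the ribbon
    }
    return vertices
}

// In a Metal draw loop (device: MTLDevice, encoder: MTLRenderCommandEncoder),
// you would then do something like:
//   let verts = strokeVertices(a, b, c, d, width: 8)
//   let buffer = device.makeBuffer(bytes: verts,
//                                  length: MemoryLayout<SIMD2<Float>>.stride * verts.count)
//   encoder.setVertexBuffer(buffer, offset: 0, index: 0)
//   encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: verts.count)
// Only the newest stroke needs to be re-tessellated per frame; completed strokes
// can keep their vertex buffers.
```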
A simpler "stamping" approach using Core Image and OpenGL (iOS and macOS) can be found here: