I'm developing a macOS application using Swift and a camera extension. I'm utilizing the Vision framework's VNGeneratePersonSegmentationRequest to apply a background blur effect. However, I'm experiencing significant lag in the video feed. I've tried optimizing the request, but the issue persists. Could anyone provide insights or suggestions on how to resolve this lagging issue?
Details:
Platform: macOS
Language: Swift
Framework: Vision
The code snippet I am using is below:
```swift
import AppKit
import AVFoundation
import Vision
import CoreImage.CIFilterBuiltins

class ViewController: NSViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var frameCounter = 0
    let frameSkipRate = 2
    private let visionQueue = DispatchQueue(label: "com.example.visionQueue")
    // Note: context, needToStream, enqueued, readyToEnqueue, sinkQueue,
    // image, and enqueue(_:_:) are defined elsewhere in my camera extension.

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        frameCounter += 1
        if frameCounter % frameSkipRate != 0 {
            return
        }
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        performPersonSegmentation(on: ciImage) { [self] mask in
            guard let mask = mask else { return }
            let blurredBackground = self.applyBlur(to: ciImage)
            let resultImage = self.composeImage(with: blurredBackground, mask: mask, original: ciImage)
            let nsImage = ciImageToNSImage(ciImage: resultImage)
            DispatchQueue.main.async { [self] in
                // Update the NSImageView or other UI elements with the composite image
                if needToStream {
                    if (enqueued == false || readyToEnqueue == true), let queue = self.sinkQueue {
                        enqueued = true
                        readyToEnqueue = false
                        if let _ = image, let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                            enqueue(queue, cgImage)
                        }
                    }
                }
            }
        }
    }

    private func performPersonSegmentation(on image: CIImage, completion: @escaping (CIImage?) -> Void) {
        let request = VNGeneratePersonSegmentationRequest()
        request.qualityLevel = .fast // Adjust quality level as needed
        request.outputPixelFormat = kCVPixelFormatType_OneComponent8
        let handler = VNImageRequestHandler(ciImage: image, options: [:])
        visionQueue.async {
            do {
                try handler.perform([request])
                guard let result = request.results?.first else {
                    completion(nil)
                    return
                }
                let maskImage = CIImage(cvPixelBuffer: result.pixelBuffer)
                completion(maskImage)
            } catch {
                print("Error performing segmentation: \(error)")
                completion(nil)
            }
        }
    }

    private func composeImage(with blurredBackground: CIImage, mask: CIImage, original: CIImage) -> CIImage {
        // Invert the mask so the background region is selected
        let invertedMask = mask.applyingFilter("CIColorInvert")
        // Scale the mask to match the original image's dimensions
        let resizedMask = invertedMask.transformed(by: CGAffineTransform(
            scaleX: original.extent.width / invertedMask.extent.width,
            y: original.extent.height / invertedMask.extent.height))
        // Blend the images using the mask
        let blendFilter = CIFilter(name: "CIBlendWithMask")!
        blendFilter.setValue(blurredBackground, forKey: kCIInputImageKey)
        blendFilter.setValue(original, forKey: kCIInputBackgroundImageKey)
        blendFilter.setValue(resizedMask, forKey: kCIInputMaskImageKey)
        return blendFilter.outputImage ?? original
    }

    private func ciImageToNSImage(ciImage: CIImage) -> NSImage {
        let cgImage = context.createCGImage(ciImage, from: ciImage.extent)!
        return NSImage(cgImage: cgImage, size: ciImage.extent.size)
    }

    private func applyBlur(to image: CIImage) -> CIImage {
        let blurFilter = CIFilter.gaussianBlur()
        blurFilter.inputImage = image
        blurFilter.radius = 7.0 // Adjust the blur radius as needed
        return blurFilter.outputImage ?? image
    }
}
```
Apple Developers
Hi everyone,
I am a beginner in Swift and I am currently using DeviceActivityMonitor to develop an app for parents to control their children's app usage time. During the development process, I found that the DeviceActivityMonitor has a 6MB memory limit in the background, and my app exceeds this limit after running for a long time. I would like to know how to reduce the memory usage of DeviceActivityMonitor in the background, or where I can check the memory usage of DeviceActivityMonitor and see which variables are consuming memory.
Additionally, I want to know if a new instance of the DeviceActivityMonitor class is created every time it is called?
Thank you for your help!
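For anyone investigating the same limit: as far as I understand, the extension can be relaunched for each event, so long-lived in-memory state is not guaranteed to persist between callbacks. A minimal sketch for checking headroom against the 6 MB limit from inside the monitor itself, using `os_proc_available_memory()` (the class name here is a placeholder):

```swift
import DeviceActivity
import os

// Hedged sketch: log how much memory remains available to the
// extension process at each callback, to see how close it runs
// to the 6 MB limit.
class MyMonitor: DeviceActivityMonitor {
    override func intervalDidStart(for activity: DeviceActivityName) {
        super.intervalDidStart(for: activity)
        // os_proc_available_memory() reports memory still available
        // to the current (extension) process.
        let remaining = os_proc_available_memory()
        os_log("Available extension memory: %llu bytes", UInt64(remaining))
    }
}
```

Running the extension target under Xcode's Debug Memory Graph or Instruments' Allocations template is the usual way to see which variables are holding the memory.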
My screen keeps going out ever since I updated to iOS 18. I can use Bluetooth without it happening, but not CarPlay.
Hello, Apple rejected my app because, after the splash screen, the screen stays white for too long before the content appears. Is there a way to reduce this white-screen time?
There is definitely a glitch with the latest update of iOS 18 Beta4 with Google Maps in carplay mode.
Google Maps on the car screen is barely responsive. It does not recognize your home or work address.
There is a workaround to get to your destination: use Siri to get directions, but speak the full address and tell it to use Google Maps.
For example: "Hey Siri, get me directions to "XX LOCATION" using Google Maps"
You can use Apple Maps and Waze.
But it's hard for people to use a navigation app other than the one they are used to.
I find Google Maps really easy to use and handy, and with its latest update you can report live incidents, accidents, slowdowns, and traffic from your dashboard.
Too many irrelevant posts come up when I attempt to search for something. There should be an easy button to earmark posts as irrelevant, something that shouldn't come up in a search. I'm looking for something in Xcode 15, and frequently the top posts in the search are 10 years old and have no relevance whatsoever to the solution to my problem.
Hi All,
Did some searching and found some wildly differing answers: how long did it take for your developer account to be approved?
How do I correctly show a PDF document?
iPad and Xcode 15.4
Within my GameViewController, I have:
```swift
func presentScene(_ theScene: SKScene) {
    theScene.scaleMode = .resizeFill
    if let skView = self.view as? SKView {
        skView.ignoresSiblingOrder = true
        skView.showsFPS = true
        skView.showsNodeCount = true
        #if os(iOS)
        let theTransition = SKTransition.doorway(withDuration: 2.0)
        skView.presentScene(theScene, transition: theTransition)
        #elseif os(tvOS)
        skView.presentScene(theScene)
        #endif
    }
} // presentScene
```
I believe presentScene(theScene) goes to the sceneDidLoad() func of theScene which adds various SKSpriteNodes to the scene via a call to addChild(theNode).
So far so good ...
Until I have a SKScene wherein I wish to display a PDF.
I use this snippet, called from within the SKScene's sceneDidLoad(), to display the PDF:
```swift
displayPDF("ABOUT_OLD_WEST_LOCOMOTIVE")

func displayPDF(_ itsName: String) {
    let pdfView = PDFView()
    guard let path = Bundle.main.path(forResource: itsName, ofType: "pdf")
    else { return }
    print("path")
    guard let pdfDocument = PDFDocument(url: URL(fileURLWithPath: path))
    else { return }
    print("pdfDocument")
    pdfView.displayMode = .singlePageContinuous
    pdfView.autoScales = true
    pdfView.displayDirection = .vertical
    pdfView.document = pdfDocument
} // displayPDF
```
The two print statements both fire, yet the SKScene does not display the PDF even after pdfView.document = pdfDocument is set.
Anyone have a clue what errors I have committed?
Appreciate it.
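One thing displayPDF(_:) never does is attach the PDFView to a view hierarchy, so there is nothing on screen to render the document; an SKScene cannot host a UIView directly. A minimal sketch of the idea, assuming access to the SKView that presents the scene (the function name and parameters are illustrative):

```swift
import PDFKit
import SpriteKit
import UIKit

// Hedged sketch: create the PDFView and add it as a subview of the
// SKView so it actually appears on screen.
func showPDF(named name: String, over skView: SKView) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "pdf"),
          let document = PDFDocument(url: url) else { return }
    let pdfView = PDFView(frame: skView.bounds)
    pdfView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    pdfView.displayMode = .singlePageContinuous
    pdfView.displayDirection = .vertical
    pdfView.autoScales = true
    pdfView.document = document
    skView.addSubview(pdfView) // without this, the PDF is never shown
}
```

Removing the PDFView again (pdfView.removeFromSuperview()) when leaving the scene would keep it from covering later scenes.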
Hi,
I am a newbie to Apple development. I do have good experience in PC development. What resources would I use to begin learning? I am planning to develop an app for personal interests.
I know there are online courses on Coursera and such. But am hoping to do it at low cost and am not interested in certification for job search, etc.
Thanks much.
I know there is plenty of time left, but if you could make iOS 18 beta go to a black screen with the gray spinner less often, I'd be appreciative. It happened three times while writing this. Of course I entered a bug for it. Please and thank you. I am running developer build 4.
It appears that even on iOS 18 beta 4, the deployment of a share extension target to a physical device is broken. Has anyone seen the message, "The specified capability is not supported by this device"? Just a basic iOS app, no logic defined at this point. It builds fine. The main target gets deployed fine, only when attempting to deploy the share extension do I see this message. Works fine on a simulator, just not on physical device. iOS 18 beta 4, Xcode 16 beta 4 and MacOS 15 beta.
Hello guys, I cannot accept the Account Holder transfer agreement; the steps are shown below.
Steps:
I went to Apple Developer website, clicked "Account" button and saw the message
"""
*** has requested that we transfer the Account Holder role to you.
You will become the new Account Holder for your organization and will assume the responsibilities of accepting all legal agreements, managing App Store submissions, and renewing your membership. Before you can accept, you’ll need to complete identity verification in the Apple Developer app.
"""
I then went to the Apple Developer app, clicked the Account tab, and saw no agreement popping up. I only saw an un-clickable "Enroll now" under "Apple Developer Program", but I already filled in the personal information for identity verification several weeks ago. I resubmitted the personal information by contacting them; however, there have been no updates from their side.
Do you guys have any solution? I need to deploy the app as soon as possible for an upcoming event, so this is urgent for the company. Thanks so much for your help.
I would like to know the diagonal dimensions of the rear glass panel where the cameras are placed on the iPhone 15 Pro Max.
Just bought a refurbished iPad for our son. Created an Apple ID for this device. When we enter the Apple ID and password we get a Verification Failed message that reads “verification codes can’t be sent to this phone number at this time.” This is an WiFi only device and we have not connected it to a phone number. Any help?
Hello everyone,
I am struggling to find a solution for the following problem, and I would be glad and thankful if anyone can help me.
My Use Case:
I am using RoomPlan to scan a room. While scanning, there is a function to take pictures. The position from where the pictures are taken will be saved (in my app, they are called "points of interest" = POI).
This works fine for a single room, but when adding a new room and combining the two of them using:
structureBuilder.capturedStructure(from: capturedRooms)
the first room will be transformed and thus moved around to fit in the world space.
The points are not transformed with the rest of the room since they are not in the rooms structure specifically, which is fine, but how can I transform the POIs too, so that they are in the correct positions where they were taken?
I used:
func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: (Error)?)
to get the transform matrix from "arFrameReferenceOriginTransform" and apply this to the POIs, but it still seems that this is not enough.
I would be happy for any tips and help!
Thanks in advance!
My Update function:
```swift
func updatePOIPositions(with originTransform: simd_float4x4) {
    for i in 0..<poisOldRooms.count {
        let poi = poisOldRooms[i]
        let originalPosition = SIMD4<Float>(
            poi.data.cameraOriginX,
            poi.data.cameraOriginY,
            poi.data.cameraOriginZ,
            1.0
        )
        let updatedPosition = originTransform * originalPosition
        poisOldRooms[i].data.cameraX = updatedPosition.x
        poisOldRooms[i].data.cameraY = updatedPosition.y
        poisOldRooms[i].data.cameraZ = updatedPosition.z
    }
}
```
I am using the Instagram Basic Display API in my app. Do I have to submit the app to Facebook first before submitting it to Apple?
Or can I submit the app to Apple first and provide a test account for the Apple reviewer?
Do you think Apple could make the rear cameras even bigger or will they remain the same size as the 15 pro max?
In situations where the app receives a VoIP push, and the user starts to answer by sliding, the call is initiated and the timer starts. However, due to network issues, the app's call may not be fully ready, resulting in a delay of 5-10 seconds before the actual call begins. Is there a way to display a "loading" or "connecting" indicator on the CallKit interface during this wait time?
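As far as I know, CallKit has no dedicated "connecting" indicator for incoming calls, but the system UI treats the call as still connecting until the answer action is fulfilled. A minimal sketch of that pattern, holding the CXAnswerCallAction until the app's network stack is ready (the class and method names are illustrative):

```swift
import CallKit

// Hedged sketch: delay fulfilling the answer action until the call
// is actually ready, so the system UI does not start the timer early.
final class CallManager: NSObject, CXProviderDelegate {
    private var pendingAnswer: CXAnswerCallAction?

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Hold the action instead of fulfilling immediately.
        pendingAnswer = action
        startMediaSession() // app-specific connection setup (assumption)
    }

    // Call this once the media/network session is actually connected.
    func mediaSessionDidConnect() {
        pendingAnswer?.fulfill()
        pendingAnswer = nil
    }

    private func startMediaSession() {
        // connect signaling/media here
    }

    func providerDidReset(_ provider: CXProvider) {
        pendingAnswer = nil
    }
}
```

Note that an unfulfilled action will eventually time out (reported via provider(_:timedOutPerforming:)), so the connection attempt still needs a reasonable deadline, with action.fail() on failure.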
Hello,
We are currently using a hub built around the CYPD3125 PD chip; it is used to connect with both Android and iOS devices. While our device works seamlessly with Android devices, we are encountering an issue when connecting to iOS devices, specifically the iPad Pro.
Issue Description:
The Powerpack/Hub is intended to handle Power Delivery (PD) communications.
When connected to an Android device, the PD packets are exchanged correctly, and the device functions as expected.
However, when connected to an iPad Pro, we observe abnormal PD packet exchanges which lead to malfunctioning of the Powerpack/Hub.
Observations:
Attached is a snapshot of the PD packets we captured while troubleshooting the issue in a scenario where the AC power adapter was initially connected. After a few seconds, we removed the plug, waited for a few seconds, and then plugged in the AC power again. This was the scenario when we captured the PD packets, as seen in the snapshot. The packets appear to be different when compared to those captured with an Android device.
Below is the screenshot of the PD packet capture with Apple device:
Below is the screenshot of the PD packet capture with Android device:
Technical Observations:
Initial Connection: The connection initiates but does not follow the expected PD communication sequence.
Packet Structure: In the capture, the iPad Pro shows a series of PD Msg types including Src Cap, Req, and Accept, but there are also unexpected messages such as Hard Reset and Soft Reset that disrupt the communication.
Timing Issues: The timestamps show irregular intervals between packets when connected to the iPad Pro, suggesting possible timing synchronization issues.
Unexpected Resets: The capture shows a Hard Reset event at packet 9, which is not observed in the Android device captures. This suggests the iPad Pro might be detecting an error and attempting to reset the connection.
Steps Taken:
Verified the firmware and hardware implementation of the Powerpack/Hub.
Ensured compliance with USB PD standards.
Tested with multiple iPad Pro units to rule out device-specific issues.
Additional Details: We have also tested with an iPad Air and observed the same issue. The tests were conducted on both iOS 16 and 17. We are also attaching a USB PD capture with an Android device, taken under the same plug/unplug scenario described above, where everything works as expected.
Despite these steps, the issue persists. We seek guidance on any issues or peculiarities with iOS devices and USB PD communication.
Thanks
Dear Dev Team,
I am writing to report a severe battery drain issue after updating my Apple Watch to watchOS 11 (22R5318h). Since the update, my watch’s battery life has drastically reduced, going from 100% to 0% in just 4 hours.
I have taken the following steps to resolve the issue, but none have been successful:
1. Performed a fresh reset of the watch.
2. Reset my iPhone.
3. Disabled background app activity.
4. Enabled airplane mode and power saving mode.
Despite these efforts, the problem persists. My watch’s battery health is at 100%, so I believe the issue is related to the software update.
I would appreciate your assistance in resolving this matter as it is significantly affecting the usability of my Apple Watch.
Thank you for your attention to this issue.
——————-
Update 1: I have rolled back from iOS 18 beta 4 to iOS 18 beta 3 after reading that the update might be causing the issue with the Apple Watch. Following this, I reset my Apple Watch, including the cellular plan, and did not restore from a backup. However, the problem still persists.
Additionally, I have noticed that the charging time from 30% to 100% has increased to 3 hours and 23 minutes using a 20W Apple charger.