ScreenCaptureKit


ScreenCaptureKit brings high-performance screen capture, including audio and video, to macOS.

Posts under ScreenCaptureKit tag

37 Posts

Is it possible to get only audio from ScreenCaptureKit?
I'm creating an app that listens to another app's sound; in this use case, screen data is not needed. But if I don't call SCStream#addStreamOutput(_, type: .screen, ...), the console shows this error:

    [ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame

Currently I'm setting SCStreamConfiguration#minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance. Is there any way to disable screen capture and capture only an app's audio?
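
A minimal sketch of the workaround described above, assuming macOS 13 or later: audio capture is enabled, the unavoidable video path is throttled with minimumFrameInterval and a tiny output size, and the rare video frames are simply ignored. The class name AudioOnlyCapturer and the queue label are placeholders, not part of the original post.

    import ScreenCaptureKit
    import CoreMedia

    final class AudioOnlyCapturer: NSObject, SCStreamOutput {
        private let queue = DispatchQueue(label: "audio.only.capture")
        private var stream: SCStream?

        func start() async throws {
            let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
            guard let display = content.displays.first else { return }
            let filter = SCContentFilter(display: display, excludingWindows: [])

            let config = SCStreamConfiguration()
            config.capturesAudio = true                                   // deliver app audio
            config.excludesCurrentProcessAudio = true                     // don't record our own output
            config.minimumFrameInterval = CMTime(value: 10, timescale: 1) // ~0.1 fps, the workaround from the post
            config.width = 2                                              // keep the unused video path cheap
            config.height = 2

            let stream = SCStream(filter: filter, configuration: config, delegate: nil)
            try stream.addStreamOutput(self, type: .screen, sampleHandlerQueue: queue) // silences the "output NOT found" log
            try stream.addStreamOutput(self, type: .audio, sampleHandlerQueue: queue)
            try await stream.startCapture()
            self.stream = stream
        }

        func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
            guard type == .audio else { return } // drop the occasional video frames on the floor
            // Process the audio sample buffer here.
        }
    }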

Replies: 4 · Boosts: 1 · Views: 1.8k · Activity: Sep ’23

ScreenCaptureKit - Sample project doesn't work on macOS Sonoma
I'm getting this error when I try to run the ScreenCaptureKit sample project on macOS Sonoma:

    [ERROR] _SCStream_RemoteAudioQueueOperationHandlerWithError:1053 streamOutput NOT found. Dropping frame
    [ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:1020 stream output NOT found. Dropping frame

Both stream outputs are being set like this:

    try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
    try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)

Link to the sample project: https://developer.apple.com/documentation/screencapturekit/capturing_screen_content_in_macos

Any idea what is causing this?

Replies: 3 · Boosts: 5 · Views: 1.3k · Activity: Aug ’23

SCContentFilter does not support multiple windows as the target source
SCContentFilter in ScreenCaptureKit has five initializers for selecting the capture target. Only one of them, initWithDesktopIndependentWindow:, accepts a single SCWindow as its capture target. The other four target one SCDisplay plus SCRunningApplication or SCWindow combinations that filter items by their criteria. Compare the old API for capturing window or display snapshots:

    CGImageRef CGWindowListCreateImage(CGRect screenBounds, CGWindowListOption listOption, CGWindowID windowID, CGWindowImageOption imageOption);
    CGImageRef CGWindowListCreateImageFromArray(CGRect screenBounds, CFArrayRef windowArray, CGWindowImageOption imageOption);

The second function accepts an array of CGWindowIDs, so the result is a composite image formed from those windows. Will Apple provide a substitute for this in SCContentFilter in the future?
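
For reference, the closest current substitute appears to be compositing via init(display:including:), which renders only the listed windows. This is a sketch under the assumption that all windows of interest are on the same SCDisplay; compositeImage(of:on:) is a made-up helper name.

    import ScreenCaptureKit

    @available(macOS 14.0, *)
    func compositeImage(of windows: [SCWindow], on display: SCDisplay) async throws -> CGImage {
        // Only the listed windows are rendered; other content on the display is omitted.
        let filter = SCContentFilter(display: display, including: windows)

        let config = SCStreamConfiguration()
        config.width = display.width
        config.height = display.height
        config.showsCursor = false

        return try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: config)
    }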

Replies: 0 · Boosts: 0 · Views: 439 · Activity: Jul ’23

Screen recording - ScreenCaptureKit or AVFoundation?
Hi! I am building a Mac Swift app to make screen recordings. I want to be able to record the whole screen, a single window, or an area of the screen. I am very new to Swift and am not sure whether I should be using AVCaptureSession or SCScreenCaptureSession. I am trying to understand the differences and when to use one over the other. Can anyone point me in the right direction? Thanks!
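
As a rough illustration only (not from the post): ScreenCaptureKit expresses the three targets mentioned above through SCContentFilter and SCStreamConfiguration. This sketch assumes an SCShareableContent value has already been fetched.

    import ScreenCaptureKit

    func makeFilter(content: SCShareableContent) -> (SCContentFilter, SCStreamConfiguration)? {
        guard let display = content.displays.first else { return nil }
        let config = SCStreamConfiguration()

        // 1. The whole display.
        let filter = SCContentFilter(display: display, excludingWindows: [])

        // 2. A single window would instead use:
        //    SCContentFilter(desktopIndependentWindow: someWindow)

        // 3. An area of the screen: keep the display filter and crop with sourceRect (in points).
        config.sourceRect = CGRect(x: 100, y: 100, width: 800, height: 600)
        config.width = 800   // output size in pixels
        config.height = 600

        return (filter, config)
    }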

Replies: 2 · Boosts: 0 · Views: 1.2k · Activity: Aug ’23

ScreenCaptureKit - Sample project needs screen recording permission every time
The screen recording permission window appears every time I run the ScreenCaptureKit sample project on macOS Sonoma Beta 5, even though I've already granted permission under Screen Recording. The prompt keeps popping up, and until I remove the old "CaptureSample" entry in System Settings -> Screen Recording and add it again, the app won't record. This happens many times a day. Does anyone know how to fix it? Thanks!

Replies: 2 · Boosts: 0 · Views: 733 · Activity: Sep ’23

Is there a delegate for the stop sharing button in Sonoma?
Hey team -- looking to receive a delegate callback in Sonoma for ScreenCaptureKit when the toolbar's "Stop Sharing" control is used. Does this come through in any of these?

    /**
     @abstract stream:didStopStreamWithError:
     @param stream the SCStream object
     @param error the error denoted by the stopping of the stream
     @discussion notifies the delegate that the stream has stopped and the error associated with it
    */
    optional func stream(_ stream: SCStream, didStopWithError error: Error)

    /**
     @abstract outputVideoEffectDidStartForStream:
     @param stream the SCStream object
     @discussion notifies the delegate that the stream's overlay video effect has started.
    */
    @available(macOS 14.0, *)
    optional func outputVideoEffectDidStart(for stream: SCStream)

    /**
     @abstract stream:outputVideoEffectDidStart:
     @param stream the SCStream object
     @discussion notifies the delegate that the stream's overlay video effect has stopped.
    */
    @available(macOS 14.0, *)
    optional func outputVideoEffectDidStop(for stream: SCStream)
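
A small sketch (not from the post) of observing the stop from the delegate; the assumption here is that the toolbar's Stop Sharing action surfaces through stream(_:didStopWithError:).

    import ScreenCaptureKit

    final class CaptureDelegate: NSObject, SCStreamDelegate {
        // Fires whenever the stream stops, including when capture is ended from the system UI.
        func stream(_ stream: SCStream, didStopWithError error: Error) {
            let nsError = error as NSError
            print("SCStream stopped: domain=\(nsError.domain) code=\(nsError.code)")
            // Tear down writers / update UI here.
        }
    }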

Replies: 1 · Boosts: 0 · Views: 447 · Activity: Sep ’23

Issue with ScreenCaptureKit Performance (After Sonoma updates)
I'm encountering performance degradation with my application that utilizes ScreenCaptureKit. Even after explicitly disabling App Nap using the NSAppSleepDisabled key, the problem persists. My application, which relies heavily on ScreenCaptureKit for its core functionality, experiences significant performance drops after running for a short period. When I click on the application, the performance momentarily returns to normal but quickly deteriorates again. I've checked for memory leaks in my application and haven't found any issues in that regard. Has anyone experienced similar performance issues with ScreenCaptureKit? I'm keen to know if there are any known bugs or workarounds to mitigate this problem.
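
Not an answer to the underlying degradation, but for completeness, App Nap can also be suppressed at runtime with ProcessInfo activities rather than the NSAppSleepDisabled key. A sketch; the class name is a placeholder, and the activity token must be retained for the suppression to last.

    import Foundation

    final class AppNapGuard {
        private var token: NSObjectProtocol?

        func begin() {
            // Keeps the process out of App Nap and timer coalescing while capture runs.
            token = ProcessInfo.processInfo.beginActivity(
                options: [.userInitiatedAllowingIdleSystemSleep, .latencyCritical],
                reason: "Screen capture in progress")
        }

        func end() {
            if let token {
                ProcessInfo.processInfo.endActivity(token)
                self.token = nil
            }
        }
    }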

Replies: 0 · Boosts: 0 · Views: 536 · Activity: Oct ’23

Presenter overlay causes AVAssetWriter failure
I work on a screen recorder app and am having issues with the new Presenter Overlay mode on macOS 14. Switching to the "Small" overlay is fine, but switching to the "Large" overlay mode causes our AVAssetWriter to fail every time with the following error:

    Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x6000028729a0 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}

which doesn't provide any helpful information. I'm not sure if we're doing something wrong, but I've reduced our code as much as possible and still get the failure. Does anyone have any clues, or is anyone experiencing the same thing? Alternatively, is there a way to disable Presenter Overlay until this is fixed? Our app displays a camera and uses ScreenCaptureKit to record the screen along with the camera, which automatically enables the Presenter Overlay options. I can't find any way to opt out of or turn off the Presenter Overlay options, which is a bummer. That seems like it should be controllable from either the AVCaptureSession or the SCStreamConfiguration.

Replies: 2 · Boosts: 0 · Views: 480 · Activity: Oct ’23

Why is the image captured by `SCScreenshotManager.captureImage` so blurry?
I am using ScreenCaptureKit to build a screenshot tool, but I've found that the image captured by the new API, SCScreenshotManager.captureImage, is very blurry. My screenshot comes out blurry when I would like it to be sharp. My code is as follows:

    func captureScreen(windows: [SCWindow], display: SCDisplay) async throws -> CGImage? {
        let availableWindows = windows.filter { window in
            Bundle.main.bundleIdentifier != window.owningApplication?.bundleIdentifier
        }
        let filter = SCContentFilter(display: display, including: availableWindows)
        if #available(macOS 14.0, *) {
            let image = try? await SCScreenshotManager.captureImage(
                contentFilter: filter,
                configuration: SCStreamConfiguration.defaultConfig(
                    width: display.width,
                    height: display.height
                )
            )
            return image
        } else {
            return nil
        }
    }

    extension SCStreamConfiguration {
        static func defaultConfig(width: Int, height: Int) -> SCStreamConfiguration {
            let config = SCStreamConfiguration()
            config.width = width
            config.height = height
            config.showsCursor = false
            if #available(macOS 14.0, *) {
                config.captureResolution = .best
            }
            return config
        }
    }
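
One likely cause, offered here as an assumption rather than a confirmed answer: display.width and display.height are in points, while SCStreamConfiguration.width and height are in pixels, so on a Retina display the screenshot is rendered at half resolution. A sketch of scaling by the filter's pointPixelScale on macOS 14:

    import ScreenCaptureKit

    @available(macOS 14.0, *)
    func sharpConfig(for filter: SCContentFilter) -> SCStreamConfiguration {
        let config = SCStreamConfiguration()
        let scale = CGFloat(filter.pointPixelScale)            // e.g. 2.0 on Retina displays
        config.width = Int(filter.contentRect.width * scale)   // contentRect is in points; width/height want pixels
        config.height = Int(filter.contentRect.height * scale)
        config.showsCursor = false
        return config
    }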

Replies: 1 · Boosts: 0 · Views: 622 · Activity: Oct ’23

SCStream.startCapture fails with Error received from the remote queue -16665
Trying to integrate the new ScreenCaptureKit into our application. The standalone test we made works fine; however, when integrated, starting the stream capture produces this error in the logs:

    (ScreenCaptureKit) [ERROR] _SCStream_RemoteAudioQueueOperationHandlerWithError:1032 Error received from the remote queue -16665

Any insights into what might be causing this? This is what we're passing to addStreamOutput:

    private let sampleQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".SampleQueue")

    self.stream = SCStream(filter: filter, configuration: self.streamConfig, delegate: self)
    do {
        try self.stream?.addStreamOutput(self, type: .screen, sampleHandlerQueue: self.sampleQueue)
    }

We have all the handlers and so on, pretty much verbatim from the Apple-provided sample.
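
Since the log line mentions the remote audio queue, one thing worth checking (an assumption based on the log, not a confirmed diagnosis) is whether the configuration enables audio without a matching .audio output attached:

    // If streamConfig.capturesAudio is true, attach an audio output as well,
    // mirroring the Apple sample, which adds outputs for both media types.
    try self.stream?.addStreamOutput(self, type: .audio, sampleHandlerQueue: self.sampleQueue)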

Replies: 0 · Boosts: 1 · Views: 446 · Activity: Oct ’23

iOS 17 RPSystemBroadcastPickerView not working
My existing code works properly on iOS < 17 devices: it records the iPhone screen and records audio simultaneously. On iOS 17 devices, however, the screen recording video is captured for only 2 seconds and then stops automatically. As it's an extension, I don't have logs to debug the issue. I have tested the same code on other iPhones running OS versions below 17 and it works fine, but on iOS 17 devices this issue occurs.

    @try {
        NSLog(@"initAssesWriter");
        NSError *error = nil;
        CGRect screenRect = [[UIScreen mainScreen] bounds];
        _videoWriter = [[AVAssetWriter alloc] initWithURL:_filePath fileType:AVFileTypeMPEG4 error:&error];
        NSParameterAssert(_videoWriter);

        // Configure video
        NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:2048*1024.0], AVVideoAverageBitRateKey,
            nil];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecTypeH264, AVVideoCodecKey,
            [NSNumber numberWithInt:screenRect.size.width * 4], AVVideoWidthKey,
            [NSNumber numberWithInt:screenRect.size.height * 4], AVVideoHeightKey,
            videoCompressionProps, AVVideoCompressionPropertiesKey,
            nil];
        _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
        _writerInput.expectsMediaDataInRealTime = YES;
        NSParameterAssert(_writerInput);
        NSParameterAssert([_videoWriter canAddInput:_writerInput]);
        [_videoWriter addInput:_writerInput];

        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
        NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
            [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
            [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
            [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
            [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
            nil];
        _audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        _audioWriterInput.expectsMediaDataInRealTime = YES; // seems to work slightly better
        NSParameterAssert(_audioWriterInput);
        NSParameterAssert([_videoWriter canAddInput:_audioWriterInput]);
        [_videoWriter addInput:_audioWriterInput];

        [_videoWriter setMovieFragmentInterval:CMTimeMake(1, 600)];
        [_videoWriter startWriting];
    } @catch (NSException *exception) {
    } @finally {
    }

    - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
        @try {
            if (!_isRecordingStarted) {
                [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                _isRecordingStarted = YES;
                [self saveFlurryLogs:@"Assest writer Start Recording" Details:@""];
                NSLog(@"CMSampleBufferGetPresentationTimeStamp");
            }
        } @catch (NSException *exception) {
            [self saveFlurryLogs:@"Recording Start Execption" Details:exception.description];
        } @finally {
        }

        @try {
            switch (sampleBufferType) {
                case RPSampleBufferTypeVideo:
                    // Handle video sample buffer
                    if ([_writerInput isReadyForMoreMediaData]) {
                        [_writerInput appendSampleBuffer:sampleBuffer];
                        NSLog(@"writing matadata Video");
                    }
                    break;
                case RPSampleBufferTypeAudioApp:
                    // Handle audio sample buffer for app audio
                    break;
                case RPSampleBufferTypeAudioMic:
                    // Handle audio sample buffer for mic audio
                    if ([_audioWriterInput isReadyForMoreMediaData]) {
                        [_audioWriterInput appendSampleBuffer:sampleBuffer];
                        NSLog(@"writing matadata Audio");
                    }
                    break;
                default:
                    break;
            }
        } @catch (NSException *exception) {
            [self saveFlurryLogs:@"Packet Write Execption" Details:exception.description];
        } @finally {
        }
    }

Replies: 1 · Boosts: 1 · Views: 675 · Activity: Oct ’23

CGWindowListCreateImage -> ScreenCaptureKit
So I'm building a colour sampler tool, similar to ColorSlurp, and since CGWindowListCreateImage (which I am using below macOS 14.0) is deprecated as of Sonoma, I am wondering what the best approach to replacing it is. The way I currently use CGWindowListCreateImage is to take a screenshot of a specified area around the mouse pointer every time the mouse moves, which works perfectly fine. I've tried replacing CGWindowListCreateImage with SCScreenshotManager.captureImage, which is an async function, and as you might expect, running async functions on mouse movements doesn't work out well: it lags behind heavily. So my question is: what are the appropriate ScreenCaptureKit methods to replace the functionality I already built with CGWindowListCreateImage? Should I create an SCStream instead? In that case I'm worried that I would need to stream the whole screen instead of just the area around the mouse pointer, since updating stream configurations is an async operation as well. I'd greatly appreciate any sort of direction!
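
One possible direction, sketched here rather than confirmed: keep using SCScreenshotManager, crop with SCStreamConfiguration.sourceRect so only a small region around the pointer is captured, and coalesce mouse-move events so that only the most recent request is in flight. CursorSampler is a made-up name.

    import ScreenCaptureKit

    @available(macOS 14.0, *)
    final class CursorSampler {
        private var pending: Task<CGImage?, Never>?

        func sample(around point: CGPoint, on display: SCDisplay, radius: CGFloat = 16) {
            pending?.cancel() // coalesce: only the latest mouse position matters
            pending = Task {
                let filter = SCContentFilter(display: display, excludingWindows: [])
                let config = SCStreamConfiguration()
                config.sourceRect = CGRect(x: point.x - radius, y: point.y - radius,
                                           width: radius * 2, height: radius * 2)
                config.width = Int(radius * 2)   // pixel output size; multiply by the display scale for Retina sharpness
                config.height = Int(radius * 2)
                config.showsCursor = false
                return try? await SCScreenshotManager.captureImage(contentFilter: filter,
                                                                   configuration: config)
            }
        }
    }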

Replies: 1 · Boosts: 2 · Views: 818 · Activity: May ’24
