I'm creating an app that listens to another app's sound; in this use case, screen data is not needed.
But if I don't call SCStream#addStreamOutput(_, type: .screen, ...), the console shows this error:
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame
Currently I'm setting SCStreamConfiguration#minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance.
Is there any way to disable screen capture and capture only an app's audio?
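As far as I know there is no supported way to disable the video pipeline entirely as of macOS 14. A sketch of one common mitigation, assuming a filter has already been built from SCShareableContent (all names besides the ScreenCaptureKit APIs are illustrative): shrink the video path to its minimum and register a .screen output whose handler simply drops frames.

```swift
import ScreenCaptureKit
import CoreMedia

// Sketch: minimize the cost of the mandatory video path while capturing audio.
// Assumes `filter` is an SCContentFilter built from SCShareableContent and
// `output` conforms to both SCStreamOutput and SCStreamDelegate.
func makeAudioFocusedStream(filter: SCContentFilter,
                            output: SCStreamOutput & SCStreamDelegate,
                            audioQueue: DispatchQueue,
                            videoQueue: DispatchQueue) throws -> SCStream {
    let config = SCStreamConfiguration()
    config.capturesAudio = true
    config.excludesCurrentProcessAudio = true
    // Shrink the unavoidable video work: tiny frame, very low rate.
    config.width = 2
    config.height = 2
    config.minimumFrameInterval = CMTime(value: 1, timescale: 1) // ~1 fps

    let stream = SCStream(filter: filter, configuration: config, delegate: output)
    // Registering a .screen output silences the "stream output NOT found"
    // error; the handler can simply ignore those frames.
    try stream.addStreamOutput(output, type: .screen, sampleHandlerQueue: videoQueue)
    try stream.addStreamOutput(output, type: .audio, sampleHandlerQueue: audioQueue)
    return stream
}
```

This keeps the video side as cheap as the API allows while the audio output does the real work.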
ScreenCaptureKit
ScreenCaptureKit brings high-performance screen capture, including audio and video, to macOS.
Posts under ScreenCaptureKit tag
37 Posts
Can I rely on the ScreenCaptureKit framework in production without worrying that it contains bugs or may be subject to breaking changes in the future?
I'm getting this error when I try to run the ScreenCaptureKit sample project on macOS Sonoma:
[ERROR] _SCStream_RemoteAudioQueueOperationHandlerWithError:1053 streamOutput NOT found. Dropping frame
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:1020 stream output NOT found. Dropping frame
Both stream outputs are being set like this:
try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
Link to sample project https://developer.apple.com/documentation/screencapturekit/capturing_screen_content_in_macos
Any idea of what is causing this?
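Not sure of the exact cause in the sample, but this error is typically logged when no output object is registered for that stream type at the moment a frame arrives, e.g. because the SCStreamOutput object was deallocated or the outputs were added after capture started. A sketch of an arrangement that avoids both (names are illustrative):

```swift
import ScreenCaptureKit
import CoreMedia

final class CaptureEngine: NSObject, SCStreamOutput {
    // Keep a strong reference: if nothing retains the stream (and the
    // output object), the remote queue finds no output and drops frames.
    private var stream: SCStream?
    private let videoQueue = DispatchQueue(label: "capture.video")
    private let audioQueue = DispatchQueue(label: "capture.audio")

    func start(filter: SCContentFilter, config: SCStreamConfiguration) async throws {
        let stream = SCStream(filter: filter, configuration: config, delegate: nil)
        // Add outputs BEFORE starting capture.
        try stream.addStreamOutput(self, type: .screen, sampleHandlerQueue: videoQueue)
        try stream.addStreamOutput(self, type: .audio, sampleHandlerQueue: audioQueue)
        try await stream.startCapture()
        self.stream = stream
    }

    func stream(_ stream: SCStream,
                didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        // Handle video/audio buffers here.
    }
}
```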
SCContentFilter in ScreenCaptureKit has 5 initializers to select the capture target.
Only one, initWithDesktopIndependentWindow:, accepts a single SCWindow as its capture target.
The other four target one SCDisplay plus SCRunningApplication or SCWindow combinations to filter out items by their criteria.
Compare the old APIs for capturing window or display snapshots:
CGImageRef CGWindowListCreateImage(CGRect screenBounds, CGWindowListOption listOption, CGWindowID windowID, CGWindowImageOption imageOption);
CGImageRef CGWindowListCreateImageFromArray(CGRect screenBounds, CFArrayRef windowArray, CGWindowImageOption imageOption);
The second method accepts an array of CGWindowID, so the result is a composite image formed from these windows.
Will Apple provide a substitute in SCFilter for this method in the future?
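For reference, the five initializers in their Swift spellings are sketched below, assuming `window: SCWindow`, `display: SCDisplay`, `windows: [SCWindow]`, and `apps: [SCRunningApplication]` are already in scope:

```swift
import ScreenCaptureKit

// The five SCContentFilter initializers (Swift spellings).
func makeFilters(window: SCWindow, display: SCDisplay,
                 windows: [SCWindow], apps: [SCRunningApplication]) -> [SCContentFilter] {
    [
        SCContentFilter(desktopIndependentWindow: window),
        SCContentFilter(display: display, excludingWindows: []),
        SCContentFilter(display: display, including: windows),
        SCContentFilter(display: display, including: apps, exceptingWindows: []),
        SCContentFilter(display: display, excludingApplications: [], exceptingWindows: [])
    ]
}
```

Note that init(display:including:) already composites multiple windows of one display into a single image, which covers part of what CGWindowListCreateImageFromArray did, though it does not accept windows from different displays the way a raw CGWindowID array could.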
I saw the ScreenCaptureKit WWDC23 video, just awesome!! Is there any sample project available with the latest features that I can reference to build my recording app?
Hi!
I am building a Mac Swift app to make screen recordings. I want to be able to record the whole screen, a window, or an area of the screen.
I am very new to Swift and am not sure if I should be using AVCaptureSession or ScreenCaptureKit's SCStream.
I am trying to understand the differences and when I should use one over the other. Can anyone point me in the right direction?
Thanks!
The screen recording permission window shows up every time I run the ScreenCaptureKit sample project on macOS Sonoma Beta 5,
even though I've already granted permission in the screen recording settings.
This box keeps popping up, and the app won't be able to record until I remove the old "CaptureSample" entry in System Settings > Screen Recording and add it again.
This issue happens many times a day!
Does anyone know how to fix it?
Thanks~
Hey All,
Do we have any solution for screen control of iOS devices from a Mac, like Vysor for Android? Appreciate your help here. Thanks!
Hey, how can I retrieve dirtyRect from SCScreenshotManager.captureSampleBuffer?
With streaming method, I can get it via attachments, but attachments are missing when using the SCScreenshotManager.
If I have to compute it manually, what would be the most performant way of doing so?
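For the streaming path the post mentions, the per-frame attachment can be read roughly as below; SCStreamFrameInfo.dirtyRects is the documented key, though the exact payload encoding (here assumed to be CGRect dictionary representations, as in Apple's capture sample) should be verified:

```swift
import ScreenCaptureKit
import CoreMedia

// Read dirty rects from a streamed frame's attachments. As the post
// observes, SCScreenshotManager results carry no such attachments.
func dirtyRects(from sampleBuffer: CMSampleBuffer) -> [CGRect] {
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(
              sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]],
          let dicts = attachments.first?[.dirtyRects] as? [CFDictionary]
    else { return [] }
    return dicts.compactMap { CGRect(dictionaryRepresentation: $0) }
}
```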
I want to develop an application where I can share the screen (display) from an iPhone (or from iPhone to iPhone) for remote support, or by any other method. For this I need a screen capture application, API, or system call, or any screen-sharing client available on the iPhone or the App Store that can give me screen capture support.
I am new to iOS. I want to develop an application where I can share the screen (display) from an iPhone (or from iPhone to iPhone) for remote support.
For this I need to find or develop a screen capture application. Can any Apple developer help me with this? I want to chat and connect with an Apple developer. If any freelancer is available, please connect with me.
Hey team, I'm looking to receive a delegate callback in Sonoma for ScreenCaptureKit when the "Stop Sharing" control in the toolbar is clicked. Does this come through in any of these?
/**
@abstract stream:didStopWithError:
@param stream the SCStream object
@param error the error denoted by the stopping of the stream
@discussion notifies the delegate that the stream has stopped and the error associated with it
*/
optional func stream(_ stream: SCStream, didStopWithError error: Error)
/**
@abstract outputVideoEffectDidStartForStream:
@param stream the SCStream object
@discussion notifies the delegate that the stream's overlay video effect has started.
*/
@available(macOS 14.0, *)
optional func outputVideoEffectDidStart(for stream: SCStream)
/**
@abstract outputVideoEffectDidStopForStream:
@param stream the SCStream object
@discussion notifies the delegate that the stream's overlay video effect has stopped.
*/
@available(macOS 14.0, *)
optional func outputVideoEffectDidStop(for stream: SCStream)
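A sketch of wiring the delegate up, on the assumption that the toolbar's "Stop Sharing" control ends the stream the same way any external stop does, which would arrive through stream(_:didStopWithError:); I haven't verified which error code it carries:

```swift
import ScreenCaptureKit

final class StreamObserver: NSObject, SCStreamDelegate {
    var onStopped: ((Error) -> Void)?

    // Called when the stream stops for an external reason, which should
    // include the user ending capture from the system UI.
    func stream(_ stream: SCStream, didStopWithError error: Error) {
        onStopped?(error)
    }

    @available(macOS 14.0, *)
    func outputVideoEffectDidStart(for stream: SCStream) {
        // Presenter overlay became active; not related to Stop Sharing.
    }
}
```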
I'm encountering performance degradation with my application that utilizes ScreenCaptureKit. Even after explicitly disabling App Nap using the NSAppSleepDisabled key, the problem persists.
My application, which relies heavily on ScreenCaptureKit for its core functionality, experiences significant performance drops after running for a short period. When I click on the application, the performance momentarily returns to normal but quickly deteriorates again.
I've checked for memory leaks in my application and haven't found any issues in that regard.
Has anyone experienced similar performance issues with ScreenCaptureKit? I'm keen to know if there are any known bugs or workarounds to mitigate this problem.
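Beyond the NSAppSleepDisabled Info.plist key, App Nap and timer throttling can also be held off programmatically for the duration of a capture. A sketch using the plain Foundation activity API (not ScreenCaptureKit-specific):

```swift
import Foundation

// Hold an activity token while capturing so the process is not napped
// or throttled; end it when capture stops.
final class NapGuard {
    private var token: NSObjectProtocol?

    func begin() {
        token = ProcessInfo.processInfo.beginActivity(
            options: [.userInitiated, .latencyCritical, .idleSystemSleepDisabled],
            reason: "Screen capture in progress"
        )
    }

    func end() {
        if let token {
            ProcessInfo.processInfo.endActivity(token)
            self.token = nil
        }
    }
}
```

Whether this helps depends on whether the degradation is in your process at all; if the drop is inside the WindowServer-side capture, an activity assertion won't change it.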
I work on a screen recorder app and having issues with the new presenter overlay mode on macOS 14. Switching to the "Small" overlay is fine, but switching to the "Large" overlay mode causes our AVAssetWriter to fail every time with the following error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x6000028729a0 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}
which doesn't provide any helpful information. I'm not sure if we're doing something wrong, but I've tried to reduce our code as much as possible and still get the crash. I'm not sure if anyone has any clues or is experiencing the same thing?
Alternatively, is there a way to disable presenter overlay until it's fixed? Our app displays a camera and uses ScreenCaptureKit to record the screen along with the camera, which automatically enables presenter overlay options. I can't find any way to opt-out or turn off the presenter overlay options which is a bummer. That seems like it should be controllable from either the AVCaptureSession or SCStreamConfiguration
I am using ScreenCaptureKit to create a screenshot software,
but I found that the screenshot captured by the new API, SCScreenshotManager.captureImage, is very blurry.
This is my screenshot. It is so blurry.
But I hope it's like this.
My code is as follows.
func captureScreen(windows: [SCWindow], display: SCDisplay) async throws -> CGImage? {
    let availableWindows = windows.filter { window in
        Bundle.main.bundleIdentifier != window.owningApplication?.bundleIdentifier
    }
    let filter = SCContentFilter(display: display, including: availableWindows)
    if #available(macOS 14.0, *) {
        let image = try? await SCScreenshotManager.captureImage(
            contentFilter: filter,
            configuration: SCStreamConfiguration.defaultConfig(
                width: display.width,
                height: display.height
            )
        )
        return image
    } else {
        return nil
    }
}

extension SCStreamConfiguration {
    static func defaultConfig(width: Int, height: Int) -> SCStreamConfiguration {
        let config = SCStreamConfiguration()
        config.width = width
        config.height = height
        config.showsCursor = false
        if #available(macOS 14.0, *) {
            config.captureResolution = .best
        }
        return config
    }
}
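A common cause of blurry output like this is a points-versus-pixels mismatch on Retina displays: SCDisplay's width and height are in points, while SCStreamConfiguration's width and height are in pixels, so passing display.width straight through captures at half resolution. A sketch of the conversion; on macOS 14 the scale can be read from SCContentFilter.pointPixelScale, but here it is taken as a parameter so the arithmetic is plain:

```swift
// Convert a display's point size to the pixel size SCStreamConfiguration
// expects; `scale` is the backing scale factor (2 on most Retina panels).
func pixelDimensions(pointWidth: Int, pointHeight: Int, scale: Int) -> (width: Int, height: Int) {
    (pointWidth * scale, pointHeight * scale)
}

// e.g. a 1728x1117-point Retina display at scale 2:
let dims = pixelDimensions(pointWidth: 1728, pointHeight: 1117, scale: 2)
// dims == (3456, 2234)
```

With those pixel dimensions fed into defaultConfig, the screenshot should come out at full backing resolution.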
Trying to integrate the new screencapturekit into our application.
The stand alone test we made works fine, however when integrated, when we start the stream capture we get this error in the logs
(ScreenCaptureKit) [ERROR] _SCStream_RemoteAudioQueueOperationHandlerWithError:1032 Error received from the remote queue -16665
Any insights what might be causing this?
This is what we're passing to addStreamOutput:
    private let sampleQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".SampleQueue")

    self.stream = SCStream(filter: filter, configuration: self.streamConfig, delegate: self)
    do {
        try self.stream?.addStreamOutput(self, type: .screen, sampleHandlerQueue: self.sampleQueue)
    } catch {
        // log the error
    }
We have all the handlers and whatnot, pretty much verbatim from the Apple-provided sample.
My existing code works properly on iOS < 17 devices: it records the iPhone screen and audio simultaneously. But on iOS 17 devices the screen recording captures only 2 seconds of video and then stops automatically. As it's an extension, I don't have logs to debug the issue.
I have tested the same code on other iPhones running OS versions below 17 and it works fine, but on iOS 17 devices this issue occurs.
@try {
    NSLog(@"initAssetWriter");
    NSError *error = nil;
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    _videoWriter = [[AVAssetWriter alloc] initWithURL:_filePath
                                             fileType:AVFileTypeMPEG4
                                                error:&error];
    NSParameterAssert(_videoWriter);

    // Configure video
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble:2048 * 1024.0], AVVideoAverageBitRateKey,
        nil];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecTypeH264, AVVideoCodecKey,
        [NSNumber numberWithInt:screenRect.size.width * 4], AVVideoWidthKey,
        [NSNumber numberWithInt:screenRect.size.height * 4], AVVideoHeightKey,
        videoCompressionProps, AVVideoCompressionPropertiesKey,
        nil];
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;
    NSParameterAssert(_writerInput);
    NSParameterAssert([_videoWriter canAddInput:_writerInput]);
    [_videoWriter addInput:_writerInput];

    AudioChannelLayout acl;
    bzero(&acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
    NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
        [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
        nil];
    _audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                           outputSettings:audioOutputSettings];
    _audioWriterInput.expectsMediaDataInRealTime = YES; // seems to work slightly better
    NSParameterAssert(_audioWriterInput);
    NSParameterAssert([_videoWriter canAddInput:_audioWriterInput]);
    [_videoWriter addInput:_audioWriterInput];
    [_videoWriter setMovieFragmentInterval:CMTimeMake(1, 600)];
    [_videoWriter startWriting];
} @catch (NSException *exception) {
} @finally {
}
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    @try {
        if (!_isRecordingStarted) {
            [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            _isRecordingStarted = YES;
            [self saveFlurryLogs:@"Asset writer Start Recording" Details:@""];
            NSLog(@"CMSampleBufferGetPresentationTimeStamp");
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Recording Start Exception" Details:exception.description];
    } @finally {
    }

    @try {
        switch (sampleBufferType) {
            case RPSampleBufferTypeVideo:
                // Handle video sample buffer
                if ([_writerInput isReadyForMoreMediaData]) {
                    [_writerInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata Video");
                }
                break;
            case RPSampleBufferTypeAudioApp:
                // Handle audio sample buffer for app audio
                break;
            case RPSampleBufferTypeAudioMic:
                // Handle audio sample buffer for mic audio
                if ([_audioWriterInput isReadyForMoreMediaData]) {
                    [_audioWriterInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata Audio");
                }
                break;
            default:
                break;
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Packet Write Exception" Details:exception.description];
    } @finally {
    }
}
So I'm building a colour sampler tool, similar to ColorSlurp, and since CGWindowListCreateImage (which I am using below macOS 14.0) has been deprecated since Sonoma, I am wondering what's the best approach to replacing it.
The way I use CGWindowListCreateImage currently is to take a screenshot of a specified area around the mouse pointer every time the mouse moves, which works perfectly fine without issues.
Now I've tried replacing CGWindowListCreateImage with SCScreenshotManager.captureImage, which is an async function, and as you might expect, running async functions on mouse movements doesn't quite work out that well. It lags behind, heavily.
So my question is: what are the appropriate ScreenCaptureKit methods to replace the functionality already built on CGWindowListCreateImage? Should I create an SCStream instead? Then I am worried that I would need to stream the whole screen instead of just the area around the mouse pointer, since updating stream configs is an async function as well.
I'd greatly appreciate any sort of direction!
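One possibility, assuming per-call screenshots remain acceptable, is to keep using SCScreenshotManager but restrict capture to the region around the pointer via SCStreamConfiguration.sourceRect, and coalesce mouse-move events so only the newest request actually runs. A sketch (the class and its names are illustrative, and the 2x backing scale is an assumption):

```swift
import ScreenCaptureKit

// Capture only a small region around the pointer, dropping stale requests
// so rapid mouse moves don't queue up captures behind each other.
@available(macOS 14.0, *)
final class RegionSampler {
    private var inFlight = false
    private var pending: CGRect?

    func sample(around rect: CGRect, filter: SCContentFilter) {
        pending = rect
        guard !inFlight else { return }  // coalesce: keep only the newest rect
        inFlight = true
        Task { @MainActor in
            while let rect = self.pending {
                self.pending = nil
                let config = SCStreamConfiguration()
                config.sourceRect = rect            // points, in display space
                config.width = Int(rect.width) * 2  // assumed 2x backing scale
                config.height = Int(rect.height) * 2
                config.showsCursor = false
                if let image = try? await SCScreenshotManager.captureImage(
                        contentFilter: filter, configuration: config) {
                    self.handle(image)
                }
            }
            self.inFlight = false
        }
    }

    private func handle(_ image: CGImage) {
        // Sample the colour under the pointer here.
    }
}
```

This avoids a persistent full-screen SCStream entirely; whether the per-call latency is low enough for a live colour loupe would need to be measured.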
The old one is "x-apple.systempreferences:com.apple.preference.security?Privacy_ScreenCapture"