I need to read the results of a test run in an Xcode Test post-action.
The script that runs after the tests contains:
SUMMARY=$(xcrun xcresulttool get test-results summary --path "$RESULT_FILE" 2>> "$LOGS")
If I run the script from Terminal everything works, but when I run the tests and the post-action executes, I get:
Warning: unknown environment variable SWIFT_DEBUG_INFORMATION_FORMAT
Warning: unknown environment variable SWIFT_DEBUG_INFORMATION_VERSION
Warning: unknown environment variable SWIFT_DEBUG_INFORMATION_FORMAT
Warning: unknown environment variable SWIFT_DEBUG_INFORMATION_VERSION
Error: Failed to create a new result bundle reader, underlying error: failed to read metadata with underlying error (type: FileSystemError: 3 - [:] - The operation couldn’t be completed. (MinimalTSCBasic.FileSystemError error 3.)
Usage: xcresulttool <subcommand>
See 'xcresulttool --help' for more information.
It might be that the environment set by Xcode is missing something, but I cannot figure out what. Any idea?
I encountered an issue when implementing the WKUIDelegate and NSItemProviderWriting protocols. Below is a minimal example that reproduces the issue:
import UIKit
import WebKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}

class ItemProvider: NSObject, NSItemProviderWriting {
    static var writableTypeIdentifiersForItemProvider: [String] {
        fatalError()
    }

    func loadData(
        withTypeIdentifier typeIdentifier: String,
        forItemProviderCompletionHandler completionHandler: @escaping @Sendable (Data?, (any Error)?) -> Void
    ) -> Progress? {
        fatalError()
    }
}

extension ViewController: WKUIDelegate {
    func webView(
        _ webView: WKWebView,
        runJavaScriptAlertPanelWithMessage message: String,
        initiatedByFrame frame: WKFrameInfo,
        completionHandler: @escaping @MainActor @Sendable () -> Void
    ) {
        fatalError()
    }
}
When the WKUIDelegate conformance is implemented after the NSItemProviderWriting conformance, the following warning appears:
Instance method 'webView(_:runJavaScriptAlertPanelWithMessage:initiatedByFrame:completionHandler:)' nearly matches optional requirement 'webView(_:runJavaScriptAlertPanelWithMessage:initiatedByFrame:completionHandler:)' of protocol 'WKUIDelegate'
The target's SWIFT_VERSION is 6.0.
I would like to develop a visionOS app for Apple Vision Pro that includes a group video calling feature. My app needs to support communication not only with other Vision Pro users but also with users on mobile apps or web-based platforms.
Is the ACS SDK compatible with visionOS? If not, are there other communication SDKs or APIs that support integrating video calling across different platforms (visionOS, iOS, web)?
Any insights on handling group video calls in a spatial computing environment would be greatly appreciated.
Thank you!
While working in Xcode, I got the error "Communication with Apple Failed" under Signing & Capabilities. What does this mean and how can it be fixed?
It appears that AVAudioPlayer is maintaining a strong reference to my containing class. Here is the essential code. Pay attention to the comments.
class StethRecording: NSObject, ObservableObject, Identifiable {
    let player: AVAudioPlayer?
    let id = UUID()

    @Published var isPlaying = false
    @Published var progress = 0.0

    init(file: AVAudioFile) throws {
        player = try AVAudioPlayer(contentsOf: file.url)
        super.init()
        // I used to assign the player delegate here.
        // If I do that, when I delete this object, it
        // doesn't go away.
        player!.prepareToPlay()
    }

    deinit {
        // If this object doesn't go away, I leave data
        // behind. Something I don't want to do.
        try? deleteAssociatedAudioFile()
    }

    func play() {
        guard let player else { return }
        // So now I have to assign the delegate whenever
        // I start playing.
        player.delegate = self
        isPlaying = true
        player.play()
        startUpdateTimer()
    }

    func stop() {
        guard let player else { return }
        player.stop()
        playbackConcluded()
    }

    // MARK: - Private Methods

    private func playbackConcluded() {
        isPlaying = false
        stopUpdateTimer()
        updateProgress()
        player!.reset()
        // I also have to remove the delegate when I
        // stop, for any reason.
        player!.delegate = nil
        player!.prepareToPlay()
    }
}

extension StethRecording: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playbackConcluded()
    }
}
This works, but is this approach really necessary? I would expect the AVAudioPlayer to use a weak reference for the delegate. Or, am I doing something else wrong here?
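For reference, here is a minimal sketch of one common workaround, assuming the retention really is caused by the delegate assignment: route the delegate callbacks through a small proxy object that holds the real target weakly, so the delegate can stay assigned for the player's whole lifetime without keeping the owning object alive. The class and property names below are illustrative, not part of the original code.

import AVFoundation

// Hypothetical proxy: forwards delegate callbacks while holding the real
// target weakly, so assigning it as the player's delegate cannot retain
// the owning object.
final class WeakAudioPlayerDelegate: NSObject, AVAudioPlayerDelegate {
    weak var target: AVAudioPlayerDelegate?

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        target?.audioPlayerDidFinishPlaying?(player, successfully: flag)
    }

    func audioPlayerDecodeErrorDidOccur(_ player: AVAudioPlayer, error: Error?) {
        target?.audioPlayerDecodeErrorDidOccur?(player, error: error)
    }
}

// Illustrative usage inside StethRecording:
//   private let delegateProxy = WeakAudioPlayerDelegate()
//   // in init:
//   delegateProxy.target = self
//   player?.delegate = delegateProxy   // assign once; no need to clear it on stop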
I successfully archived an app with Xcode 14.3.1 and AirDropped it onto my iPhone 13. But when I tap the app icon, the opening screen just flashes and the app does not run. The same thing happens on my iPad. Any suggestions on how to locate the reason for this behavior would be appreciated.
I created an Intent-based widget for my iOS app. The deployment target is iOS 16, but I'm sticking with Intent rather than AppIntent because I don't like how the new API's configuration list pops up compared with the searchable list in the old API.
The configuration works fine on iOS 17/18, but iOS 16 shows the error "unable to load" or just a blank view.
This is how it looks on iOS 17/18.
I don't see any specific errors or warnings in the console.
I updated Xcode and macOS recently and haven't been able to compile my Flutter app for iOS devices/simulators since then. The error changes every time I run it, but here's one of the outputs in the terminal after running flutter run:
Uncategorized (Xcode): Command SwiftGeneratePch emitted errors but did not return a nonzero exit code to indicate failure
Error (Xcode): no such file or directory: '/Users/chiragbhansali/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/Session.modulevalidation'
Error (Xcode): stat cache file '/Users/chiragbhansali/Library/Developer/Xcode/DerivedData/SDKStatCaches.noindex/iphonesimulator18.0-22A3362-db63dc9361471f152f572502bdbfe70a.sdkstatcache' not found
Error (Xcode): unable to rename temporary '/Users/chiragbhansali/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/3F9VRK3CXAUUD/UIKit-1KHQ7M05IFVXC-56601391.pcm.tmp' to output file '/Users/chiragbhansali/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/3F9VRK3CXAUUD/UIKit-1KHQ7M05IFVXC.pcm': 'No such file or directory'
Error (Xcode): could not build module 'UIKit'
/Users/chiragbhansali/Chirag/Coding/Projects/app/test/build/ios/Debug-iphonesimulator/Flutter.framework/Headers/FlutterAppDelegate.h:7:8
Error (Xcode): could not build module 'Flutter'
/Users/chiragbhansali/Chirag/Coding/Projects/app/test/ios/Runner/GeneratedPluginRegistrant.h:9:8
Error (Xcode): failed to emit precompiled header '/Users/chiragbhansali/Library/Developer/Xcode/DerivedData/Runner-eifcguceazlwumgsyzegclqdrbqt/Build/Intermediates.noindex/PrecompiledHeaders/Runner-Bridging-Header-swift_PB6A5GFLTNPC-clang_3F9VRK3CXAUUD.pch' for bridging header '/Users/chiragbhansali/Chirag/Coding/Projects/app/test/ios/Runner/Runner-Bridging-Header.h'
Uncategorized (Xcode): Command PrecompileSwiftBridgingHeader emitted errors but did not return a nonzero exit code to indicate failure
Uncategorized (Xcode): Command SwiftEmitModule failed with a nonzero exit code
Uncategorized (Xcode): Command SwiftCompile failed with a nonzero exit code
Could not build the application for the simulator.
Error launching application on iPhone 16 Pro.
What I have tried so far:
Deleting the iOS SDK and simulators
Cleaning the Xcode build cache with Cmd+Shift+K
Creating a new Flutter project and trying to compile it (it failed, which reduces the chances of this being a Flutter issue)
Clearing the DerivedData folder
Clearing the settings and caches of the simulators
Restarting the laptop
Versions:
Xcode: 16.0
iOS: 18.0
macOS: 15.1 (didn't work with 15.0 either)
Flutter: 3.24
Apple M3 Pro chip, 14" MacBook Pro (Nov 2023)
Xcode Version 16.0 (16A242d)
macOS 14.7
NetworkLinkConditioner details: created 9 Aug 2024, source version 93000000000000, build version 2612, CFBundleVersion 2.0
When I go to the Network Link Conditioner panel and try to switch it on, the screen just goes blank, or the button freezes, or it moves to On but then the interface freezes and goes blank without applying the conditions I've selected. The panel then won't reopen if I go to another settings panel and click back on it, unless I restart System Settings.
I had it working a few weeks ago; what has changed since then is that I upgraded from an older version of macOS 14 to 14.7 and from Xcode 15 to Xcode 16.
What I've tried: I downloaded the Xcode 16 tools, right-clicked the existing panel to remove it, and then installed the panel again from the Hardware folder in the download, but unfortunately this didn't help, even after restarting the MacBook.
I have the newest Xcode and macOS.
When I try to check Pull Request changes and click on any changed/added file, instead of the file view I see an infinite spinner.
This happens with all files.
I've tried reinstalling Xcode, reinstalling macOS, and cloning a fresh repo.
The feature works maybe once every 10-20 tries.
Also, when I try to manually compare uncommitted changes against some commit in Code Review mode, I don't see any changes and no error appears.
When I use the git command line in Terminal, everything works and I get the changes.
What can I do to fix this problem?
I've tried the Xcode 16.1 beta, and I can't go back to Xcode 15.
Does anyone have this problem? Any ideas?
I am getting this error while archiving the app. It started happening after upgrading to the latest macOS Sequoia and Xcode 16:
env: node: No such file or directory
Command PhaseScriptExecution failed with a nonzero exit code
It was working when I was on Xcode 15.4.
Hi, I'm using Xcode to test ML model performance. When I create a performance report on my Mac, I get the report with prediction, load, and compilation times in ms. But when I test the performance on an iPhone 16 Pro, although the test completes normally, the prediction/load/compilation times are always 0, and there is no Compute Unit Mapping below.
Thanks for your help.
I made an update to my app's code to make use of the new limited Contacts access authorization status (CNAuthorizationStatusLimited), like so:
if (@available(iOS 18.0, *)) {
    switch ([CNContactStore authorizationStatusForEntityType:CNEntityTypeContacts]) {
        case CNAuthorizationStatusLimited:
            <snip>
However, I later discovered there's another, totally unrelated issue that only manifests when the app is built with Xcode 16. It isn't a trivial change to work around, so for now I would like to make a release to the App Store that makes use of the new CNAuthorizationStatusLimited status but is built using Xcode 15.
However, building with Xcode 15 results in a "Use of undeclared identifier 'CNAuthorizationStatusLimited'" error.
If the code were making use of a new API, I could work around it using a selector, for example. But as this is an enum case, is there any workaround possible, or is it just not possible to build with Xcode 15 while the source code contains references to CNAuthorizationStatusLimited?
I am very new to iOS development and am currently working on a Network Extension. I have been trying to fetch the parent process ID using the control filter. Is there a way to fetch it using the control filter or the data filter?
When I try to download it, I receive this error:
The operation couldn’t be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
User Info: {
DVTErrorCreationDateKey = "2024-10-23 02:10:29 +0000";
}
Failed to find asset: com.apple.fm.code.generate_small_v1.tokenizer - no asset
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
System Information
macOS Version 15.1 (Build 24B82)
Xcode 16.0 (23051) (Build 16A242d)
Timestamp: 2024-10-22T23:10:29-03:00
I've already tried changing Wi-Fi networks and restarting my Mac. What else can I try?
Thanks for any help
Hi,
I am currently running Xcode 14.3 on a 2015 MacBook using macOS 12.6.
I built an app and then tried to deploy it to App Store Connect, where I get this error:
SDK version issue. This app was built with the iOS 16.2 SDK. All iOS and iPadOS apps must be built with the iOS 17 SDK or later, included in Xcode 15 or later, in order to be uploaded to App Store Connect or submitted for distribution. (ID: 9431bf9a-6b21-4270-932d-d01b23a47691)
To distribute my app, I need the iOS 17 SDK. To have the iOS 17 SDK, I need Xcode 15 or later. To get Xcode 15 or later, I need macOS 14.5. To get macOS 14.5, I would need a new machine, since 14.5 is not supported on 2015 MacBooks.
Does anyone have any suggestions on how I can publish my app in this situation without needing a new MacBook?
Here is some code I have to create an AVAudioFile instance based on Int16 samples.
let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100.0, channels: 2, interleaved: false)!
let audioFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)
When writing to the file, I get the following runtime error, presumably from Core Audio:
CABufferList.h:184 ASSERTION FAILURE [(nBytes <= buf->mDataByteSize) != 0 is false]:
I read this as a size mismatch between what is specified in the format used to create the file and the file's own internal processingFormat property, which is read-only. Here is my debugger console output showing the input format I created, along with the resulting AVAudioFile fileFormat and processingFormat properties.
(lldb) po format
<AVAudioFormat 0x300e553b0: 2 ch, 44100 Hz, Int16, deinterleaved>
(lldb) po format.settings
▿ 7 elements
▿ 0 : 2 elements
- key : "AVNumberOfChannelsKey"
- value : 2
▿ 1 : 2 elements
- key : "AVLinearPCMBitDepthKey"
- value : 16
▿ 2 : 2 elements
- key : "AVFormatIDKey"
- value : 1819304813
▿ 3 : 2 elements
- key : "AVLinearPCMIsNonInterleaved"
- value : 1
▿ 4 : 2 elements
- key : "AVLinearPCMIsBigEndianKey"
- value : 0
▿ 5 : 2 elements
- key : "AVLinearPCMIsFloatKey"
- value : 0
▿ 6 : 2 elements
- key : "AVSampleRateKey"
- value : 44100
(lldb) po audioFile.fileFormat
<AVAudioFormat 0x300ea5400: 2 ch, 44100 Hz, Int16, interleaved>
(lldb) po audioFile.processingFormat
<AVAudioFormat 0x300ea5450: 2 ch, 44100 Hz, Float32, deinterleaved>
Please note that the input format I'm using matches neither the file's fileFormat nor its processingFormat. The fileFormat is interleaved even though I specified deinterleaved; that makes sense to me, since working with growing audio files is easier and more efficient with interleaved data. The head-scratcher is the processingFormat: I specified Int16 samples and it expects Float32? According to the format settings dictionary, we are specifying the correct key/value pairs.
Is this expected behavior? Does Apple always insist on Float32 internally or is this a bug?
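For reference, a minimal sketch of one way to sidestep the mismatch, assuming the goal is to hand the file Int16 buffers directly: AVAudioFile has an initializer that lets you request the processing format explicitly instead of accepting the default standard (Float32, deinterleaved) format. The outputURL below is the same hypothetical URL as in the snippet above.

import AVFoundation

// Sketch only: request an Int16 processing format explicitly so that
// write(from:) accepts buffers in the same format the file was created with.
// outputURL is assumed to exist already.
let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                           sampleRate: 44100.0,
                           channels: 2,
                           interleaved: false)!
let audioFile = try AVAudioFile(forWriting: outputURL,
                                settings: format.settings,
                                commonFormat: .pcmFormatInt16,
                                interleaved: false)
// audioFile.processingFormat should now report Int16, deinterleaved,
// while the on-disk fileFormat is still derived from the settings dictionary.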
Gradients with colors that have alpha are not rendered correctly anymore. I made a simple project to illustrate this. Just create an Objective-C project and paste this code into the ViewController file:
#import <UIKit/UIKit.h>

@interface CustomView : UIView

@property (nonatomic, strong) NSArray<NSNumber *> *colorsArray; // The color components array

// Custom initializer that accepts an NSArray of color components
- (instancetype)initWithFrame:(CGRect)frame colors:(NSArray<NSNumber *> *)colorsArray;

@end

@implementation CustomView

// Custom initializer
- (instancetype)initWithFrame:(CGRect)frame colors:(NSArray<NSNumber *> *)colorsArray {
    self = [super initWithFrame:frame];
    if (self) {
        _colorsArray = colorsArray; // Store the colors array
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    // Get the current context
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Convert NSArray to a C-style array of CGFloats
    size_t count = self.colorsArray.count;
    CGFloat colors[count];
    for (size_t i = 0; i < count; i++) {
        colors[i] = [self.colorsArray[i] floatValue];
    }

    // Create a color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create the gradient with the passed colors
    CGGradientRef gradient = CGGradientCreateWithColorComponents(colorSpace, colors, NULL, count / 4);

    // Define the start and end points of the gradient
    CGPoint startPoint = CGPointMake(rect.origin.x, rect.origin.y);
    CGPoint endPoint = CGPointMake(rect.origin.x, rect.origin.y + rect.size.height);

    // Draw the rectangle with the gradient
    CGContextSaveGState(context);
    CGContextAddRect(context, rect);
    CGContextClip(context);
    CGContextDrawLinearGradient(context, gradient, startPoint, endPoint, 0);
    CGContextRestoreGState(context);

    // Release resources
    CGGradientRelease(gradient);
    CGColorSpaceRelease(colorSpace);
}

@end

@interface ViewController : UIViewController
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Get the screen bounds
    CGRect screenBounds = [UIScreen mainScreen].bounds;

    // Define the size of each custom view
    CGFloat customViewWidth = screenBounds.size.width * 0.75;
    CGFloat customViewHeight = screenBounds.size.height * 0.75;

    // Define a dynamic set of colors for the first custom view (red to blue)
    NSArray<NSNumber *> *colorsArray1 = @[
        @1.0, @0.0, @0.0, @1.0, // Red
        @0.0, @0.0, @1.0, @1.0  // Blue
    ];

    // TODO: This is the bug, there is no transparency
    NSNumber *alpha = @0.0; // this is the bug

    // Define a dynamic set of colors for the second custom view (green to yellow)
    NSArray<NSNumber *> *colorsArray2 = @[
        @0.0, @1.0, @0.0, alpha, // Green
        @1.0, @1.0, @0.0, @1.0   // Yellow
    ];

    // Calculate the position for the first view (centered horizontally and vertically, with slight offset)
    CGRect frame1 = CGRectMake((screenBounds.size.width - customViewWidth) / 2 - customViewWidth * 0.25,
                               (screenBounds.size.height - customViewHeight) / 2 - customViewHeight * 0.25,
                               customViewWidth, customViewHeight);
    CustomView *customView1 = [[CustomView alloc] initWithFrame:frame1 colors:colorsArray1];

    // Calculate the position for the second view (slightly shifted from the first view to partially overlap)
    CGRect frame2 = CGRectMake((screenBounds.size.width - customViewWidth) / 2 + customViewWidth * 0.25,
                               (screenBounds.size.height - customViewHeight) / 2 + customViewHeight * 0.25,
                               customViewWidth, customViewHeight);
    CustomView *customView2 = [[CustomView alloc] initWithFrame:frame2 colors:colorsArray2];

    // Set autoresizing so they adjust with the screen size
    customView1.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
    customView2.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

    // Add the custom views to the view controller's view
    [self.view addSubview:customView1];
    [self.view addSubview:customView2];
}

@end
I installed a custom font (Font Awesome) in my app. I triple-checked that I did everything right: the font files are included in the bundle (they appear in the "Copy Bundle Resources" build phase) and the names of the fonts appear in the Info.plist file under "Fonts provided by application".
In Interface Builder, I select a Label, set the font to "Custom", then click the Family list to select the font I want.
Once or twice, I was actually able to see the Font Awesome fonts in this list and select one. However, they no longer appear there when I create new labels in new views. I do not understand why. I've been limping along by copying a label from one of the views where it worked and pasting it into the new view, but this is tiresome.
I know the fonts are installed correctly because I can see them when I run the app.
Why are the fonts not showing up in the font list in Interface Builder?
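For reference, a small diagnostic sketch (it only checks runtime registration, not the Interface Builder list): dump every font family and face the app can see, to confirm the bundle and Info.plist entries are being picked up.

import UIKit

// Sketch: print all registered font families and faces at launch,
// e.g. from application(_:didFinishLaunchingWithOptions:), to confirm
// the custom faces are actually registered at runtime.
for family in UIFont.familyNames.sorted() {
    print(family, UIFont.fontNames(forFamilyName: family))
}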
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSSet intersectsSet:]: set argument is not an NSSet'