Discuss Swift.

Double value cannot be converted to UInt8 because the result would be less than UInt8.min
I have a Swift callback function that accumulates data from the CMGyroData gyroscope handler into `value`, and takes care that it never goes below zero. Any thoughts on why I get this crash?

```swift
var value: Double = 90

func test(d: Double) {
    value -= d
    if value < 0 {
        value = 0
    } else if value > 180 {
        value = 180
    }
    // Double value cannot be converted to UInt8 because the result would be less than UInt8.min
    // According to Xcode, value is 0 here.
    let angle = UInt8(value.rounded())
    print(angle)
}
```

During the crash, the debugger shows that `value` is 0, and the raw memory is 0x0000000000000000. I've heard about negative zero, but then the raw memory would be 0x8000000000000000. Either way, it makes no sense for the UInt8 initializer to fail.

```
(lldb) p value
(Double) 0
(lldb) p value.rounded()
(Double) 0
(lldb) p UInt8(value.rounded())
(UInt8) 0
```

Xcode 15.1, iPhone 7 Plus, iOS 15.8

Edit: For good measure, I changed that line to `let angle = UInt8(Int(value.rounded()))`.
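One possibility worth ruling out: the clamp above lets non-finite values straight through, because `value < 0` and `value > 180` are both false for NaN, and converting a non-finite Double traps. A minimal sketch of a conversion that cannot trap, reusing the 0...180 bounds from the post (the function name is invented):

```swift
import Foundation

/// Converts a Double to UInt8, clamping to 0...180 and treating
/// non-finite input (NaN, ±infinity) as 0 instead of trapping.
func safeAngle(_ value: Double) -> UInt8 {
    guard value.isFinite else { return 0 }   // UInt8(_:) traps on NaN and infinity
    let clamped = min(max(value, 0), 180)    // same bounds as the original code
    return UInt8(clamped.rounded())
}

print(safeAngle(90.4))  // 90
print(safeAngle(-3))    // 0
print(safeAngle(.nan))  // 0 instead of a crash
```

`UInt8(exactly:)` is the other defensive option, when a nil result is preferable to clamping.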
7 replies · 0 boosts · 1.4k views · Dec ’23

How to access a directly attached UVC camera with AVPlayer?
On macOS Sonoma I have a SwiftUI app that correctly plays remote video files and local video files from the app bundle. Where I'm having trouble is setting up the AVPlayer URL for a UVC camera device directly connected to the Mac.

```swift
let url = URL(string: "https://some-remote-video.mp4")!
player = AVPlayer(url: url)
player.play()
```

Is there some magic to using a UVC device with AVPlayer, or do I need to access the UVC device differently?

Thanks, Grahm
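For what it's worth, AVPlayer plays media resources (files and streams), not capture devices; a directly attached UVC camera is normally read through the AVFoundation capture stack instead. A minimal sketch, assuming macOS 14, granted camera permission, and one connected external camera:

```swift
import AVFoundation

// A UVC camera is a capture device, not a playable URL.
// Discover the external camera first:
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],   // .externalUnknown on macOS 13 and earlier
    mediaType: .video,
    position: .unspecified
)
guard let camera = discovery.devices.first else {
    fatalError("No UVC camera found")
}

// Feed it into a capture session and show it with a preview layer
// (e.g. wrapped in an NSViewRepresentable) instead of an AVPlayer.
let session = AVCaptureSession()
let input = try! AVCaptureDeviceInput(device: camera)
if session.canAddInput(input) { session.addInput(input) }

let previewLayer = AVCaptureVideoPreviewLayer(session: session)
session.startRunning()
```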
1 reply · 0 boosts · 750 views · Dec ’23

How to debug EXC_BAD_ACCESS / KERN_INVALID_ADDRESS
Hi all, I'm trying to find the reason why my app is crashing and fix it. This only happens on some devices; others work fine. I have no idea how to proceed, and any hint on what to do will be really appreciated. I am in a test stage using TestFlight, and I am using Crashlytics to collect the errors. The app uses Firebase services (push notifications, Firestore, Crashlytics), Google Maps, Realm, and more. The error is below:

```
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000018
Crashed: com.apple.main-thread
0  myapp            0x3dcd4c gmscore::vector::TextureAtlasElement::height() const + 72916
1  myapp            0xe9e50  (anonymous namespace)::BubbleBehavior::Commit(gmscore::renderer::EntityRenderer*) + 4373995088
2  myapp            0x2d14ac gmscore::renderer::EntityRenderer::Draw(bool, bool) + 340
3  myapp            0x39d2dc -[GMSPhoenixRenderer drawForced:deferredPresentation:] + 13156
4  myapp            0x37ed2c -[GMSEntityRendererView draw] + 116072
5  myapp            0x3c2aa0 -[GMSVectorMapView draw] + 166696
6  myapp            0x37d140 -[GMSEntityRendererView displayLinkFired:] + 108924
7  myapp            0x37b4a8 -[GMSDisplayLink displayLinkFired:] + 101604
8  QuartzCore       0x2fa14  CA::Display::DisplayLinkItem::dispatch_(CA::SignPost::Interval<(CA::SignPost::CAEventCode)835322056>&) + 48
9  QuartzCore       0x32bfc  CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long) + 864
10 QuartzCore       0x32708  CA::Display::DisplayLink::callback(_CADisplayTimer*, unsigned long long, unsigned long long, unsigned long long, bool, void*) + 844
11 QuartzCore       0xb191c  CA::Display::DisplayLink::dispatch_deferred_display_links(unsigned int) + 348
12 UIKitCore        0xaa48c  _UIUpdateSequenceRun + 84
13 UIKitCore        0xa9b7c  schedulerStepScheduledMainSection + 144
14 UIKitCore        0xa9c38  runloopSourceCallback + 92
15 CoreFoundation   0x3731c  __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
16 CoreFoundation   0x36598  __CFRunLoopDoSource0 + 176
17 CoreFoundation   0x34d4c  __CFRunLoopDoSources0 + 244
18 CoreFoundation   0x33a88  __CFRunLoopRun + 828
19 CoreFoundation   0x33668  CFRunLoopRunSpecific + 608
20 GraphicsServices 0x35ec   GSEventRunModal + 164
21 UIKitCore        0x22c2b4 -[UIApplication _run] + 888
22 UIKitCore        0x22b8f0 UIApplicationMain + 340
23 UIKitCore        0x4559c8 __swift_destroy_boxed_opaque_existential_1Tm + 12220
24 myapp            0x34704  main + 4373251844 (AppDelegate.swift:4373251844)
25 ???              0x1bb0badcc (Falta)
```
2 replies · 0 boosts · 1.2k views · Dec ’23

Using URLRequest with AVPlayer
I'm creating an app with a video player streaming video from a URL. AVPlayer only takes the URL address of the video, but I'd like to create a custom URLRequest and then use that to stream the video from online. So the current situation is this:

```swift
let videourl = URL(string: "videofoobar.com")
let player = AVPlayer(url: videourl)
player.play()
```

And I would like to get to something like this:

```swift
let videourl = URL(string: "videofoobar.com")
var request = URLRequest(url: videourl)
let player = AVPlayer(request: request) // This obviously fails, any workaround?
player.play()
```

I know it is not possible to do it like this, as AVPlayer only takes a URL or an AVPlayerItem as a parameter. Is there any workaround to make a URLRequest and then give it to AVPlayer for online streaming?
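There is no `AVPlayer(request:)` initializer, but the usual reason for wanting a URLRequest, custom HTTP headers, can be approximated by building an AVURLAsset with an options dictionary. A sketch; note that `"AVURLAssetHTTPHeaderFieldsKey"` is an undocumented key, so treat it as a fragile convention rather than supported API, and AVAssetResourceLoaderDelegate is the supported route when you need full control over loading:

```swift
import AVFoundation

// Hypothetical stream URL and header values.
let videourl = URL(string: "https://videofoobar.com/stream.m3u8")!
let headers = ["Authorization": "Bearer <token>"]

// Undocumented but widely used option key; verify on each OS release.
let asset = AVURLAsset(url: videourl,
                       options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()
```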
0 replies · 0 boosts · 565 views · Dec ’23

Error booting app with Transformer in SwiftData Class
Hi folks, I'm trying to use a transformer in a SwiftData class, but every time I boot the app in the visionOS simulator it shows an error in main.

Error:

```
Thread 1: Fatal error: Application must register a ValueTransformer for NSAttributedStringTransformer
```

Here's my main file:

```swift
@main
struct TestApp: App {
    var body: some Scene {
        WindowGroup(id: "main") {
            InfoWindow()
                .modelContainer(for: Document.self)
        }
        .windowResizability(.contentSize)
    }
}
```

Here's the class file:

```swift
import SwiftData

@Model
final class Document {
    @Attribute(.unique) var id: String
    @Attribute(.transformable(by: NSAttributedStringTransformer.self)) var content: NSAttributedString
    var name: String
    var history: [String]

    init(content: NSAttributedString, name: String, history: [String]) {
        self.id = UUID().uuidString
        self.content = content
        self.name = name
        self.history = history
    }
}

extension Document: Identifiable { }
```

And the transformer:

```swift
extension NSValueTransformerName {
    static let nsAttributedStringTransformer = NSValueTransformerName(rawValue: "NSAttributedStringTransformer")
}

@objc(NSAttributedStringTransformer)
class NSAttributedStringTransformer: NSSecureUnarchiveFromDataTransformer {
    override class func allowsReverseTransformation() -> Bool {
        return true
    }

    override class var allowedTopLevelClasses: [AnyClass] {
        return [NSAttributedString.self]
    }

    override class func transformedValueClass() -> AnyClass {
        return NSAttributedString.self
    }

    override func reverseTransformedValue(_ value: Any?) -> Any? {
        guard let attributedString = value as? NSAttributedString else { return nil }
        return attributedString.toNSData()
    }

    override func transformedValue(_ value: Any?) -> Any? {
        guard let data = value as? NSData else { return nil }
        return data.toAttributedString()
    }
}

private extension NSData {
    func toAttributedString() -> NSAttributedString? {
        let options: [NSAttributedString.DocumentReadingOptionKey: Any] = [
            .documentType: NSAttributedString.DocumentType.rtf,
            .characterEncoding: String.Encoding.utf8
        ]
        return try? NSAttributedString(data: Data(referencing: self), options: options, documentAttributes: nil)
    }
}

private extension NSAttributedString {
    func toNSData() -> NSData? {
        let options: [NSAttributedString.DocumentAttributeKey: Any] = [
            .documentType: NSAttributedString.DocumentType.rtf,
            .characterEncoding: String.Encoding.utf8
        ]
        let range = NSRange(location: 0, length: length)
        guard let data = try? data(from: range, documentAttributes: options) else { return nil }
        return NSData(data: data)
    }
}
```

I've been stuck on this issue for a while now and would appreciate it if anyone with more experience could help!
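The fatal error is literal: the transformer has to be registered with ValueTransformer before any ModelContainer touches the Document type. A minimal sketch of one conventional place to do that, reusing the names from the post:

```swift
@main
struct TestApp: App {
    init() {
        // Register before the ModelContainer is created, so SwiftData can
        // find the transformer named in @Attribute(.transformable(by:)).
        ValueTransformer.setValueTransformer(
            NSAttributedStringTransformer(),
            forName: .nsAttributedStringTransformer
        )
    }

    var body: some Scene {
        WindowGroup(id: "main") {
            InfoWindow()
                .modelContainer(for: Document.self)
        }
        .windowResizability(.contentSize)
    }
}
```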
0 replies · 0 boosts · 381 views · Dec ’23

Xcode Instruments CPU Profiler not logging os_signpost Points of Interest
If I create a new project with the following code in main.swift and then profile it in Instruments with the CPU Profiler template, nothing is logged in the Points of Interest category. I'm not sure if this is related to the recent macOS 14.2 update, but I'm running Xcode 15.1.

```swift
import Foundation
import OSLog

let signposter = OSSignposter(subsystem: "hofstee.test", category: .pointsOfInterest)

// os_signpost event #1
signposter.emitEvent("foo")

// os_signpost event #2
signposter.withIntervalSignpost("bar") {
    print("Hello, World!")
}
```

If I change the template to the System Trace template and profile again, the two os_signpost events show up as expected. This used to work before, and this is a completely clean Xcode project created from the macOS Command Line Tool template. I'm not sure what's going on, and searching for answers hasn't been fruitful. Changing the bundle ID doesn't have any effect either.
1 reply · 0 boosts · 1.1k views · Dec ’23

CPListImageRowItem customisation
Hi, I was trying to design the UI shown above, using the code below with CPListImageRowItem:

```swift
func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                              didConnect interfaceController: CPInterfaceController) {
    self.interfaceController = interfaceController

    // Create a list row item with images
    let item = CPListImageRowItem(text: "Category",
                                  images: [UIImage(named: "cover.jpeg")!,
                                           UIImage(named: "cover2.jpeg")!,
                                           UIImage(named: "discover.jpeg")!,
                                           UIImage(named: "thumbnail.jpeg")!])

    // Create a list section
    let section = CPListSection(items: [item])

    // Create a list template with the section
    let listTemplate = CPListTemplate(title: "Your Template Title", sections: [section])

    // Set the template on the interface controller
    interfaceController.setRootTemplate(listTemplate, animated: true)
}
```

I get only the header and the image items below it, but there is no way to set the detail text under the images. Can anyone help me out with this?
1 reply · 1 boost · 832 views · Dec ’23

iOS/iPadOS 17.2 Bug in PDFKit duplicates content of filled PDF form fields
The following situation is given: in your code you have a PDFDocument object containing a PDF file with form fields that are filled with text. If you call the dataRepresentation() method on this PDFDocument object, you get back a Data object:

```swift
let myDocumentData: Data = pdfDocument.dataRepresentation()!
```

If you now initialize a new PDFDocument object with this Data object, the contents of the form fields are duplicated within the PDF:

```swift
let newPDF: PDFDocument = PDFDocument(data: myDocumentData)!
```

If you then print the new PDFDocument by creating a print job, the form-field content shows up duplicated instead of what you would actually expect. You only see this behavior when you print or share the PDF; you won't see it inside a PDFView in your application. This behavior only appears since iOS/iPadOS 17.2.

Steps to reproduce:

1. Get a PDF file with form fields, especially text fields
2. Fill out these text fields with text
3. Create a PDFDocument object with this PDF file
4. Call the dataRepresentation() method on this PDFDocument object and store the result in a new variable
5. Create a new PDFDocument object with the data object created in the previous step: PDFDocument(data: <data>)
6. Print the newly created PDFDocument within iOS/iPadOS 17.2, or share it, for example via email

I hope Apple will fix this bug soon!
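Condensed into one snippet, the reproduction steps look roughly like this (`formURL` is a hypothetical stand-in for any PDF whose text fields are already filled in):

```swift
import PDFKit

// Hypothetical path to a PDF whose text fields are already filled in.
let formURL = URL(fileURLWithPath: "/path/to/filled-form.pdf")

let pdfDocument = PDFDocument(url: formURL)!
let myDocumentData: Data = pdfDocument.dataRepresentation()!

// On iOS/iPadOS 17.2, printing or sharing this round-tripped document
// shows the form-field content duplicated; a PDFView still renders it fine.
let newPDF: PDFDocument = PDFDocument(data: myDocumentData)!
```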
5 replies · 4 boosts · 1.2k views · Dec ’23

PDFKit since iPadOS 17.2: Annotations are scaled down when saving file
Dear Developer Community, my app saves handwritten notes, forms, and images as annotations in PDF documents using PDFKit. Since iPadOS 17.2, the content of an annotation is scaled down within the annotation boundaries when the annotated PDF file is saved. That is, the annotation boundaries remain unchanged, but the displayed annotation content shrinks and no longer fills the boundaries. This gets worse with every save operation and applies both to newly created annotations and to elements that were saved before iPadOS 17.2. The issue only appeared after updating to iPadOS 17.2; the same code on my test device with iPadOS 17.1 works perfectly. Does anybody have a similar issue and/or a workaround for this problem? Thanks for any idea!
2 replies · 1 boost · 916 views · Dec ’23

Where is help on Swift documentation markup?
I am reluctant to admit that I only came to know a few months ago that Swift provides a built-in documentation markup syntax.

```swift
/**
 Test func

 Some description here.

 - Parameters:
   - b: Test
   - d: Test
   - f: Test
 - Returns: Bool
 */
func myMethod(a b: Int, c d: Int, e f: Int) -> Bool {
    b > d
}
```

It seems the markup is pretty simple and has only a few keywords, but I want to read through the complete reference. Any useful pointers?
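For reference, Apple's complete guide to this syntax is the "Markup Formatting Reference" in the documentation archive, and Xcode's Quick Help renders the callouts it defines. A small example beyond the post's, with an invented function and error type:

```swift
enum MathError: Error { case divisionByZero }

/// Divides one number by another.
///
/// - Parameters:
///   - dividend: The number to divide.
///   - divisor: The number to divide by; must be nonzero.
/// - Returns: The quotient of `dividend` and `divisor`.
/// - Throws: `MathError.divisionByZero` when `divisor` is zero.
/// - Note: Option-click the function in Xcode to see these callouts rendered.
func divide(_ dividend: Double, by divisor: Double) throws -> Double {
    guard divisor != 0 else { throw MathError.divisionByZero }
    return dividend / divisor
}
```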
2 replies · 0 boosts · 822 views · Dec ’23

QLPreviewController identify markup mode
Is there any way to identify whether a QLPreviewController is in markup mode? I have a custom QLPreviewController that is used to preview and edit images in my app. I need to detect when it enters markup mode so I can make some changes to the navigation bar, but I could not find any property or delegate method for this. Any help will be appreciated.
0 replies · 0 boosts · 536 views · Dec ’23

Xcode/Swift/macOS app enablement problem
I have a macOS screenshot app that was created in 2014. I've recently updated some code and libraries, and am having problems with the transfer of screenshot enablements to the new app. Basically, if a user had the old version of my app, they have to delete the old enablement, start the new app, and then re-enable the updated version of the app. Why is this happening? It's confusing because the user sees that my app is enabled, but the enablement isn't working.
1 reply · 0 boosts · 424 views · Dec ’23

How to address elements in an EnvironmentObject Array
I am new to programming apps, and I am trying to figure out how to use one item out of an array of items in a Text line. I am not getting any complaints from Xcode, but then the preview crashes, giving me a huge error report that it keeps sending to Apple. I have cut out all the extra stuff from the app to get just the basics. What I want it to print on the screen is "Hello Bill How are you?", with Bill coming from the observable array. The first picture below is from about 2 seconds after I removed the // from in front of the line that reads Text("\(friend2.friend2[1].name)"). The other two pictures are the main app page and the page where I set up the array. At the very bottom is a text file of the huge report it kept sending to Apple, until I turned off the option of sending to Apple. Would someone please explain what I am doing wrong? Well, aside from probably everything.

Error code.txt
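Since the post's sources are only screenshots, here is a hypothetical reconstruction of the pattern being described: an ObservableObject holding an array, injected as an environment object, with one element addressed by index. Every name below (Friend, FriendStore, GreetingView) is invented, and a common cause of exactly this kind of preview crash is a missing .environmentObject(_:) injection:

```swift
import SwiftUI

struct Friend: Identifiable {
    let id = UUID()
    var name: String
}

// The shared model, injected into the environment.
class FriendStore: ObservableObject {
    @Published var friends = [Friend(name: "Ted"), Friend(name: "Bill")]
}

struct GreetingView: View {
    @EnvironmentObject var store: FriendStore

    var body: some View {
        // friends[1] is "Bill"; an out-of-range index would crash,
        // and so would a missing environment object.
        Text("Hello \(store.friends[1].name) How are you?")
    }
}

// The preview must inject the object, or the view traps at runtime,
// producing the kind of crash report described above.
#Preview {
    GreetingView().environmentObject(FriendStore())
}
```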
1 reply · 0 boosts · 643 views · Dec ’23

How to declare a Protocol that conforms to Observable
I'm learning Swift. From the language docs, it seems to me that this should be possible, but I get the commented compiler error on the line below. Can someone explain why this is illegal?

```swift
import SwiftUI

struct ContentView: View {
    // 'init(wrappedValue:)' is unavailable: The wrapped value must be an object that conforms to Observable
    @Bindable var failing: SomeProtocol

    var body: some View {
        Text(failing.foo)
    }
}

protocol SomeProtocol: Observable {
    var foo: String { get }
}
```

The use case is adapting different class hierarchies to be displayed by the same View: final SwiftData model classes, and dynamic models supporting editing and calculation during model creation. Apple's docs on Observation and protocol inheritance led me to believe this should be possible. Thanks for taking the time to read (and reply).
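@Bindable needs a concrete object type: the existential `SomeProtocol` never itself conforms to Observable, which is what the compiler error is saying. One common workaround is to make the view generic over the conforming type. A sketch under added assumptions: the protocol gains AnyObject and a settable property so a binding makes sense, and the concrete class is invented:

```swift
import SwiftUI
import Observation

// Assumption: AnyObject and a settable `foo`, so @Bindable can produce a Binding.
protocol SomeProtocol: AnyObject, Observable {
    var foo: String { get set }
}

@Observable
final class EditableModel: SomeProtocol {   // invented concrete model
    var foo = "hello"
}

// Generic over the conforming type: each call site supplies a concrete
// @Observable class, which is what @Bindable can wrap.
struct ModelView<Model: SomeProtocol>: View {
    @Bindable var model: Model

    var body: some View {
        TextField("foo", text: $model.foo)
    }
}

// Usage: the generic parameter is inferred from the concrete class.
// ModelView(model: EditableModel())
```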
1 reply · 0 boosts · 398 views · Dec ’23

CoreBluetooth cannot connect to AirPods
Hello everyone: I am using CoreBluetooth, but currently I cannot connect to AirPods. Details as follows: I am unable to scan for unconnected devices, and if the device is already paired it can be found by a scan, but the connection cannot be made successfully. I searched Google for related posts but couldn't find a site that solves the problem. I'd like to ask whether there are relevant restrictions on AirPods, or whether there are other real solutions. Thank you all.
0 replies · 0 boosts · 373 views · Dec ’23

Store location data when app is not running using Xcode 15.1 and SwiftUI
I'm close, but can't seem to get my app to store location data when it's not running. I'm trying to collect driver data. When I start the app, it asks for location permission while using the app; at no time does it ask for permission to collect location information when I am not using the app. I think the problem revolves around this. I've also tried going into the iOS Settings for the app and setting it to Always, but that didn't help. I'm likely missing something within the app to get it to collect the data when the app is not running. Any help is appreciated. Here is what I've got.

For Signing & Capabilities I have Background Modes enabled with "Location updates".

In Info.plist I have the following keys set, with descriptions:

- Privacy - Location Always Usage Description
- Privacy - Location When In Use Usage Description
- Privacy - Location Always and When In Use Usage Description

I also have Required background modes with Item 0 set to "App registers for location updates".

In the AppDelegate I have the following:

```swift
import UIKit
import CoreLocation

class AppDelegate: NSObject, UIApplicationDelegate {
    static let shared = AppDelegate()

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        // Add any custom setup code here
        return true
    }
}

extension AppDelegate {
    func requestLocationAuthorization() {
        let locationManager = CLLocationManager()
        locationManager.requestAlwaysAuthorization()
    }
}
```

Here is how I trigger the app delegate and start up the location manager:

```swift
import SwiftUI
import SwiftData

@main
struct ZenTracApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) var appDelegate
    @Environment(\.scenePhase) var scenePhase
    @StateObject var locationManager = LocationManager()
    ...
```

And the following LocationManager:

```swift
import CoreLocation
import SwiftData

@MainActor
class LocationManager: NSObject, ObservableObject {
    @Published var location: CLLocation?
    //@Published var region = MKCoordinateRegion()
    private let locationManager = CLLocationManager()

    /// Override existing init
    override init() {
        /// Bring in the normal init
        super.init()
        AppDelegate.shared.requestLocationAuthorization()
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.distanceFilter = kCLDistanceFilterNone
        locationManager.requestAlwaysAuthorization()
        locationManager.startUpdatingLocation()
        locationManager.delegate = self
        locationManager.allowsBackgroundLocationUpdates = true
        locationManager.showsBackgroundLocationIndicator = true
        locationManager.startUpdatingLocation()
    }
}

extension LocationManager: CLLocationManagerDelegate {
    /// iOS triggers this whenever the location changes
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let newLocation = locations.last else { return }
        /// Only update if the location has changed significantly, in this case more than 10 meters
        if self.location == nil || location!.distance(from: newLocation) > 10 {
            self.location = newLocation
            // save the location info in this function
            saveEntry(location: self.location!)
        }
    }
}
```
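One detail worth double-checking in the code above: requestLocationAuthorization() creates its CLLocationManager as a local variable, so nothing keeps it alive while the permission prompt is pending, and iOS only offers the Always upgrade after When In Use has been granted. A sketch of a requester that holds a strong reference and escalates in two steps; this is an assumption about the failure, not a confirmed fix:

```swift
import CoreLocation

final class AuthorizationRequester: NSObject, CLLocationManagerDelegate {
    // Held as a property so the manager outlives the permission prompt.
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()   // step 1: When In Use
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse:
            // Step 2: the Always upgrade prompt can only appear now.
            manager.requestAlwaysAuthorization()
        case .authorizedAlways:
            manager.allowsBackgroundLocationUpdates = true
            manager.startUpdatingLocation()
        default:
            break
        }
    }
}
```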
1 reply · 0 boosts · 771 views · Dec ’23

Core ML MLOneHotEncoder Error Post-Update: "unknown category String"
Apple Developer community, I recently updated Xcode and Core ML from version 13.0.1 to 14.1.2 and am facing an issue with the MLOneHotEncoder in my Core ML classifier. The same code and data that worked fine in the previous version now throw an error during predictions. The error message is:

```
MLOneHotEncoder: unknown category String [TERM] expected one of
```

This seems to suggest that the MLOneHotEncoder is not handling unknown strings as it did in the previous version. Here's a brief overview of my situation:

- Core ML model: the model is a classifier that uses MLOneHotEncoder for processing categorical data.
- Data: the same dataset is used for training and predictions, which worked fine before the update.
- Error context: the error occurs at the prediction stage, not during training.

I have checked for data consistency and confirmed that the dataset is the same as used with the previous version. Here are my questions:

1. Has there been a change in how MLOneHotEncoder handles unknown categories in Core ML version 14.1.2?
2. Are there any recommended practices for handling unknown string categories with MLOneHotEncoder in the updated Core ML version?
3. Is there a need to modify the model training code or data preprocessing steps to accommodate changes in the new Core ML version?

I would appreciate any insights or suggestions on how to resolve this issue. If additional information is needed, I am happy to provide it. Thank you for your assistance!
1 reply · 0 boosts · 642 views · Dec ’23
