When I create an instance of a Swift String:
var str = String("Hello")
As Swift Strings are immutable, when we mutate the value like:
str = "Hello world ......." // 200 characters
Swift should internally allocate new memory and copy the contents into that buffer for the update.
But when I checked the addresses of the original and modified str, both were the same.
Can you help me understand how this allocation and mutation work internally in Swift String?
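For what it's worth, here is a minimal sketch of one way to observe this (the printed addresses and the inline small-string capacity are implementation details of 64-bit Swift, not API guarantees):
var str = String("Hello")

// The address of the variable itself (the fixed-size String struct) never
// changes on mutation; only the heap buffer it references can change.
withUnsafePointer(to: &str) { print("struct address:", $0) }

// "Hello" fits in the struct's inline small-string storage, so there is no
// heap buffer at all yet. Growing past that capacity allocates one.
str += String(repeating: "x", count: 200)

withUnsafePointer(to: &str) { print("struct address:", $0) } // same address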
When a user types in a text field, does the text field's input get stored in a String?
If yes, then with String in Swift being immutable, does new memory for storing that text get allocated on each keystroke as the user keeps typing?
And when we read the user's input via the delegate method textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) from textField.text, we get the user's input in a String. Is it the same storage the text field uses for storing the input on each keystroke, or is it some other storage holding a copy of the user's input?
Or does UITextField use a different data structure (buffer) for storing the user input, so that textField.text returns a copy of the data stored in the original buffer?
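For reference, a minimal sketch of the delegate pattern in question; because String has value semantics, the value read here can be treated as an independent copy regardless of how the control stores its text internally (this sketch makes no claim about UIKit's private buffers):
import UIKit

class MyTextFieldDelegate: NSObject, UITextFieldDelegate {
    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        // Compute the prospective text; `current` behaves as its own value
        // even if the bridging shares storage under the hood.
        let current = textField.text ?? ""
        let updated = (current as NSString).replacingCharacters(in: range, with: string)
        print("prospective text:", updated)
        return true
    }
}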
When calling:
[UITabBarController setViewControllers:animated:]
it crashes and raises a fatal exception:
Fatal Exception: NSInternalInconsistencyException Attempting to select a view controller that isn't a child! (null)
The crash stack is:
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0x8408c __exceptionPreprocess
1 libobjc.A.dylib 0x172e4 objc_exception_throw
2 Foundation 0x82215c _userInfoForFileAndLine
3 UIKitCore 0x38a468 -[UITabBarController transitionFromViewController:toViewController:transition:shouldSetSelected:]
4 UIKitCore 0x3fa8a4 -[UITabBarController _setSelectedViewController:performUpdates:]
5 UIKitCore 0x3fa710 -[UITabBarController setSelectedIndex:]
6 UIKitCore 0x8a5fc +[UIView(Animation) performWithoutAnimation:]
7 UIKitCore 0x3e54e0 -[UITabBarController _setViewControllers:animated:]
8 UIKitCore 0x45d7a0 -[UITabBarController setViewControllers:animated:]
It only happens sometimes. What's the root cause?
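Without a reproducer it's hard to say, but one hypothetical pattern that triggers this message is replacing the tab array while the previous selection no longer maps to a child. A defensive sketch (illustrative only, not a confirmed fix):
import UIKit

// Hypothetical mitigation: assign the new children and re-clamp the selection
// on the main thread, so the controller never tries to select a view
// controller that is not (or is no longer) one of its children.
func replaceTabs(on tabBarController: UITabBarController,
                 with controllers: [UIViewController]) {
    assert(Thread.isMainThread)
    guard !controllers.isEmpty else { return }
    tabBarController.setViewControllers(controllers, animated: false)
    if tabBarController.selectedIndex >= controllers.count {
        tabBarController.selectedIndex = 0
    }
}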
I am working on an application that supports both Arabic and English languages, allowing the user to switch between them.
The problem occurs when I switch the language to Arabic. I have restricted the user to only use the English keyboard, not the Arabic keyboard.
Whenever I enter a number in the TextInput, it automatically converts the number to Arabic numerals.
I want to display English numerals even when the language is set to Arabic in the TextInput view.
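One direction that may help (a sketch; whether it applies depends on how the field's text is produced): format numbers for display with a fixed locale so the digits stay Western regardless of the app language.
import Foundation

// en_US_POSIX guarantees Western digits 0-9 irrespective of the user's or
// app's locale, which would otherwise yield Eastern Arabic numerals.
let formatter = NumberFormatter()
formatter.locale = Locale(identifier: "en_US_POSIX")
let display = formatter.string(from: 1234) ?? "" // "1234", never "١٢٣٤"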
I have a SwiftUI app that I've been working on in Xcode 16.1. The project builds and runs in the simulators, on my Mac, and on my iPhone/iPad without any issues. I'm also able to build my unit test project and run the tests without any errors. The project has zero warnings in it.
When I go to the Edit Schemes options and change the Run scheme to be a Release build with the Debug Executable unchecked I get a compiler error:
Command SwiftCompile failed with a nonzero exit code
I've attempted this Release run with the following target devices in Xcode:
My iPhone 15 Pro Max (iOS 18.2 Beta 3)
MacBook Air (M1) (15.2 Beta)
iPhone 16 Simulator (iOS 18.1)
Any iOS Simulator Device (arm64, x86_64)
All of these targets have the same issue. Normally I would just debug the error from the logs, but when I look at the build output I can't see any information to tell me what happened. It looks like the source files are sent to the Swift compiler and the compiler fails without bubbling up the issue.
I've provided the full error log export as a Gist HERE due to its size. Is there anything in the log I'm missing? Is there a way for me to turn on more verbose logging during compilation of a release build?
I created a brand new Multiplatform App in Xcode and added all of my source files to it. No project configuration settings were changed. I could build it successfully with the debug configuration. I then changed it to the Release configuration and experienced the same error. I can create another fresh project, make the same release configuration with none of my source files in it, and get a successful build.
It seems there is something wrong with my source files and the release configuration, but the compiler doesn't indicate what. I'm lost at this point, as I can't figure out how to get a release build and can't find any indication as to why.
Hi, I'm trying to modify the ScreenCaptureKit sample code by implementing an actor for Metal rendering, but I'm experiencing issues with the frame rendering sequence.
My app workflow is:
ScreenCapture -> createFrame -> setRenderData
Metal draw callback -> renderAsync (getData from renderData)
I've added timestamps to verify frame ordering, and I'm using a binary search to insert each frame by timestamp. While the timestamps appear to be in sequence, the actual rendering output seems out of order.
// ScreenCaptureKit sample
import ScreenCaptureKit
import MetalKit
import CoreVideo
func createFrame(for sampleBuffer: CMSampleBuffer) async {
if let surface: IOSurface = getIOSurface(for: sampleBuffer) {
await renderer.setRenderData(surface, timeStamp: sampleBuffer.presentationTimeStamp.seconds)
}
}
class Renderer {
...
func setRenderData(_ surface: IOSurface, timeStamp: Double) async { // unlabeled first parameter, to match the call site above
_ = await renderSemaphore.getSetBuffers(
isGet: false,
surface: surface,
timeStamp: timeStamp
)
}
func draw(in view: MTKView) {
Task {
await renderAsync(view)
}
}
func renderAsync(_ view: MTKView) async {
guard await renderSemaphore.beginRender() else { return }
guard let frame = await renderSemaphore.getSetBuffers(
isGet: true, surface: nil, timeStamp: nil
) else {
await renderSemaphore.endRender()
return }
// getRenderData also requires depth data (see its definition below);
// [0] is a placeholder so the call matches the signature.
guard let (texture, _) = await renderSemaphore.getRenderData(
device: self.device,
surface: frame.surface,
depthData: [0]) else {
await renderSemaphore.endRender()
return
}
guard let commandBuffer = _commandQueue.makeCommandBuffer(),
let renderPassDescriptor = await view.currentRenderPassDescriptor,
let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
await renderSemaphore.endRender()
return
}
// Shaders ..
renderEncoder.endEncoding()
commandBuffer.addCompletedHandler { @Sendable _ in
updateFPS()
}
// commit frame in actor
let success = await renderSemaphore.commitFrame(
timeStamp: frame.timeStamp,
commandBuffer: commandBuffer,
drawable: view.currentDrawable!
)
if !success {
print("Frame dropped due to out-of-order timestamp")
}
await renderSemaphore.endRender()
}
}
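// Assumed shape of FrameData (not shown in the original post): it pairs the
// captured surface with its presentation timestamp.
struct FrameData {
    let surface: IOSurface
    let timeStamp: Double
}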
actor RenderSemaphore {
private var frameBuffers: [FrameData] = []
private var lastReadTimeStamp: Double = 0.0
private var lastCommittedTimeStamp: Double = 0
private var activeTaskCount = 0
private var activeRenderCount = 0
private let maxTasks = 3
private var isTextureLoaded = false // backing storage for setTextureLoaded(_:), missing in the original
private var textureCache: CVMetalTextureCache?
init() {
}
func initTextureCache(device: MTLDevice) {
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &self.textureCache)
}
func beginRender() -> Bool {
guard activeRenderCount < maxTasks else { return false }
activeRenderCount += 1
return true
}
func endRender() {
if activeRenderCount > 0 {
activeRenderCount -= 1
}
}
func setTextureLoaded(_ loaded: Bool) {
isTextureLoaded = loaded
}
func getSetBuffers(isGet: Bool, surface: IOSurface?, timeStamp: Double?) -> FrameData? {
if isGet {
if !frameBuffers.isEmpty {
let frame = frameBuffers.removeFirst()
if frame.timeStamp > lastReadTimeStamp {
lastReadTimeStamp = frame.timeStamp
print(frame.timeStamp)
return frame
}
}
return nil
} else {
// Set
let frameData = FrameData(
surface: surface!,
timeStamp: timeStamp!
)
// insert to the right position
let insertIndex = binarySearch(for: timeStamp!)
frameBuffers.insert(frameData, at: insertIndex)
return frameData
}
}
private func binarySearch(for timeStamp: Double) -> Int {
var left = 0
var right = frameBuffers.count
while left < right {
let mid = (left + right) / 2
if frameBuffers[mid].timeStamp > timeStamp {
right = mid
} else {
left = mid + 1
}
}
return left
}
// for setRenderDataNormalized
func tryEnterTask() -> Bool {
guard activeTaskCount < maxTasks else { return false }
activeTaskCount += 1
return true
}
func exitTask() {
activeTaskCount -= 1
}
func commitFrame(timeStamp: Double,
commandBuffer: MTLCommandBuffer,
drawable: MTLDrawable) async -> Bool {
guard timeStamp > lastCommittedTimeStamp else {
print("Drop frame at commit: \(timeStamp) <= \(lastCommittedTimeStamp)")
return false
}
commandBuffer.present(drawable)
commandBuffer.commit()
lastCommittedTimeStamp = timeStamp
return true
}
func getRenderData(
device: MTLDevice,
surface: IOSurface,
depthData: [Float]
) -> (MTLTexture, MTLBuffer)? {
let _textureName = "RenderData"
var px: Unmanaged<CVPixelBuffer>?
let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &px)
guard status == kCVReturnSuccess, let screenImage = px?.takeRetainedValue() else {
return nil
}
CVMetalTextureCacheFlush(textureCache!, 0)
var texture: CVMetalTexture? = nil
let width = CVPixelBufferGetWidthOfPlane(screenImage, 0)
let height = CVPixelBufferGetHeightOfPlane(screenImage, 0)
let result2 = CVMetalTextureCacheCreateTextureFromImage(
kCFAllocatorDefault,
self.textureCache!,
screenImage,
nil,
MTLPixelFormat.bgra8Unorm,
width,
height,
0, &texture)
guard result2 == kCVReturnSuccess,
let cvTexture = texture,
let mtlTexture = CVMetalTextureGetTexture(cvTexture) else {
return nil
}
mtlTexture.label = _textureName
let depthBuffer = device.makeBuffer(bytes: depthData, length: depthData.count * MemoryLayout<Float>.stride)!
return (mtlTexture, depthBuffer)
}
}
Above is my code. Could someone point out what might be wrong?
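One thing that stands out: draw(in:) spawns a new unstructured Task per frame, and those tasks can interleave across their awaits, so frames can commit out of order even when the buffer itself is sorted. A hedged sketch of one way to serialize rendering (names are illustrative):
// Funnel draw requests through a single pump task that services an
// AsyncStream, so only one render is ever in flight and frames are
// consumed strictly in arrival order.
final class OrderedDrawDriver {
    private let continuation: AsyncStream<Void>.Continuation
    private let pump: Task<Void, Never>

    init(render: @escaping @Sendable () async -> Void) {
        let (stream, continuation) = AsyncStream<Void>.makeStream()
        self.continuation = continuation
        pump = Task {
            for await _ in stream { // strictly sequential
                await render()
            }
        }
    }

    func requestDraw() { continuation.yield() }
    deinit { pump.cancel() }
}
In draw(in:) you would then call requestDraw() instead of creating a new Task per callback.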
Hello,
I am using Xcode 16.1 (16B40) on macOS Sequoia 15.1.0 with a MacBook Pro M1 Max.
I am developing an app for iOS 17 and 18 using SwiftUI.
I created UI tests to take the screenshots for the App Store on the simulator.
The tests run well and all of them succeed.
The problem appears when I try to get the screenshot files from the xcresult files after the test: there are no screenshots inside them.
I found a data folder and an Info.plist file. In the data folder there are a lot of files with this pattern data.03zD4C6IGFFthK14NwA8mNvcwFHT16g6Tl40Tl1YmBC1bNh6d0YIcnWKyUaQPDXoa8fYo6C3Xcv8xvMtE3_NEXA== and other files with this pattern refs.03zD4C6IGFFthK14NwA8mNvcwFHT16g6Tl40Tl1YmBC1bNh6d0YIcnWKyUaQPDXoa8fYo6C3Xcv8xvMtE3_NEXA==
OK, I tried using fastlane to automate the screenshots, but the problem is still present: the xcresult files don't contain any PNG files.
I had no problems doing this (getting screenshots from an xcresult file) with previous versions of macOS and Xcode on my current machine.
I just updated my machine to macOS Sequoia 15.1.1 and the problem is still present.
Honestly, I don't know how to fix this situation.
With Xcode 15 I had no problems with this, but I am not sure whether Xcode 16.0 ran without problems, because I didn't need to use this functionality in those months.
Here is my code for a UITest:
import XCTest
final class ScreenshotsUITests: XCTestCase {
let app = XCUIApplication()
let device = "iPhone16"
override func setUpWithError() throws {
continueAfterFailure = true
}
override func tearDownWithError() throws {}
@MainActor
func testEnglishScreens() throws {
let lang = "en"
app.launchArguments.append("UITestMode")
app.launchArguments += ["-AppleLanguages", "(en)"]
app.launchArguments += ["-AppleLocale", "en_US"]
app.launch()
executeTestsForMenus(lang: lang, backLabel: "Back")
executeTestForMatch(lang: lang)
}
@MainActor
func testSpanishScreens() throws {
let lang = "es"
app.launchArguments.append("UITestMode")
app.launchArguments += ["-AppleLanguages", "(es)"]
app.launchArguments += ["-AppleLocale", "es_ES"]
app.launch()
executeTestsForMenus(lang: lang, backLabel: "Atrás")
executeTestForMatch(lang: lang)
}
private func executeTestForMatch(lang: String) {
let startButton = app.buttons["start-button"]
startButton.tap()
let key4 = app.buttons["key-4"]
XCTAssertTrue(key4.waitForExistence(timeout: 30), "Key 4 in match screen is not found")
key4.tap()
let key2 = app.buttons["key-2"]
XCTAssertTrue(key2.exists, "Key 2 in match screen is not found")
key2.tap()
makeScreenShot("playing", lang: lang)
let closeButton = app.buttons["close-button"]
XCTAssertTrue(closeButton.exists, "Close button in match screen is not found")
closeButton.tap()
}
private func executeTestsForMenus(lang: String, backLabel: String) {
let mainHeader = app.staticTexts["Math match"]
XCTAssertTrue(mainHeader.exists, "Header in main screen is not found")
makeScreenShot("mainMenu", lang: lang)
let settingsButton = app.buttons["settings-button"]
XCTAssertTrue(settingsButton.exists, "Settings button in main screen is not found")
settingsButton.tap()
makeScreenShot("Settings", lang: lang)
let backButton = app.buttons[backLabel]
XCTAssertTrue(backButton.exists, "Back button in match screen is not found")
backButton.tap()
let helpButton = app.buttons["help-button"]
XCTAssertTrue(helpButton.exists, "Help button in main screen is not found")
helpButton.tap()
makeScreenShot("Help", lang: lang)
backButton.tap()
let scoreButton = app.buttons["score-button"]
XCTAssertTrue(scoreButton.exists, "Scores button in main screen is not found")
scoreButton.tap()
makeScreenShot("Scores", lang: lang)
backButton.tap()
let playButton = app.buttons["play-button"]
XCTAssertTrue(playButton.exists, "Play button in main screen is not found")
playButton.tap()
makeScreenShot("matchBuilder", lang: lang)
let startButton = app.buttons["start-button"]
XCTAssertTrue(startButton.exists, "Start button in match builder screen is not found")
}
private func makeScreenShot(_ name: String, lang: String) {
takeScreenshot(app, named: "\(lang)-\(name)-\(device)")
}
}
import XCTest
extension XCTestCase {
func takeScreenshot(_ app: XCUIApplication, named name: String, fullScreen: Bool = false) {
let screenshot: XCUIScreenshot
if fullScreen {
screenshot = app.windows.firstMatch.screenshot()
} else {
screenshot = XCUIScreen.main.screenshot()
}
let screenshotAttachment = XCTAttachment(
uniformTypeIdentifier: "public.png",
name: "screenshot-\(name).png",
payload: screenshot.pngRepresentation,
userInfo: nil)
screenshotAttachment.lifetime = .keepAlways
add(screenshotAttachment)
}
}
and here is the content of my testplan file:
{
"configurations" : [
{
"id" : "35BC7C0B-9A5A-4027-9F30-36958C4C1AAF",
"name" : "Test Scheme Action",
"options" : {
"preferredScreenCaptureFormat" : "screenshot",
"testExecutionOrdering" : "random",
"uiTestingScreenshotsLifetime" : "keepAlways",
"userAttachmentLifetime" : "keepAlways"
}
}
],
"defaultOptions" : {
"targetForVariableExpansion" : {
"containerPath" : "container:myAppProject.xcodeproj",
"identifier" : "B27D1B022CA00314001A259B",
"name" : "MyAppProject"
}
},
"testTargets" : [
{
"parallelizable" : true,
"target" : {
"containerPath" : "container:MyAppProject.xcodeproj",
"identifier" : "B27D1B122CA00315001A259B",
"name" : "MyAppProjectTests"
}
},
{
"parallelizable" : true,
"target" : {
"containerPath" : "container:MyAppProject.xcodeproj",
"identifier" : "B27D1B1C2CA00315001A259B",
"name" : "MyAppProjectUITests"
}
}
],
"version" : 1
}
I ran tests with old projects on my machine, and those projects have the same problem with screenshot files in the xcresult bundles.
I don't know if the problem is in my machine, my Xcode, macOS, or something else, and I don't know how to fix it.
Please, can anyone help me?
Thanks in advance
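In case it helps while the xcresult issue is tracked down, a workaround sketch: write each PNG to disk yourself in addition to attaching it, so the files survive even if the bundle omits them (the temporary-directory destination is just an example):
import XCTest

extension XCTestCase {
    func saveScreenshotToDisk(_ screenshot: XCUIScreenshot, named name: String) throws {
        // Any writable, known location works; NSTemporaryDirectory() is
        // used here only for illustration.
        let dir = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true)
        let url = dir.appendingPathComponent("screenshot-\(name).png")
        try screenshot.pngRepresentation.write(to: url)
        print("Saved screenshot to \(url.path)")
    }
}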
Is there a way for an application to know when the user unlocks an iOS device?
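As far as I know there is no literal "device unlocked" API, but protected data becomes available at unlock, so these UIKit notifications are a commonly used approximation (a sketch):
import UIKit

NotificationCenter.default.addObserver(
    forName: UIApplication.protectedDataDidBecomeAvailableNotification,
    object: nil, queue: .main
) { _ in
    // Fires after unlock, once files protected until first user
    // authentication become readable again.
    print("Protected data available - device was unlocked")
}

NotificationCenter.default.addObserver(
    forName: UIApplication.protectedDataWillBecomeUnavailableNotification,
    object: nil, queue: .main
) { _ in
    print("Protected data about to become unavailable - device is locking")
}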
Hi team,
We're using CocoaPods in our project, and we noticed the compiler fails to build certain targets with "Missing required module 'x'" when building them with Swift 6:
We realized the modules the compiler is complaining about are pod dependencies required by the targets' other dependencies, and that this error only appears when building with Swift 6, unless such dependencies are described in the Podfile as direct dependencies of the target, or we include them in the framework search paths. For example, the error in the image above appears if module 'X' imports 'Y', 'Y' imports 'CocoaLumberJack', and we don't specify a direct dependency between 'X' and 'CocoaLumberJack'.
Regardless of the fact that we can manually add the missing module's location to the target's search paths, we'd like to understand why we're facing this issue in the first place, and what changed between Swift 5 and Swift 6 that now requires us to explicitly describe these dependencies. I'd appreciate it if someone could tell us more about this. We're particularly interested in knowing whether this is an intentional change and how to handle it properly.
Thanks
Hi, the app crashes with the message: **thunk for @escaping @callee_guaranteed (@guaranteed NSURLSessionTask) -> () + 48 (:0)** and I am unable to understand what it means or why it is happening. I want to understand the reason for the message.
Thank you.
Hi,
I'm developing a MusicKit integration in my iOS app, and I want to select songs from recently played (done). The problem is that the queue is not auto-generated, and the user has to select another song if they want to go forward.
Is there any method to ask for similar songs, or recommended songs, based on a song the user has already selected?
It would be really great :)
Also, if you know it... Is there any publisher for the music duration, or do I need to use a timer? Thanks.
David.
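One API worth a look (a sketch that assumes MusicKit authorization has already been granted): MusicPersonalRecommendationsRequest returns the listener's personalized recommendations, which can seed a queue. For the second question, I'm not aware of a built-in publisher for playback time, so a timer is the usual approach.
import MusicKit

func fetchRecommendations() async throws {
    var request = MusicPersonalRecommendationsRequest()
    request.limit = 5
    let response = try await request.response()
    for recommendation in response.recommendations {
        // Each recommendation groups albums, playlists, and stations.
        print(recommendation.title ?? "Untitled", "-", recommendation.items.count, "items")
    }
}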
I have a SwiftUI view with Button(intent:) and I'm using UIHostingController to use it in UIKit. The problem is that the button does not work in UIKit, but a normal button (without an intent) works.
Hello everyone,
I'm currently working through this example project Exploring object tracking with ARKit to learn how to use Object Tracking with visionOS.
I was able to modify the example project to overlay a different model onto the detected object. However, with the example code, I'm only able to track 1 instance of the trained object at a time. I'm wondering if there is a way to track multiple instances of 1 object?
For example, I have a usdz model of a box, then trained that box for object tracking, I'm able to overlay a model of a chair over that box once it's detected. But now, I have multiple copies of that same box, and I want to arrange them so that when I wear the Vision Pro, I can see the chairs arranged however I want.
I'm still new to visionOS development, so I'm not sure if there's a way to accomplish that by just training 1 object and having copies of it.
If it helps, this is my current modification to overlay a virtual object on top of a detected object.
func loadReferenceObjects() async -> [ReferenceObject] {
// Get a list of all reference object files in the app's main bundle and attempt to load each.
var referenceObjectFiles: [String] = []
if let resourcesPath = Bundle.main.resourcePath {
print("resource path: \(resourcesPath)")
try? referenceObjectFiles = FileManager.default.contentsOfDirectory(atPath: resourcesPath).filter { $0.hasSuffix(".referenceobject") }
}
await withTaskGroup(of: Void.self) { group in
for file in referenceObjectFiles {
let objectURL = Bundle.main.bundleURL.appending(path: file)
group.addTask {
// get the file name
let fileNameWithoutExtension = (file as NSString).deletingPathExtension
// load each ref objs as task
await self.loadSingleReferenceObject(url: objectURL, fileName: fileNameWithoutExtension)
}
}
}
return self.referenceObjects
}
// Private helper method to load a single object
// and assign entity to it.
private func loadSingleReferenceObject(url: URL, fileName: String) async {
var referenceObject: ReferenceObject
do {
print("Loading reference object from \(url)")
// Load the file as a `ReferenceObject` - this can take a while for larger objects.
referenceObject = try await ReferenceObject(from: url)
} catch {
fatalError("Failed to load reference object with error \(error)")
}
// add the ref obj to the ref objs array
self.referenceObjects.append(referenceObject)
// entity with each file
var model: Entity = Entity()
// add entity according to the file name
switch fileName {
// Box1 ref obj binds with Chair1
case "Box1":
// try to load the model
do {
try await model = Entity(named: "chair1", in: realityKitContentBundle)
} catch {
print("Failed to load chair1")
}
// Box2 ref obj binds with Chair2
case "Box2":
// try to load the model
do {
try await model = Entity(named: "chair2", in: realityKitContentBundle)
} catch {
print("Failed to load chair2")
}
default:
print("no model associated with this file name: \(fileName)")
break
}
// map entity to ref objs
usdzPerReferenceObject[referenceObject.id] = model
}
Any help or suggestions would be greatly appreciated.
Thank you.
Hi,
I am implementing the synchronisation of SwiftData with CloudKit as described in the Apple documentation titled "Syncing model data across a person's devices." My app runs fine on iPhone without activating CloudKit under the "Signing & Capabilities" option, but when it is activated, I get a Core Data error with code 134060. My app is in the development stage. The following code snippet, for your reference, is taken from the main structure adopting the App protocol.
init() {
do {
#if DEBUG
let schema = Schema([
Debit.self,
Credit.self,
])
let modelConfiguration = ModelConfiguration(schema: schema, isStoredInMemoryOnly: false)
// Use an autorelease pool to make sure Swift deallocates the persistent
// container before setting up the SwiftData stack.
try autoreleasepool {
let desc = NSPersistentStoreDescription(url: modelConfiguration.url)
let opts = NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.sureshco.MyFirstApp")
desc.cloudKitContainerOptions = opts
// Load the store synchronously so it completes before initializing the CloudKit schema.
desc.shouldAddStoreAsynchronously = false
if let mom = NSManagedObjectModel.makeManagedObjectModel(for: [
Debit.self,
Credit.self,
]) {
let container = NSPersistentCloudKitContainer(name: "MyFirstApp", managedObjectModel: mom)
container.persistentStoreDescriptions = [desc]
container.loadPersistentStores {_, err in
if let err {
fatalError(err.localizedDescription)
}
}
// Initialize the CloudKit schema after the store finishes loading.
try container.initializeCloudKitSchema()
// Remove and unload the store from the persistent container.
if let store = container.persistentStoreCoordinator.persistentStores.first {
try container.persistentStoreCoordinator.remove(store)
}
}
}
#endif
sharedModelContainer = try ModelContainer(for: schema, configurations: [modelConfiguration])
} catch {
fatalError(error.localizedDescription)
}
}
Any help will be greatly appreciated!
Regards
Suresh.
The capability to read and write HFS disks on the Mac was removed a long time ago.
The capability to simply read them was also removed, since Catalina I think.
That is surprising and sometimes frustrating. I still use a '90s MacBook for a few tasks and need, from time to time, to transfer files to a newer Mac or to read some old files stored on 3.5" disks.
The solution I use is to read the disk on an old Mac running Mac OS X 10.6 (I'm lucky enough to have kept one) and transfer via USB stick or AirDrop…
As there is of course no USB port on that old MacBook (and I no longer have a working 56k modem to transfer by mail), the only option besides the 3.5" disk is using its PCMCIA port to write to an SD card that can then be read on macOS Sonoma. But reading 3.5" disks directly would be great.
Hence my questions for the forum:
How hard would it be to write such a driver for READING only HFS on macOS Sonoma?
There is some software like FuseHFS. Did anyone try it? Did anyone look at the source code (said to be open source)?
Does anyone know why Apple removed this capability? (I thought it was a tiny piece of code compared to the GBs of present macOS.)
Thanks for any insights on the matter.
Hello, I'm kind of new to Xcode and I was trying to get back into programming while following a course.
There seemed to be a problem with the preview, and it looks like there is nothing wrong with the code, at least from what I know. I can say this because the code is the default code Xcode gives you when you start a new project.
Here is the error log:
== PREVIEW UPDATE ERROR:
SchemeBuildError: Failed to build the scheme “App”
linker command failed with exit code 1 (use -v to see invocation)
Link App.debug.dylib (arm64):
Undefined symbols for architecture arm64:
"_main", referenced from:
___debug_main_executable_dylib_entry_point in command-line-aliases-file
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
I am encountering an issue when making an API call using URLSession with DispatchQueue.global(qos: .background).async on a real device running tvOS 18. The code works as expected on tvOS 17 and in the tvOS 18 simulator, but on the real device, with debug mode disabled, it takes several minutes (5 to 10) after the API call for the data to load.
Code: Here’s the code I am using for the API call:
appconfig.getFeedURLData(feedUrl: feedUrl, timeOut: kRequestTimeOut, apiMethod: ApiMethod.POST.rawValue) { (result) in
self.EpisodeItems = Utilities.sharedInstance.getEpisodeArray(data: result)
}
func getFeedURLData(feedUrl: String, timeOut: Int, apiMethod: String, completion: @escaping (_ result: Data?) -> ()) {
guard let validUrl = URL(string: feedUrl) else { return }
var request = URLRequest(url: validUrl, cachePolicy: .useProtocolCachePolicy, timeoutInterval: TimeInterval(timeOut))
let userPasswordString = "\(KappSecret):\(KappPassword)"
let userPasswordData = userPasswordString.data(using: .utf8)
let base64EncodedCredential = userPasswordData!.base64EncodedString(options: .lineLength64Characters)
let authString = "Basic \(base64EncodedCredential)"
let headers = [
"authorization": authString,
"cache-control": "no-cache",
"user-agent": "TN-CTV-\(kPlateForm)-\(kAppVersion)"
]
request.addValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpMethod = apiMethod
request.allHTTPHeaderFields = headers
let response = URLSession.requestSynchronousData(request as URLRequest)
if response.1 != nil {
do {
guard let parsedData = try JSONSerialization.jsonObject(with: response.1!, options: .mutableContainers) as? AnyObject else {
print("Error parsing data")
completion(nil)
return
}
print(parsedData)
completion(response.1)
return
} catch let error {
print("Error: \(error.localizedDescription)")
completion(response.1)
return
}
}
completion(response.1)
}
import Foundation
public extension URLSession {
static func requestSynchronousData(_ request: URLRequest) -> (URLResponse?, Data?) {
var data: Data? = nil
var responseData: URLResponse? = nil
let semaphore = DispatchSemaphore(value: 0)
let task = URLSession.shared.dataTask(with: request) { taskData, response, error in
data = taskData
responseData = response
if data == nil, let error = error {
print(error)
}
semaphore.signal()
}
task.resume()
_ = semaphore.wait(timeout: .distantFuture)
return (responseData, data)
}
static func requestSynchronousDataWithURLString(_ requestString: String) -> (URLResponse?, Data?) {
guard let url = URL(string: requestString.checkValidUrl()) else { return (nil, nil) }
let request = URLRequest(url: url)
return URLSession.requestSynchronousData(request)
}
}
Issue description: Working scenario: the API call works fine on tvOS 17 and in the simulator for tvOS 18. Problem: when running on a real device with tvOS 18, the API call stalls when debug mode is disabled, but works fine when debug mode is enabled; the data loads only after a few minutes.
Error messages:
Error Domain=WKErrorDomain Code=11 "Timed out while loading attributed string content" UserInfo={NSLocalizedDescription=Timed out while loading attributed string content}
NSURLConnection finished with error - code -1001
nw_read_request_report [C4] Receive failed with error "Socket is not connected"
Snapshot request 0x30089b3c0 complete with error: <NSError: 0x3009373f0; domain: BSActionErrorDomain; code: 1 ("response-not-possible")>
tcp_input [C7.1.1.1:3] flags=[R] seq=817957096, ack=0, win=0 state=CLOSE_WAIT rcv_nxt=817957096, snd_una=275546887
Environment:
Xcode version: 16.1
Real device: Model A1625 (32GB)
tvOS version: 18.1
Debugging steps I’ve taken: I’ve verified that the issue does not occur in debug mode. I’ve confirmed that the API call works fine on tvOS 17 and in the simulator (tvOS 18). The error suggests a network timeout (-1001) and a socket connection issue ("Socket is not connected").
Questions:
Is this a known issue with tvOS 18 on real devices? Are there any specific settings or configurations in tvOS 18 that could be causing the timeout error in non-debug mode? Could this be related to how URLSession or networking behaves differently in release mode? I would appreciate any help or insights into this issue!
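One hypothesis worth testing (a sketch, not a confirmed fix): blocking a background-QoS queue on a semaphore is a classic setup for priority inversion, which can behave very differently between debug and release builds. Replacing the synchronous wrapper with async/await avoids blocking threads entirely:
import Foundation

// Illustrative async replacement for requestSynchronousData; the caller
// awaits instead of parking a DispatchQueue.global(qos: .background) thread
// on a semaphore.
func fetchFeedData(_ request: URLRequest) async throws -> (Data, URLResponse) {
    try await URLSession.shared.data(for: request)
}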
I am trying to use the DuckDB library in a SwiftUI app. It does not seem to be able to allocate as much memory as is available to it, and I was wondering if there is any setting I can tweak. I've logged the issue in their GitHub repo too, with full details.
I can't seem to find much documentation on the observe(_:options:changeHandler:) method itself. There is this page, which has some example use, but there is no link to a 'real' documentation page for the method. Additionally, Quick Help has a little bit of info, but no parameter definitions or any explanation of whether or how the return value and changeHandler closure are retained.
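For anyone else looking, the documented NSKeyValueObservation semantics are that the changeHandler is held by the returned token, and observation ends when that token is deallocated or invalidate() is called. A minimal sketch:
import Foundation

class Player: NSObject {
    @objc dynamic var volume: Double = 0.5
}

let player = Player()

// Keep the returned token alive; dropping it ends the observation.
let token = player.observe(\.volume, options: [.old, .new]) { _, change in
    print("volume:", change.oldValue ?? 0, "->", change.newValue ?? 0)
}

player.volume = 0.8 // handler fires
token.invalidate()  // explicit stop (also happens when the token deinits)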
We have a rare crash in one of our frameworks which I am unable to figure out.
I'm attaching the crash log here. Any idea how to debug this to find the root cause?
Crash.crash