When I try to delete items from my list stored with SwiftData, it instantly crashes my app with this error message: SwiftData/BackingData.swift:482: Fatal error: Never access a full future backing data - PersistentIdentifier(id: SwiftData.PersistentIdentifier.ID(url: x-coredata://03DEFFA9-87EF-4E13-9448-946D9EBC17B6/Exercise/p8), implementation: SwiftData.PersistentIdentifierImplementation) with Optional(0252D555-649A-45B2-954C-7DD62A6DBAE4)
import SwiftUI
import SwiftData

struct WorkoutsView: View {
    @Environment(\.modelContext) var modelContext
    @Query(sort: [
        SortDescriptor(\Workout.name),
        SortDescriptor(\Workout.difficulty),
        SortDescriptor(\Workout.duration)
    ]) var workouts: [Workout]

    @State private var isEditing = false
    @State private var showingAddScreen = false

    var body: some View {
        NavigationStack {
            List {
                ForEach(workouts) { workout in
                    // design purpose code
                }
                .onDelete(perform: deleteWorkouts)
            }
            .navigationTitle("Workouts")
            .toolbar {
                ToolbarItem(placement: .topBarLeading) {
                    EditButton()
                }
                ToolbarItem(placement: .topBarTrailing) {
                    Button(action: {
                        showingAddScreen = true
                    }) {
                        Image(systemName: "plus")
                    }
                }
            }
            .sheet(isPresented: $showingAddScreen) {
                AddWorkoutView()
            }
        }
    }

    func deleteWorkouts(at offsets: IndexSet) {
        for offset in offsets {
            let workout = workouts[offset]
            modelContext.delete(workout)
        }
    }
}
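For comparison, here is a minimal sketch of an alternative delete handler, not a confirmed fix. It rests on two assumptions: that Workout has an exercises relationship (the crash names an Exercise), so its delete rule may matter, and that saving right after the delete keeps the @Query-driven List from rendering rows whose backing data is already gone.

import SwiftUI
import SwiftData

// Sketch only: delete, then persist before the next List refresh.
extension WorkoutsView {
    func deleteWorkoutsAndSave(at offsets: IndexSet) {
        withAnimation {
            for offset in offsets {
                modelContext.delete(workouts[offset])
            }
            // If Workout has an `exercises` relationship, giving it an explicit
            // delete rule, e.g. @Relationship(deleteRule: .cascade) var exercises: [Exercise],
            // avoids orphaned references being faulted in after the delete (assumption).
            try? modelContext.save()
        }
    }
}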
I'm trying to make macOS VoiceOver read some text in the parent tab view when the child tab view changes tabs. VoiceOver always reads the first text entry in the child sub tab, ignoring my attempts to move the focus.
I've tried these calls (in the example, textItem is a member of the parent tab view class):
.setAccessibilityApplicationFocusedUIElement(textItem)
.setAccessibilityFocused(true)
Each sub tab is a view controller loaded from a storyboard, and I've added code in viewDidAppear to set the accessibility focus. I've also tried sending a notification to the parent tab view to set the accessibility focus at the end of the sub tab's viewDidAppear.
Nothing seems to work. Is there a way to actually change the currently focused accessibility UI element programmatically? This needs to work on macOS 13 and greater.
Here is a rough layout of what I'm trying to accomplish. When the user selects "sub tab 2", I want the text "Text to read first" to be the focus and have VoiceOver read that. What actually happens is VoiceOver reads the contents of the sub tab "Feature Name".
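A minimal sketch of the pattern that is often suggested for moving the VoiceOver cursor on macOS, assuming textItem is an NSTextField (or any other accessibility element) owned by the parent tab view: instead of the setter calls above, post a layout-changed notification with the target element once the sub tab's view hierarchy is in place.

import AppKit

// Sketch, not verified against this exact storyboard setup: defer one runloop turn
// so the newly selected sub tab's accessibility tree exists, then ask VoiceOver to
// move its cursor to the element.
func moveVoiceOverFocus(to textItem: NSTextField) {
    DispatchQueue.main.async {
        NSAccessibility.post(element: textItem, notification: .layoutChanged)
    }
}

// Called from the sub tab's viewDidAppear, or from the parent tab view controller
// in response to the notification mentioned above.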
Hi everyone,
I’m exploring the idea of displaying a live 3D view of Google Maps in a Vision Pro app using SwiftUI and RealityKit. I want users to be able to interact with the map, including panning, zooming in and out, and exploring different areas in a fully immersive environment.
Is this technically possible within the Vision Pro ecosystem? If so, what would be the recommended approach to implement this? If not, are there any alternative methods or platforms that could provide a similar experience?
Thanks in advance for your insights!
Best,
Siddharth
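On the "alternative methods" part of the question: Google Maps has no visionOS SDK that I know of, so one hedged baseline is MapKit in a regular visionOS window, which already gives panning and zooming; a fully immersive globe would need a custom RealityKit scene instead. A minimal sketch, with placeholder coordinates:

import SwiftUI
import MapKit

// Sketch of an interactive map in a plain visionOS window; the marker and the
// .hybrid style with realistic elevation are just illustrative choices.
struct MapWindowSketch: View {
    @State private var camera: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $camera) {
            Marker("Apple Park", coordinate: .init(latitude: 37.3349, longitude: -122.0090))
        }
        .mapStyle(.hybrid(elevation: .realistic))
    }
}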
Can I submit another app with the same app name and the same icon, using the same account or a different account?
When trying to connect to our server, I get an error that occurs only in the App Clip:
Error Domain=NSURLErrorDomain Code=-1004 "Could not connect to the server."
More specifically, I get this error when trying to simulate a slow connection with Network Link Conditioner (LTE / 3G).
It only happens with Network Link Conditioner enabled, and only in the App Clip: when I disable Network Link Conditioner it starts working again.
I receive the error quite fast, so it's not a time-out issue.
Things I've tried (which didn't help):
increase the time-out period
reduce the network requests to one at a time
set AllowsArbitraryLoads to true
full description of the error:
Domain=NSURLErrorDomain Code=-1004 "Could not connect to the server." UserInfo={_kCFStreamErrorCodeKey=65, NSUnderlyingError=0x2826aff00 {Error Domain=kCFErrorDomainCFNetwork Code=-1004 "(null)" UserInfo={_NSURLErrorNWPathKey=satisfied (Path is satisfied), interface: en0[802.11], ipv4, dns, uses wifi, _kCFStreamErrorCodeKey=65, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <3D0E9D26-B6BE-41C9-9D97-C61F6E71DBE8>.<3>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask <3D0E9D26-B6BE-41C9-9D97-C61F6E71DBE8>.<3>"
), NSLocalizedDescription=Could not connect to the server., NSErrorFailingURLStringKey=<<serverAddress>>, NSErrorFailingURLKey=https://<<serverAddress>>, _kCFStreamErrorDomainKey=1})
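For comparison, here is a sketch of the session configuration implied by the list of things tried above, with the constrained/expensive-network flags spelled out as well; these are assumptions about the setup, not a known fix for the App Clip case.

import Foundation

// Sketch only: makes the tried settings explicit in one place so the App Clip
// and the full app can be diffed against the same configuration.
func makeClipSession() -> URLSession {
    let config = URLSessionConfiguration.default
    config.timeoutIntervalForRequest = 60          // increased time-out period
    config.timeoutIntervalForResource = 120
    config.httpMaximumConnectionsPerHost = 1       // one request at a time
    config.waitsForConnectivity = true             // wait instead of failing fast on a degraded link
    config.allowsConstrainedNetworkAccess = true   // Low Data Mode
    config.allowsExpensiveNetworkAccess = true     // cellular / personal-hotspot paths
    return URLSession(configuration: config)
}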
Basically, when using iOS VoiceOver, it reads a radio button like this:
Aria label +"check box" + "radio button uncheck 1/2 required"
Is this behavior expected for iOS VoiceOver?
Thank you!
For the 5G Network Slicing Traffic Category introduced in iOS 17:
https://developer.apple.com/documentation/bundleresources/entitlements/com_apple_developer_networking_slicing_trafficcategory?changes=latest_major&language=ob_8
May I know more details (ideally with reference links) on the following questions:
How is this ID handled by the network?
What's the flow and impact of this ID when registering on the 5G SA network?
Is any profile downloaded from the network to the handheld device to control the slice?
Is the ID carried all the way up to the BBU/cell sites, and is it identified by the cell sites for specific handling of that slice?
I am using an iPhone 15 Pro purchased in November 2023. I updated to the iOS 18 beta, and since then my battery capacity has been dropping very fast; in one week I saw a drop of 6%.
I don't have time to wait in the queue for service, so please suggest how I can resolve this.
AATIR
[Personal Information Edited by Moderator]
My store requires customers to provide identification if they are using a credit card because of the number of stolen cards in this area. How can I ensure customers aren't using a stolen card when paying through Apple Pay? Why doesn't Apple Pay display the cardholder's name on the card shown on the screen so that we can confirm it against an ID?
After installing the latest iOS 18 developer beta on my iPhone SE 3, VoiceOver keeps saying "unpronounceable" when I swipe through applications or do anything else on my phone.
Please fix this VoiceOver bug as soon as possible.
I submitted the same bug report through the Feedback Assistant app, and there has been no response from Apple about when the issue will be resolved.
I want to capture windows of other applications to share during a SharePlay session on Vision Pro. Can I use ScreenCaptureKit, or is there another method?
I was checking out the new Eye Tracking feature in iOS 18, and I wonder if there is a way to check from the Swift side whether it is on or off, because my app uses the camera, and it conflicts with Eye Tracking, which also uses the device's camera.
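I'm not aware of a public API that reports whether Eye Tracking is enabled, so as a hedged workaround sketch, the app can at least observe the camera being taken by another client and pause its own capture UI; the interruption reason used here is an assumption about how the conflict surfaces.

import AVFoundation

// Sketch: watch for the capture session being interrupted because something else
// is using the camera, and let the app react.
final class CameraConflictObserver {
    private var token: NSObjectProtocol?

    func observe(_ session: AVCaptureSession, onConflict: @escaping () -> Void) {
        token = NotificationCenter.default.addObserver(
            forName: AVCaptureSession.wasInterruptedNotification,
            object: session,
            queue: .main
        ) { note in
            if let raw = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               AVCaptureSession.InterruptionReason(rawValue: raw) == .videoDeviceInUseByAnotherClient {
                onConflict()
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}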
Upon further review of the activity associated with your Apple Developer Program membership, it's been determined that your membership, or a membership associated with your account, has been used for dishonest or fraudulent activity, in violation of the Apple Developer Program License Agreement. Given the severity of the identified issues, your account has been flagged for removal.
Because your account has been flagged for removal, any earnings payments are paused and app transfers are disabled. Creating new accounts after receiving this message may result in the termination of the new or associated accounts.
Evidence of Dishonest or Fraudulent Activity
App submissions from your account have repeatedly violated the App Review Guidelines in an attempt to evade the review process. After multiple resubmissions, the guideline violation(s) detailed to you in prior correspondence remain unresolved.
What should I do now?
Heya,
I'm currently building out my own application for tracking my health information, and I'm hoping to collect my historical data from Apple Health.
Sadly, it appears that certain things I wish to export don't show up in the export.xml file.
Some of the things I would expect to find in export.xml that do not currently appear are as follows:
More data about my medications: currently I can only export a list of my medications; it's not possible to export data such as when each medication was taken.
Logged emotions; there is currently no support for this (that I can find).
Would appreciate some insight into this.
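A hedged sketch of reading history directly from HealthKit instead of export.xml, using heart rate purely as a stand-in sample type: as far as I know, medication dose history isn't exposed to third-party apps at all, and logged emotions only surface through the newer State of Mind sample types on recent OS versions, so direct queries only help for the types HealthKit does publish.

import HealthKit

// Sketch: request read access and pull every sample of one type, oldest first.
func fetchHeartRateHistory(store: HKHealthStore,
                           completion: @escaping ([HKQuantitySample]) -> Void) {
    guard let type = HKObjectType.quantityType(forIdentifier: .heartRate) else {
        completion([]); return
    }
    store.requestAuthorization(toShare: [], read: [type]) { success, _ in
        guard success else { completion([]); return }
        let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: true)
        let query = HKSampleQuery(sampleType: type,
                                  predicate: nil,            // all history the user allows
                                  limit: HKObjectQueryNoLimit,
                                  sortDescriptors: [sort]) { _, samples, _ in
            completion((samples as? [HKQuantitySample]) ?? [])
        }
        store.execute(query)
    }
}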
Hi,
I am trying to include an image in an Apple News template that is clickable, meaning that if a user clicks on the image they should be redirected to a different page. I am using action and openURL as below; the image loads fine, but nothing happens when I click on it.
{
    "role": "container",
    "components": [
        {
            "role": "container",
            "components": [
                {
                    "role": "image",
                    "URL": "https://myImageURL.jpg",
                    "action": {
                        "type": "openURL",
                        "URL": "https://URLToBeRedirected"
                    }
                }
            ]
        }
    ]
}
I am using Newspreview to preview my article.json. Please let me know of any resolution for this issue.
When VoiceOver is turned on on iOS devices, the video controls like pause/resume, forward, and backward do not work in inline mode, but they work fine in fullscreen.
The controls are announced properly by VoiceOver, but when we press them, no action is performed at all.
We use an HTML5 video tag to display an mp4 video in our app, and we have added all the necessary accessibility tags as well.
We also observe the same issue on the w3schools web page. Attaching the link here for reference: https://www.w3schools.com/html/tryit.asp?filename=tryhtml5_video
Could you please guide us on why the video controls are not working? Is there anything we need to change, or is an update needed on the Safari side?
Hello everyone
Yesterday I logged in with my Apple ID on the Apple Developer site. I have no subscription, so some functions are not available for my account. Since I logged in, beta updates also appear under Settings > General > Software Update. I wanted to know if there is a way to remove this and to delete my account from Apple Developer. Thank you.
I’m developing an app for Vision Pro and have encountered an issue related to the UI layout and model display. Here's a summary of the problem:
I created an anchor window to display text and models in the hand menu UI.
While testing on my Vision Pro, everything works as expected; the text and models do not overlap and appear correctly.
However, after pushing the changes to GitHub and having my client test it, the text and models are overlapping.
Details:
I’m using Reality Composer Pro to load models and set them in the hand menu UI.
All pins are attached to attachmentHandManu, and attachmentHandManu is set to track the hand and show the elements in the hand menu.
I ensure that the attachmentHandManu tracks the hand properly and displays the UI components correctly in my local tests.
Question:
What could be causing the text and models to overlap in the client’s environment but not in mine? Are there any specific settings or configurations I should verify to ensure consistent behavior across different environments? Additionally, what troubleshooting steps can I take to resolve this issue?
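Without seeing the project it is hard to say why the two environments differ, but here is a sketch of a layout approach that tends to be environment-independent: one hand anchor, one attachment whose contents are laid out by SwiftUI rather than by per-device manual offsets. The anchor location and IDs are assumptions, and attachmentHandManu presumably plays the role of handAnchor here.

import SwiftUI
import RealityKit

// Sketch only: the attachment is declared once, looked up by ID, and parented to a
// single hand anchor; text and model slots stack inside SwiftUI layout, so they
// cannot drift into each other between devices or builds.
struct HandMenuSketch: View {
    var body: some View {
        RealityView { content, attachments in
            let handAnchor = AnchorEntity(.hand(.left, location: .palm))
            content.add(handAnchor)

            if let menu = attachments.entity(for: "handMenu") {
                menu.position = [0, 0.12, 0]   // single fixed offset above the palm
                handAnchor.addChild(menu)
            }
        } attachments: {
            Attachment(id: "handMenu") {
                VStack(spacing: 12) {
                    Text("Hand menu")
                    // Models loaded from Reality Composer Pro would be added as child
                    // entities of handAnchor rather than positioned in world space.
                }
                .padding()
                .glassBackgroundEffect()
            }
        }
    }
}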
I have created a Wallet non-UI extension for adding a card through Wallet. It works perfectly when I open Wallet on the iPhone. But when I open Wallet from the Watch app (the Watch bridge app on the iPhone, not on the physical Apple Watch), my extension (issuer app) does not show up there. Any idea if I need to set up or configure anything to make the extension available through the Watch bridge app's Wallet?
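One hedged thing to double-check (a sketch, not a confirmed cause): the Watch-side Wallet is fed by the remote variants of the non-UI extension callbacks, so a handler that only reports passEntriesAvailable / passEntries can look missing from the Watch bridge app. Roughly:

import PassKit

// Sketch of the non-UI extension's principal class; building real pass entries
// needs your card art and add-payment-pass configuration and is omitted here.
final class IssuerExtensionHandler: PKIssuerProvisioningExtensionHandler {

    override func status(completion: @escaping (PKIssuerProvisioningExtensionStatus) -> Void) {
        let status = PKIssuerProvisioningExtensionStatus()
        status.passEntriesAvailable = true
        status.remotePassEntriesAvailable = true   // cards eligible for the paired Apple Watch
        status.requiresAuthentication = true
        completion(status)
    }

    override func passEntries(completion: @escaping ([PKIssuerProvisioningExtensionPassEntry]) -> Void) {
        completion([])   // entries for this iPhone's Wallet
    }

    override func remotePassEntries(completion: @escaping ([PKIssuerProvisioningExtensionPassEntry]) -> Void) {
        completion([])   // entries surfaced by the Watch bridge app's Wallet
    }
}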
I'm developing a macOS app that interacts with Microsoft Teams using the Accessibility API. I've noticed inconsistent behavior when querying UI elements, particularly for the mute button. My queries often fail, while system tools like VoiceOver can consistently access these elements (which are visible on the screen).
In some cases it works well, but in others the UI elements are not visible from my code. When I try Accessibility Inspector, it also initially fails to inspect. However, the Inspector seems to have some "magical" power: when I run it, or trigger an AX audit, it appears to refresh the AX tree, and then my code occasionally works as well.
Given that VoiceOver can consistently read the screen, I assume the issue is not with the Microsoft Teams app itself (assuming it's based on Electron/React). I mention this because when I interact with the Zoom app, reading the mute status from the app's menu bar, it works 100% of the time.
What would you recommend I try or explore to improve reliability?
Can I refresh the app's AX tree from my end, from Swift?
Is this a bug in the AX API, or even in Microsoft Teams?
(I have a ready example and demo video, but the forum does not let me upload them here.)
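In case it helps to compare notes, here is a minimal sketch of the kind of AX walk in question; the title-matching approach and the assumption that Accessibility permission has been granted are mine, not taken from the post. It simply walks the target app's element tree looking for a title match.

import ApplicationServices

// Sketch only: walks another process's accessibility tree and returns the first
// element whose AXTitle contains the wanted text. Requires the app to be trusted
// in System Settings > Privacy & Security > Accessibility.
func findElement(titled wanted: String, inAppWithPID pid: pid_t) -> AXUIElement? {
    guard AXIsProcessTrusted() else { return nil }
    var stack: [AXUIElement] = [AXUIElementCreateApplication(pid)]

    while let element = stack.popLast() {
        var title: CFTypeRef?
        if AXUIElementCopyAttributeValue(element, kAXTitleAttribute as CFString, &title) == .success,
           let title = title as? String, title.localizedCaseInsensitiveContains(wanted) {
            return element
        }
        var children: CFTypeRef?
        if AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &children) == .success,
           let children = children as? [AXUIElement] {
            stack.append(contentsOf: children)
        }
    }
    return nil
}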