Accessibility

Make your apps work for a broad range of users by adopting the Accessibility APIs across all Apple platforms.

122 posts under the Accessibility tag
How can we make grouped elements accessible for automation?
I have a stack view with two labels:

    class TextView: UIView {
        @IBOutlet private weak var stackView: UIStackView! {
            didSet {
                stackView.isAccessibilityElement = true
                stackView.accessibilityLabel = (label1.text ?? "") + (label2.text ?? "")
            }
        }
        @IBOutlet private weak var label1: UILabel! {
            didSet {
                label1.accessibilityIdentifier = "label1"
            }
        }
        @IBOutlet private weak var label2: UILabel! {
            didSet {
                label2.accessibilityIdentifier = "label2"
            }
        }
    }

My goal is to have a combined accessibility label for the stack view while still being able to access the accessibilityIdentifier of the child elements for automation.
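A minimal sketch of one approach, in case it helps: keep the container as the single VoiceOver element with a combined label, and give the container its own accessibilityIdentifier so automation can still target it. Whether the child identifiers remain queryable once the container becomes one element is worth verifying on-device; the class and identifier names below are illustrative, not from the original post.

    import UIKit

    // Illustrative container: one VoiceOver element with a combined label,
    // still addressable from XCUITest via its own identifier.
    final class CombinedTextView: UIView {
        let label1 = UILabel()
        let label2 = UILabel()

        func configureAccessibility() {
            // Read as a single element with a combined label.
            isAccessibilityElement = true
            accessibilityLabel = [label1.text, label2.text]
                .compactMap { $0 }
                .joined(separator: ", ")

            // Identifier on the container itself, because children of a
            // single accessibility element may not be exposed to automation.
            accessibilityIdentifier = "combinedTextView"  // hypothetical name
        }
    }

In a UI test, the hook would then be app.otherElements["combinedTextView"] rather than the individual labels.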
Replies: 0 · Boosts: 0 · Views: 275 · Last activity: 3d
VoiceOver needs to support CFBundleSpokenName
VoiceOver does not support the plist property CFBundleSpokenName. This is wrong and should be fixed.

The issue I am dealing with is that our app name is UWCU, and instead of VoiceOver pronouncing each letter, it tries to read it as a word and horribly butchers our organization's/app's name. Alternatives such as U.W.C.U. or U W C U are not acceptable.

@Apple, I know your first response is going to be "no, it is working perfectly," but quite frankly you are wrong. I know you feel strongly about this, given your response in posts like this: https://forums.developer.apple.com/forums/thread/734545?answerId=760084022

HOWEVER, with iOS 18, your argument that "VoiceOver should read what's on the screen" doesn't hold water anymore. With iOS 18, you, Apple, have added a new feature that lets users customize their home screens and completely remove the names of apps. Here's your own guide: https://support.apple.com/guide/iphone/customize-apps-and-widgets-on-the-home-screen-iph385473442/ios

Quoted from your guide: "Make the icons bigger: Tap Large. (In large size, the names of the apps disappear.)"

With large icons and VoiceOver turned on, VoiceOver still reads the app name even though it has disappeared from the screen. So your own argument that "VoiceOver should read the text as it appears on the screen" is invalid, because there is NO text on the screen.

If you can't tell, I'm pretty peeved about all this. There's a reason screen readers support ARIA attributes to help deliver the right accessible experience. It's a simple ask for VoiceOver to do the same thing.
Replies: 3 · Boosts: 0 · Views: 421 · Last activity: 1w
UIApplication.shared.open no longer opens a specific Settings page in iOS 18
In iOS 17, calling UIApplication.shared.open("App-prefs:ACCESSIBILITY&path=HEARING_AID_TITLE") opened the device Settings app directly at Accessibility > Hearing Devices, which was very helpful. In iOS 18, the same call only opens Settings at its root.

I would like to know how to change the URL so that it works like before. canOpenURL does return true, so I'm wondering if something is broken, or if canOpenURL is lying a bit. I also tried other paths to reach other screens and they don't work either.
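For reference, the App-prefs scheme is private, so its behaviour can change between releases. A minimal sketch of the only documented alternative I'm aware of, which opens this app's own page in Settings rather than the Hearing Devices screen:

    import UIKit

    // Opens the Settings page for this app -- the only destination with a
    // public, documented URL. Deep links into arbitrary Settings screens
    // rely on the private App-prefs scheme and may break between releases.
    func openAppSettings() {
        guard let url = URL(string: UIApplication.openSettingsURLString),
              UIApplication.shared.canOpenURL(url) else { return }
        UIApplication.shared.open(url)
    }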
Replies: 1 · Boosts: 0 · Views: 154 · Last activity: 3d
"AVSpeechSynthesisVoice" choppy at start.
So, I'm trying to create my own text-to-speech setup. The problem I'm having is that whenever I do a test run, the speech gets a bit choppy at the start, skipping over maybe a word or a few characters.

A few details: I've built a separate class for handling the speech events. AVSpeechSynthesizer is set up as a private variable of that class, so I don't expect deallocation to be the issue, especially since the problem is at the start. I've also got a queue set up, for what it's worth, so that shouldn't be a problem. I'd appreciate any advice.
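One common cause of clipped speech at the start is the audio session being activated lazily when the first utterance begins. A minimal sketch, assuming (without confirmation) that this is the cause here, which configures and activates the session up front and adds a small pre-utterance delay:

    import AVFoundation

    final class Speaker {
        private let synthesizer = AVSpeechSynthesizer()

        init() {
            // Configure and activate the audio session once, up front, so the
            // first utterance isn't clipped while the session spins up.
            let session = AVAudioSession.sharedInstance()
            try? session.setCategory(.playback, mode: .spokenAudio, options: [.duckOthers])
            try? session.setActive(true)
        }

        func speak(_ text: String) {
            let utterance = AVSpeechUtterance(string: text)
            utterance.preUtteranceDelay = 0.1  // small buffer before speech starts
            synthesizer.speak(utterance)
        }
    }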
Replies: 2 · Boosts: 0 · Views: 111 · Last activity: 2w
Resetting the selected accessibility action on a button
I have a record button that either starts or stops a recording using the default action. When the user is recording, I want to add a custom action to discard the recording instead of saving it. That all works fine with the following code:

    if isRecording {
        recordButton.accessibilityCustomActions = [
            .init(name: String(localized: "discard recording"), actionHandler: { [weak delegate] _ in
                delegate?.discardRecording()
                return true
            })
        ]
        recordButton.accessibilityLabel = String(localized: "stop recording", comment: "accessibility label")
    } else {
        recordButton.accessibilityCustomActions = []
        recordButton.accessibilityLabel = String(localized: "start recording", comment: "accessibility label")
    }

The problem I have is that when a user chooses "discard recording", it becomes the default selected action again the next time the user records, and instead of stopping and saving the recording, the user might accidentally discard the next one as well. How can I programmatically reset the selected action on this recordButton to the default action?
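I'm not aware of a documented way to reset which custom action VoiceOver has selected. One thing worth trying (a sketch, not a confirmed fix) is to rebuild the custom-action array with fresh instances and notify VoiceOver that the element changed, so it re-evaluates its state:

    import UIKit

    // Hedged workaround sketch: drop the old action objects and nudge
    // VoiceOver to refresh the element it has focused.
    func resetRecordButtonActions(_ recordButton: UIButton) {
        recordButton.accessibilityCustomActions = []
        UIAccessibility.post(notification: .layoutChanged, argument: recordButton)
    }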
Replies: 0 · Boosts: 0 · Views: 156 · Last activity: 2w
Add words to Voice Control
I want to create a utility to import a list of words to the Voice Control user custom vocabulary. Is there an API to do this? I noticed if you use the built-in export vocabulary functionality (Settings > Accessibility > Voice Control > ...) the file that gets exported is a plist document type. If there is no API to add words programmatically should I just create a utility that generates a plist file and import it using the built-in import vocabulary functionality (Settings > Accessibility > Voice Control > ...)?
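I'm not aware of a public API for writing to the Voice Control vocabulary directly, so the plist route described above seems like the practical path. A minimal sketch of generating such a file; the top-level schema (a plain array of strings) is an assumption and should be checked against a file exported from Settings:

    import Foundation

    // Writes a word list out as an XML property list. Compare the result
    // against a vocabulary file exported from Settings > Accessibility >
    // Voice Control and mirror whatever schema that file actually uses.
    func writeVocabulary(words: [String], to url: URL) throws {
        let data = try PropertyListSerialization.data(
            fromPropertyList: words,
            format: .xml,
            options: 0
        )
        try data.write(to: url)
    }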
Replies: 3 · Boosts: 0 · Views: 162 · Last activity: 2w
[tvOS] VoiceOver Skips Description Text When Info Panel Opens in AVPlayerViewController
When the native info panel (which displays the title, subtitle, description, and custom buttons) opens, the focus immediately shifts to the first button. As a result, VoiceOver skips the description, which is crucial for users relying on accessibility features. I haven’t found a way to detect when it opens. Knowing this would allow me to trigger custom VoiceOver announcements or adjust the focus order dynamically. Are any other people experiencing this issue, and how do we solve it?
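There doesn't seem to be a public hook for detecting when the info panel appears; if a trigger can be found, posting a VoiceOver announcement with the description text is one way to make sure it gets read. A sketch of that half of the problem only:

    import UIKit

    // Reads the given description aloud to VoiceOver users. Detecting when
    // AVPlayerViewController's info panel opens still needs its own trigger.
    func announceDescription(_ text: String) {
        guard UIAccessibility.isVoiceOverRunning else { return }
        UIAccessibility.post(notification: .announcement, argument: text)
    }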
Replies: 0 · Boosts: 0 · Views: 179 · Last activity: 2w
A phone keyboard layout for easy typing!
Last November 13 I came up with a phone keyboard layout (strategy) that can make keys bigger and hence reduce mistyping. Compared with the typical phone keyboard, my proposed keyboard is essentially a split keyboard with the left-hand part stacked above/below the right-hand part. Key size/width/height and the vertical distance between the left-hand part and the right-hand part could be adjustable to suit different phone widths and user hand sizes.

You can display the proposed keyboard's image on your phone, fit it to your phone's width, and simulate typing on it to see how it feels. On my phone, the letter keys are a little too big for my thumbs to reach the farthest keys, but as I said, key size should be adjustable to suit different phone widths and hand sizes.
Replies: 0 · Boosts: 0 · Views: 271 · Last activity: 2w
Triggering keyboard/mouse events anytime iOS
I frequently use my iPad to develop remotely in VSCode or prototype designs in Figma. This is currently all done in the browser. Given that many of these experiences rely heavily on the keyboard, I was hoping there would be a solution to make a keyboard persistent on the screen, or at least a few hot keys. Is it possible for me to develop an accessibility tool that could stay persistent on the screen? Perhaps something that would talk with AssistiveTouch? Or is that in Apple’s no no square?
Replies: 0 · Boosts: 0 · Views: 233 · Last activity: 2w
CoreGraphics: CGPDFPageCopyRootTaggedNode
My app uses PDFKit, but I don't know how to solve this bug at all. On the same iOS version and device model, some users' devices crash while our own devices work normally. The following is the stack trace of the crash:

    0   libsystem_platform.dylib__os_unfair_lock_recursive_abort + 36
    1   libsystem_platform.dylib__os_unfair_lock_lock_slow + 308
    2   CoreGraphics_CGPDFPageCopyRootTaggedNode + 56
    3   PDFKit-[PDFPageViewAccessibility accessibilityElements] + 76
    4   UIAccessibility-[NSObject(AXPrivCategory) _accessibilityElements] + 56
    5   UIAccessibility-[NSObjectAccessibility accessibilityElementCount] + 68
    6   UIAccessibility-[NSObject(AXPrivCategory) _accessibilityHasOrderedChildren] + 44
    7   UIAccessibility-[NSObject(AXPrivCategory) _accessibilityFrameForSorting] + 216
    8   UIAccessibility-[NSObject _accessibilityCompareGeometry:] + 116
    9   UIAccessibility-[NSObject(AXPrivCategory) accessibilityCompareGeometry:] + 52
    10  CoreFoundation___CFSimpleMergeSort + 100
    11  CoreFoundation___CFSimpleMergeSort + 248
    12  CoreFoundation_CFSortIndexes + 260
    13  CoreFoundation-[NSArray sortedArrayFromRange:options:usingComparator:] + 732
    14  CoreFoundation-[NSMutableArray sortedArrayFromRange:options:usingComparator:] + 60
    15  CoreFoundation-[NSArray sortedArrayUsingSelector:] + 168
    16  UIAccessibility___57-[NSObject(AXPrivCategory) _accessibilityFindDescendant:]_block_invoke + 268
    17  UIAccessibility___96-[NSObject(AXPrivCategory) _accessibilityEnumerateAXDescendants:passingTest:byYieldingElements:]_block_invoke + 140
    18  UIAccessibility-[NSObject _accessibilityEnumerateAXDescendants:passingTest:byYieldingElements:] + 244
    19  UIAccessibility-[NSObject _accessibilityFindFirstAXDescendantPassingTest:byYieldingElements:] + 272
    20  UIAccessibility-[NSObject(AXPrivCategory) _accessibilityFindDescendant:] + 100
    21  UIAccessibility__axuiElementForNotificationData + 276
    22  UIAccessibility__massageAssociatedElementBeforePost + 36
    23  UIAccessibility__UIAXBroadcastMainThread + 292
    24  libdispatch.dylib__dispatch_call_block_and_release + 32
    25  libdispatch.dylib__dispatch_client_callout + 20
    26  libdispatch.dylib__dispatch_main_queue_drain + 980
    27  libdispatch.dylib__dispatch_main_queue_callback_4CF + 44
    28  CoreFoundation___CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
    29  CoreFoundation___CFRunLoopRun + 1996
    30  CoreFoundation_CFRunLoopRunSpecific + 572
    31  GraphicsServices_GSEventRunModal + 164
    32  UIKitCore-[UIApplication _run] + 816
    33  UIKitCore_UIApplicationMain + 340
    34  SwiftUI closure #1 (Swift.UnsafeMutablePointer<Swift.UnsafeMutablePointer<Swift.Int8>?>) -> Swift.Never in SwiftUI.(KitRendererCommon in _ACC2C5639A7D76F611E170E831FCA491)(Swift.AnyObject.Type) -> Swift.Never + 168
    35  SwiftUI SwiftUI.runApp(A) -> Swift.Never + 100
    36  SwiftUI static (extension in SwiftUI):SwiftUI.App.main() -> () + 180
Replies: 2 · Boosts: 2 · Views: 172 · Last activity: 3w
Crash when changing the accessibilityElements to custom UIAccessibilityElement
I have a UIControl, and I want to make it behave like a custom UIAccessibilityElement:

    UIControl *control = [[UIControl alloc] init];
    control.isAccessibilityElement = NO;

    CustomAccessibilityElement *elem = [[CustomAccessibilityElement alloc] initWithAccessibilityContainer:control];
    elem.isAccessibilityElement = YES;
    // some custom setting here

    control.accessibilityElements = @[elem];

It worked well on an iPhone 13 with iOS 15.5.1, but crashed on an iPhone SE with iOS 15.4.1 with this message:

    -[UIAccessibilityElement _addAccessibilityElementsAndOrderedContainersWithOptions:toCollection:]: unrecognized selector sent to instance 0x283b7c680

Can you tell me the reason? Thanks a lot.
Replies: 1 · Boosts: 0 · Views: 181 · Last activity: 3w
`accessibilityUserInputLabels` is ignored on `UIBarButtonItem`
accessibilityUserInputLabels works fine on any view I have tried it on, meaning the control can be toggled with the provided alternative names when using Voice Control. When setting this property on a UIBarButtonItem, though, Voice Control seems to ignore the alternative names provided via accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on a UIBarButtonItem. Is anyone facing the same issue? Using Xcode 16.0 (16A242) on iOS 18.
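If the property really is ignored on UIBarButtonItem, one workaround worth trying (not a confirmed fix) is backing the item with a custom view and setting the input labels on that view instead; the title and labels below are illustrative:

    import UIKit

    // Workaround sketch: wrap a UIButton in a UIBarButtonItem and set
    // accessibilityUserInputLabels on the button, where it does take effect.
    func makeSaveBarButtonItem(target: Any?, action: Selector) -> UIBarButtonItem {
        let button = UIButton(type: .system)
        button.setTitle("Save", for: .normal)
        button.addTarget(target, action: action, for: .touchUpInside)
        button.accessibilityUserInputLabels = ["Save", "Save document"]
        return UIBarButtonItem(customView: button)
    }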
Replies: 2 · Boosts: 0 · Views: 278 · Last activity: 2w
Issue: Device Dock pops up on switching the app to or from Single App mode
I have an issue when unlocking the app from Single App mode: the device Dock pops up on top of my app, which disturbs the experience. I have already done the MDM configuration, and the intended functionality works fine with the code below; the app successfully switches into Single App mode and back.

The sample code below reproduces the issue. Tap Lock: the completion block returns true and the app switches into Single App mode. Tap UnLock: the completion block returns true and the app switches out of Single App mode, but now the device Dock pops up on top of the app.

    import SwiftUI
    import UIKit

    struct LockApp: App {
        var body: some Scene {
            WindowGroup {
                VStack {
                    Button("Lock", systemImage: "lock") {
                        UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
                            print("App has been locked", success)
                        }
                    }
                    .buttonStyle(.borderedProminent)

                    Button("UnLock", systemImage: "lock.open") {
                        UIAccessibility.requestGuidedAccessSession(enabled: false) { success in
                            print("App has been unlocked", success)
                        }
                    }
                    .buttonStyle(.borderedProminent)
                }
            }
        }
    }

Xcode version: 16.0. Device: iPad Pro 12.9-inch (6th generation). OS: 18.1.

Is this intended behaviour? Has anyone come across this issue?
Replies: 0 · Boosts: 0 · Views: 120 · Last activity: 4w
VoiceOver focus jumping around when adding UIViewController as Child to another UIViewController
I'm facing an accessibility issue: when I call UIViewController.addChild(_:) and pass in another UIViewController instance, the VoiceOver focus jumps to the "Back" button in the navigation bar. How might one go about avoiding this behaviour and keeping the accessibility/VoiceOver focus where it was at the time of adding the child?
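One technique that often helps after a hierarchy change (whether it helps here depends on timing) is to hand VoiceOver an explicit focus target once the child has been added. A sketch; the container and focus target are whatever fits your layout:

    import UIKit

    extension UIViewController {
        // Adds a child view controller, then asks VoiceOver to move focus to a
        // chosen element instead of letting it jump to the navigation bar.
        func addChildKeepingVoiceOverFocus(_ child: UIViewController,
                                           in container: UIView,
                                           focusTarget: Any?) {
            addChild(child)
            container.addSubview(child.view)
            child.view.frame = container.bounds
            child.didMove(toParent: self)

            // .layoutChanged moves focus to the argument without announcing a
            // full screen change; .screenChanged is the heavier alternative.
            UIAccessibility.post(notification: .layoutChanged, argument: focusTarget)
        }
    }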
Replies: 1 · Boosts: 0 · Views: 213 · Last activity: Oct ’24
guidedAccessStatusDidChangeNotification does not get called on visionOS
I am trying to get a notification when Guided Access is enabled or disabled on the Vision Pro. Normally you would just call:

    NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { notification in
        print("guided access did change")
    }

and this works fine on iOS devices. Running the exact same code on visionOS results in no notification at all, even though Guided Access gets enabled or disabled. For testing I ran a simple default app that works on both OS types:

    import SwiftUI

    struct ContentView: View {
        var body: some View {
            VStack {
                Image(systemName: "globe")
                    .imageScale(.large)
                    .foregroundStyle(.tint)
                Text("Hello, world!")
            }
            .onAppear {
                print("is appearing")
                NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { _ in
                    print("guided access did change")
                }
            }
            .padding()
        }
    }

As said, it prints "guided access did change" on iOS but not on the Vision Pro.
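As a diagnostic (and a possible stopgap if the notification really isn't delivered on visionOS), the current state can also be read directly from UIAccessibility.isGuidedAccessEnabled whenever the scene changes phase. A sketch, untested on Vision Pro:

    import SwiftUI
    import UIKit

    struct GuidedAccessProbeView: View {
        @Environment(\.scenePhase) private var scenePhase
        @State private var guidedAccessOn = UIAccessibility.isGuidedAccessEnabled

        var body: some View {
            Text(guidedAccessOn ? "Guided Access on" : "Guided Access off")
                .onChange(of: scenePhase) { _, _ in
                    // Re-read the flag on every phase change as a stopgap for
                    // the missing notification.
                    guidedAccessOn = UIAccessibility.isGuidedAccessEnabled
                }
        }
    }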
Replies: 0 · Boosts: 0 · Views: 239 · Last activity: Sep ’24
Personal Voice authorization requires app restart
I'm trying to include Apple's Personal Voice feature in an app I'm working on, but I want to use a button or toggle to request access, rather than firing the request on first launch. The problem is that, if AVSpeechSynthesizer is used during the same session before Personal Voice is authorized, the app has to be restarted to use the feature.

Here is a basic example that demonstrates the issue on my iPhone (running 18.1 beta, but the issue was present at least in 18.0, maybe before):

    import AVFoundation
    import SwiftUI

    struct TestView: View {
        let synthesizer = AVSpeechSynthesizer()
        @State private var personalVoices: [AVSpeechSynthesisVoice] = []

        var body: some View {
            VStack(spacing: 100) {
                Text("Personal Voices Available: \(personalVoices.count)")
                Button {
                    speakUtterance(string: "Hello, world!")
                } label: {
                    Image(systemName: "hand.wave.fill")
                        .font(.system(size: 100))
                }
                Button("Fetch Personal Voices") {
                    Task { await fetchPersonalVoices() }
                }
            }
        }

        func fetchPersonalVoices() async {
            AVSpeechSynthesizer.requestPersonalVoiceAuthorization() { status in
                if status == .authorized {
                    personalVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.voiceTraits.contains(.isPersonalVoice) }
                }
            }
        }

        func speakUtterance(string: String) {
            let utterance = AVSpeechUtterance(string: string)
            if let voice = personalVoices.first {
                utterance.voice = voice
            } else {
                utterance.voice = AVSpeechSynthesisVoice(language: Locale.preferredLanguages[0])
            }
            synthesizer.speak(utterance)
        }
    }

If you tap the hand symbol first (before authorizing Personal Voice), you'll probably notice that the Personal Voices Available number never increases. If you authorize Personal Voice before tapping the hand symbol, it speaks using your Personal Voice as expected. The example code is mostly taken directly from this WWDC23 video (Personal Voice info begins around the 10-minute mark). Does anyone have any idea what could be causing this?

Note: Personal Voice can't be tested in Simulator. The code needs to run on a physical device that has Personal Voice set up.
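If a synthesizer created before authorization never picks up Personal Voice, one hedged workaround is to defer creating (or to recreate) the synthesizer until after authorization completes; whether this actually avoids the restart isn't confirmed:

    import AVFoundation

    final class SpeechManager {
        // Created only after the authorization request returns, so no
        // synthesizer exists in the session before Personal Voice is granted.
        private var synthesizer: AVSpeechSynthesizer?

        func authorizeAndSpeak(_ text: String) {
            AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
                DispatchQueue.main.async {
                    let synth = AVSpeechSynthesizer()   // fresh instance after the request
                    self?.synthesizer = synth
                    let utterance = AVSpeechUtterance(string: text)
                    if status == .authorized,
                       let voice = AVSpeechSynthesisVoice.speechVoices()
                           .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) {
                        utterance.voice = voice
                    }
                    synth.speak(utterance)
                }
            }
        }
    }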
Replies: 0 · Boosts: 0 · Views: 238 · Last activity: Sep ’24
VoiceOver ignoring a data series when there are multiple ones
I can't figure out if I've found a VoiceOver problem with Swift Charts or if I'm doing something incorrectly. I have a loop within a loop showing two sets of data in the same chart. If I touch a month, VoiceOver correctly says there are two data series, but if I keep swiping down, only data from the first series is read.

ChatGPT suggested referencing the outer loop, and sure enough that worked, but only if it's done in BOTH the label and the value. It sounds really awkward though, for example "High 89 degrees F High October". The "bad" chart below only says something such as "92 degrees F October" when swiping down; the "good" chart reads both the high and low temperature data.

    VStack {
        headerText("BAD")
        Chart {
            ForEach(processedMonthlyInput) { oneMonth in
                ForEach(oneMonth.temperatures, id: \.month) { element in
                    LineMark(
                        x: .value("Month", element.month, unit: .month),
                        y: .value("Temperature", element.tempVal.converted(to: .fahrenheit).value)
                    )
                    .accessibilityLabel("\(element.month.formatted(.dateTime.month(.wide)))")
                    .accessibilityValue(Text("\(element.tempVal.converted(to: tempUnit).formatted(.measurement(width: .abbreviated, numberFormatStyle: .number.precision(.fractionLength(0)))))"))
                }
                .symbol(by: .value("Type", oneMonth.theType))
                .foregroundStyle(by: .value("Type", oneMonth.theType))
                .interpolationMethod(.catmullRom)
            }
        }
        .frame(maxHeight: paddingAmount)
        .padding(.horizontal)

        headerText("GOOD")
        Chart {
            ForEach(processedMonthlyInput) { oneMonth in
                ForEach(oneMonth.temperatures, id: \.month) { element in
                    LineMark(
                        x: .value("Month", element.month, unit: .month),
                        y: .value("Temperature", element.tempVal.converted(to: .fahrenheit).value)
                    )
                    .accessibilityLabel("\(oneMonth.theType) \(element.month.formatted(.dateTime.month(.wide)))")
                    .accessibilityValue(Text("\(oneMonth.theType) \(element.tempVal.converted(to: tempUnit).formatted(.measurement(width: .abbreviated, numberFormatStyle: .number.precision(.fractionLength(0)))))"))
                }
                .symbol(by: .value("Type", oneMonth.theType))
                .foregroundStyle(by: .value("Type", oneMonth.theType))
                .interpolationMethod(.catmullRom)
            }
        }
        .frame(maxHeight: paddingAmount)
        .padding(.horizontal)
    }
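Separately from whether this is a bug, Swift Charts also supports describing the whole chart once through the Audio Graphs API, which exposes both series to VoiceOver regardless of how the individual marks are labelled. A sketch under assumed data types (MonthTemp is hypothetical, not from the post):

    import Accessibility
    import SwiftUI

    // Hypothetical month/temperature record used only for this sketch.
    struct MonthTemp {
        let monthName: String
        let high: Double
        let low: Double
    }

    struct TemperatureChartDescriptor: AXChartDescriptorRepresentable {
        let data: [MonthTemp]

        func makeChartDescriptor() -> AXChartDescriptor {
            let xAxis = AXCategoricalDataAxisDescriptor(
                title: "Month",
                categoryOrder: data.map(\.monthName)
            )
            let values = data.flatMap { [$0.high, $0.low] }
            let yAxis = AXNumericDataAxisDescriptor(
                title: "Temperature (°F)",
                range: (values.min() ?? 0)...(values.max() ?? 100),
                gridlinePositions: []
            ) { "\(Int($0)) degrees" }

            let highSeries = AXDataSeriesDescriptor(
                name: "High",
                isContinuous: true,
                dataPoints: data.map { AXDataPoint(x: $0.monthName, y: $0.high) }
            )
            let lowSeries = AXDataSeriesDescriptor(
                name: "Low",
                isContinuous: true,
                dataPoints: data.map { AXDataPoint(x: $0.monthName, y: $0.low) }
            )

            return AXChartDescriptor(
                title: "Monthly temperatures",
                summary: "High and low temperature per month.",
                xAxis: xAxis,
                yAxis: yAxis,
                additionalAxes: [],
                series: [highSeries, lowSeries]
            )
        }
    }

Attaching it is a single modifier on the Chart: .accessibilityChartDescriptor(TemperatureChartDescriptor(data: months)).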
Replies: 0 · Boosts: 0 · Views: 176 · Last activity: Sep ’24
Urgent: CursorUIViewService & hiservices-xpcservice Issues
Hi everyone, I would appreciate your help with the topic mentioned above. I'm seeking a solution for the issue I linked below: https://discussions.apple.com/thread/255668660?sortBy=best

Apple Support said I could get a faster response here. I've also submitted the issue to Apple Support, and they said it's currently with an Apple engineer, but things are moving a bit slowly there. I'm writing a similar explanation to the one I posted on the discussion forum. It's been months, and I hope we can get a result here.

Here is the problem: I've noticed that the "CursorUIViewService" process in Activity Monitor becomes 'not responding' and causes significant lag on my MacBook Air (M3), especially when typing and switching between upper- and lower-case letters. It appears this process also controls the blue caps-lock indicator, which stops working when the process is unresponsive. This issue seems to cause the lag, and the process is currently using about 170 MB of RAM.

Additionally, the "com.apple.hiservices-xpcservice" process also becomes unresponsive, though it usually doesn't exceed 3.5 MB of RAM. This process actually becomes 'not responding' much more frequently than CursorUIViewService. The possibility that it might be related to CursorUIViewService pushed me to research this issue as well. I can see that there have been complaints about this process for years, but it seems no solution has been produced.

By the way, I've tried everything: a clean install, diagnostics, First Aid, and I still encounter the problem. Has anyone else experienced this issue or found a solution?

As an update, the "com.apple.hiservices-xpcservice" process still becomes unresponsive on macOS Sequoia (15.0). However, because CursorUIViewService was causing problems less often on Sonoma, I can't say the issue is completely resolved just yet; I need to monitor the situation. Thank you!
Replies: 0 · Boosts: 1 · Views: 359 · Last activity: Sep ’24
VoiceOver in Xcode 16 doesn't allow adding plist keys to Info.plist anymore
Hi all, this post is from a blind user on Reddit looking for assistance. He created a ticket in the Feedback app but was hoping there was another solution. He is unable to post to these forums himself because VoiceOver doesn't allow him to set the tags appropriately, so he asked that someone post it here on his behalf. Below is his post:

I've discovered another issue with VoiceOver in Xcode 16: we can't add plist keys to Info.plist anymore. We can create a new row, but choosing a suggestion with VoiceOver is impossible. My current workaround is to add a random key, save the file, and then open it as source code. I can edit the file in the code editor, but I lose the great autocompletion and plist handling. This is slowing me down, and I'm very unhappy with it. I'd appreciate it if you could share this post widely. Hopefully, a solution will be found. Thanks everyone!

(I can't link his original post here as it's not allowed.) In short, is there any trick to getting VoiceOver to work with plist files in Xcode 16?
Replies: 3 · Boosts: 0 · Views: 306 · Last activity: Sep ’24