Explore best practices for creating inclusive apps that cater to users with diverse abilities


User interface in Ainu.
I have been mulling this over for many years: user interface support for Uralic and Siberian languages. Ainu of Japan is only supported by typing romanized input that is rendered into Katakana with a few small modified characters; there is no user interface, spell checker, grammar checker, dictionary, or translator. Of course, Ainu has few terms for modern vocabulary, but I am studying the language in order to find words and coin new ones, such as hoomi-ye-p ("electric speaking thing") for iPhone. I am looking for other people who have the same idea.
Replies: 3 · Boosts: 1 · Views: 987 · Activity: Dec ’16
Unreliable Accessibility API behavior when interacting with Microsoft Teams app
I'm developing a macOS app that interacts with Microsoft Teams using the Accessibility API. I've noticed inconsistent behavior when querying UI elements, particularly the mute button. My queries often fail, while system tools like VoiceOver can consistently access these elements (which are visible on screen). In some cases my queries work, but in others the UI elements are not visible from my code. When I try Accessibility Inspector, it also initially fails to inspect. However, the Inspector seems to have some "magical" power: running it, or triggering an AX audit, appears to refresh the AX tree, after which my code occasionally works as well. Given that VoiceOver can consistently read the screen, I assume the issue is not with the Microsoft Teams app itself (which I believe is based on Electron/React). I mention this because when I interact with the Zoom app, reading the mute status from the app's menu bar works 100% of the time. What would you recommend I try or explore to improve reliability? Can I refresh the app's AX tree from my end, from Swift? Is this a bug in the AX API, or even in Microsoft Teams? (I have an example and a demo video ready, but the forum does not let me upload them here.)
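A minimal sketch of one thing worth trying, assuming the Teams client is Electron-based: Chromium/Electron apps often build their AX tree lazily, only after an assistive client announces itself, which would explain the Inspector's "magical" refresh. The "AXManualAccessibility" attribute used below is an undocumented Electron convention, not a documented Apple API, and the bundle identifier is a placeholder:

```swift
import AppKit
import ApplicationServices

// Sketch: nudge an Electron-based app to populate its accessibility tree
// before querying it. The querying app needs Accessibility permission in
// System Settings > Privacy & Security.
func enableAXTree(forBundleID bundleID: String) {
    guard let app = NSRunningApplication
        .runningApplications(withBundleIdentifier: bundleID).first else { return }
    let appElement = AXUIElementCreateApplication(app.processIdentifier)

    // Undocumented Chromium/Electron attribute (assumption; verify per app).
    AXUIElementSetAttributeValue(appElement,
                                 "AXManualAccessibility" as CFString,
                                 kCFBooleanTrue)

    // Then query children as usual.
    var children: CFTypeRef?
    let err = AXUIElementCopyAttributeValue(appElement,
                                            kAXChildrenAttribute as CFString,
                                            &children)
    print("AXChildren query result:", err.rawValue)
}

// enableAXTree(forBundleID: "com.microsoft.teams2") // placeholder bundle ID
```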
Replies: 2 · Boosts: 0 · Views: 414 · Activity: Sep ’24
Accessibility VoiceOver issue for UIDatePicker
The react-native-date-picker is based on UIDatePicker on iOS. The UIDatePicker element is a wrapper containing other elements that a VoiceOver user can interact with (year, month, day and so on). I'm using react-native-date-picker with the UK date format (day month year). If I change the day, for example from 1 July 2024 to 2 July 2024, VoiceOver says "2 of 12", but 12 is the month total. If I change the month, for example from 1 July 2024 to 1 August 2024, VoiceOver says "8 of 31", but 31 is the day total. It's as if the announced totals are swapped between the wheels. Please help me out on how to solve this issue, or let me know if I am missing anything. iOS 16.3.1, react-native-date-picker 4.4.2.
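A minimal sketch for reproducing this outside React Native, assuming react-native-date-picker wraps a wheels-style UIDatePicker (the style and locale below are assumptions matching the report):

```swift
import UIKit

// Reproduction sketch: a wheels-style date picker in UK day-month-year order.
// With VoiceOver on, swipe up/down on the day wheel and check whether the
// "x of y" announcement's total (y) matches the wheel being adjusted.
let picker = UIDatePicker()
picker.datePickerMode = .date
picker.preferredDatePickerStyle = .wheels
picker.locale = Locale(identifier: "en_GB") // day month year
```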
Replies: 1 · Boosts: 0 · Views: 285 · Activity: Sep ’24
Default localization fallback not used for untranslated string in catalog
For an iOS app, I'm using a String Catalog for localization. In Localizable.xcstrings, I have all the keys set with IDs, English set as the default, and other languages added. All of the English strings have some English text value assigned; the keys and the values are different, and the Swift code only references the keys. As for the non-English languages, not all of the strings have been translated: many fields are not set (they appear in grey, showing the English text), and I can see the "new" tags for those untranslated strings in the String Catalog. All of that is good and expected.

However, when I run the app in a non-English language, for the untranslated strings I'm seeing the string key instead of the English value. The screens in the app show key IDs all over the place, along with some translated text, but none of the default English. Is that how it's supposed to work? It seems like the "Default Localization (en)" should be shown; that's the whole point of having a fallback default language, is it not? I understand that before String Catalogs the fallback was the key itself, but now I'd expect it to use the default language as backup, since that's what's shown in the catalog.
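For reference, a minimal sketch of the setup being described (the key name and values here are illustrative assumptions, not from the original post):

```swift
import SwiftUI

// Key "home.greeting" exists in Localizable.xcstrings with the English value
// "Welcome back!"; its French entry is untranslated (tagged "new").
// Expected on a French device: "Welcome back!" (the English fallback).
// Observed per the report: the raw key "home.greeting".
struct GreetingView: View {
    var body: some View {
        Text("home.greeting") // SwiftUI treats this literal as a LocalizedStringKey
        // Equivalent non-SwiftUI lookup:
        // let s = String(localized: "home.greeting")
    }
}
```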
Replies: 4 · Boosts: 0 · Views: 409 · Activity: Sep ’24
Receipt verification: verification failed with status VERIFICATION_FAILURE
Hello team. There is a flow where the app creates an order; when the purchase is paid, the app sends me the credentials, which I accept and verify. The verify_and_decode_signed_transaction function raises an error:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='ocsp.apple.com', port=80): Max retries exceeded with url: /ocsp03-applerootcag3 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1123c7d00>: Failed to establish a new connection: [Errno 61] Connection refused'))

The above exception was the direct cause of the following exception:

appstoreserverlibrary.signed_data_verifier.VerificationException: Verification failed with status VERIFICATION_FAILURE

```python
def verify_purchase(self, request, *args, **kwargs):
    try:
        receipt_data = request.data.get('receipt_data')
        transaction_id = receiptUtilInstance.extract_transaction_id_from_app_receipt(receipt_data)
        sendResponse: TransactionInfoResponse = appleClientInstance.client.get_transaction_info(transaction_id)
        signedPayLoad: str = sendResponse.signedTransactionInfo
        time.sleep(3)
        payload = signedDataInstance.client.verify_and_decode_signed_transaction(signedPayLoad)
    except Exception:  # handler truncated in the original post
        raise
```

Please help me.
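The traceback shows the failure happening inside the library's online OCSP revocation check against ocsp.apple.com on port 80, which suggests the host cannot make that outbound connection. A minimal sketch of one workaround, assuming the Python app-store-server-library: either open egress to ocsp.apple.com:80, or construct the SignedDataVerifier with online checks disabled, accepting weaker revocation checking. Paths and identifiers below are placeholders:

```python
from appstoreserverlibrary.models.Environment import Environment
from appstoreserverlibrary.signed_data_verifier import SignedDataVerifier

# Apple root certificates bundled with the deployment (placeholder path).
root_certs = [open("certs/AppleRootCA-G3.cer", "rb").read()]

verifier = SignedDataVerifier(
    root_certificates=root_certs,
    enable_online_checks=False,       # skips the failing OCSP fetch; weaker security
    environment=Environment.SANDBOX,  # or Environment.PRODUCTION
    bundle_id="com.example.app",      # placeholder
)

# payload = verifier.verify_and_decode_signed_transaction(signed_payload)
```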
Replies: 1 · Boosts: 0 · Views: 385 · Activity: Sep ’24
guidedAccessStatusDidChangeNotification does not get called on visionOS
I am trying to get a notification when Guided Access is enabled or disabled on the Vision Pro. Normally you would just call:

```swift
NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
                                       object: nil,
                                       queue: .main) { _ in
    print("guided access did change")
}
```

and this works fine on iOS devices. But running the exact same code on visionOS results in no notification at all, even though Guided Access gets enabled or disabled. For testing, I ran a simple default app that works perfectly on both OS types:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .onAppear {
            print("is appearing")
            NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
                                                   object: nil,
                                                   queue: .main) { _ in
                print("guided access did change")
            }
        }
        .padding()
    }
}
```

As said, it prints "guided access did change" on iOS but not on the Vision Pro.
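Not an answer to whether this is a visionOS bug, but a hedged workaround sketch: poll UIAccessibility.isGuidedAccessEnabled whenever the scene becomes active, instead of relying on the notification (the two-parameter onChange below assumes iOS 17/visionOS 1 or later):

```swift
import SwiftUI
import UIKit

// Workaround sketch: re-read the Guided Access flag on scene activation,
// since the change notification reportedly never fires on visionOS.
struct GuidedAccessWatcher: View {
    @Environment(\.scenePhase) private var scenePhase
    @State private var guidedAccessOn = UIAccessibility.isGuidedAccessEnabled

    var body: some View {
        Text(guidedAccessOn ? "Guided Access is on" : "Guided Access is off")
            .onChange(of: scenePhase) { _, newPhase in
                if newPhase == .active {
                    guidedAccessOn = UIAccessibility.isGuidedAccessEnabled
                }
            }
    }
}
```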
Replies: 0 · Boosts: 0 · Views: 257 · Activity: Sep ’24
iOS 18.1 Beta 5 bugs
I've just updated to the iOS 18.1 Beta 5 update and I'm not able to see FaceTime audio/video calls in the phone call log at all; they are nowhere to be found. Even in FaceTime there is no summary such as call duration or a recent-calls log; only the last call is listed. Please resolve this issue and show FaceTime call logs in the phone call log again, as that was perfect and well categorised.
Replies: 1 · Boosts: 0 · Views: 752 · Activity: Sep ’24
Personal Voice authorization requires app restart
I'm trying to include Apple's Personal Voice feature in an app I'm working on, but I want to use a button or toggle to request access, rather than firing the request on first launch. The problem is that, if AVSpeechSynthesizer is used during the same session before Personal Voice is authorized, the app has to be restarted to use the feature. Here is a basic example that demonstrates the issue on my iPhone (running the 18.1 beta, but the issue was present at least in 18.0, maybe before):

```swift
import AVFoundation
import SwiftUI

struct TestView: View {
    let synthesizer = AVSpeechSynthesizer()
    @State private var personalVoices: [AVSpeechSynthesisVoice] = []

    var body: some View {
        VStack(spacing: 100) {
            Text("Personal Voices Available: \(personalVoices.count)")
            Button {
                speakUtterance(string: "Hello, world!")
            } label: {
                Image(systemName: "hand.wave.fill")
                    .font(.system(size: 100))
            }
            Button("Fetch Personal Voices") {
                Task { await fetchPersonalVoices() }
            }
        }
    }

    func fetchPersonalVoices() async {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            if status == .authorized {
                personalVoices = AVSpeechSynthesisVoice.speechVoices()
                    .filter { $0.voiceTraits.contains(.isPersonalVoice) }
            }
        }
    }

    func speakUtterance(string: String) {
        let utterance = AVSpeechUtterance(string: string)
        if let voice = personalVoices.first {
            utterance.voice = voice
        } else {
            utterance.voice = AVSpeechSynthesisVoice(language: Locale.preferredLanguages[0])
        }
        synthesizer.speak(utterance)
    }
}
```

If you tap the hand symbol first (before authorizing Personal Voice), you'll probably notice that the "Personal Voices Available" number never increases. If you authorize Personal Voice before tapping the hand symbol, it should speak using your Personal Voice as expected. The example code is mostly taken directly from this WWDC23 video (Personal Voice info begins around the 10-minute mark). Does anyone have any idea what could be causing this?

Note: Personal Voice can't be tested in the Simulator. To test, the code will need to be run on a physical device that has Personal Voice set up.
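One side note on the example, as a hedged refactor sketch: fetchPersonalVoices() is declared async but drives a callback API, so the await returns before the status arrives. Wrapping the request in a continuation makes the await meaningful (this does not address the restart issue itself):

```swift
import AVFoundation

// Refactor sketch: bridge the callback-based authorization request into
// async/await so callers actually wait for the status.
func requestPersonalVoiceStatus() async -> AVSpeechSynthesizer.PersonalVoiceAuthorizationStatus {
    await withCheckedContinuation { continuation in
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            continuation.resume(returning: status)
        }
    }
}
```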
Replies: 0 · Boosts: 0 · Views: 253 · Activity: Sep ’24
Screen capture on macOS 15 not working through SSH
Steps: connect to the Mac through SSH and execute "screencapture abc.jpg". It shows the attached popup, even though we have added "ssh-keygen-wrapper" to control the computer under Privacy & Security > Accessibility and under Privacy & Security > Screen & System Audio Recording in System Settings. Ideally we would not see this popup, since we have granted enough permissions to take the screenshot as mentioned above; the same works fine on macOS 14 and below. Even after clicking "Allow for one week", a new SSH session shows the popup again.
Replies: 0 · Boosts: 0 · Views: 174 · Activity: Sep ’24
White screen in iOS 18
Our app, BookApp Business (https://apps.apple.com/it/app/bookapp-business/id1511129368), does not work on iOS 18: when it starts, it opens a white screen. It is developed in Xamarin.Forms and we cannot update it. Are there any solutions? Thanks for your support. Alex
Replies: 5 · Boosts: 0 · Views: 647 · Activity: Sep ’24
In-app purchase: Commission and taxes
I’m writing to understand how taxes are applied to our subscription revenue and to clarify a potential issue we’re facing. For a ₹299 subscription:

- Apple deducts a 30% commission, leaving us with ₹209.30.
- After that, an additional tax of approximately 18% (₹36.11) is deducted by Apple, reducing the amount further.
- When we receive the remaining ₹173.19 in our bank, we are required to pay 18% GST again on this amount in India.

This appears to result in us paying tax twice. Could you clarify why the 18% tax is deducted by Apple, and, if we are not required to pay it again, where we can see this adjustment on the GST portal?
Replies: 0 · Boosts: 0 · Views: 266 · Activity: Sep ’24
Disabling New Hand Gesture Features in Vision Pro App on visionOS 2
Hi everyone, I'm developing a Vision Pro app using the latest visionOS 2, and I've encountered some issues with the new hand gestures introduced in this update. My app is designed to display a UI element when a user's palm is detected. However, the new hand gestures for navigating key functions like Home View, Control Center, and adjusting the volume are interfering with my app's functionality.

What I'm trying to achieve:

- Detect when a user's palm is open and display a UI element.
- Ensure that my app's custom hand gestures are not disturbed by the new default gestures in visionOS 2.

Problem: the new hand gestures in visionOS 2 (such as those for Home View, Control Center, and volume adjustment) activate while my app is open, disrupting its functionality. I want to disable these system-level gestures when my app is running.
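On the palm-detection half of this, a hedged sketch using ARKit hand tracking on visionOS; to my knowledge there is no public API to disable the system Home View/Control Center gestures, so this only covers deciding when to show the UI element. The joint choice and heuristic are illustrative assumptions:

```swift
import ARKit

// Palm-detection sketch (visionOS, inside an ImmersiveSpace; hand tracking
// is unavailable in a plain window). Requires hand-tracking authorization.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func watchHands() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
        // Compare wrist/knuckle transforms to estimate palm orientation,
        // then show or hide the UI element accordingly.
        _ = skeleton.joint(.wrist).anchorFromJointTransform
    }
}
```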
Replies: 3 · Boosts: 2 · Views: 1.2k · Activity: Jun ’24