Can someone please explain why the Voice Memos layered recording feature is exclusive to iPhone 16 Pro models?
I was told by Apple Support that if it were possible on older models, Apple would have implemented it.
This type of functionality has been in technology for decades and has a proven track record of being implemented across many different platforms, and even in apps on Apple's own App Store. So that response from Apple Support is, to be honest, just rubbish. Even if the real answer is "so we can sell the iPhone 16 Pro", then so be it. I've noticed Apple locking software features to new devices lately, and it's genuinely making me think about buying my first ever Android phone, because I really don't support these kinds of scummy tactics.
Many thanks,
Xcode 16.0: "An Internal Error Occurred. Editing functionality may be limited."
After updating to iOS 18, I've noticed that my app's icon displays correctly on the home screen but appears with a black background in the 'Settings > Apps' section when Dark Mode is active.
The icon currently has a non-transparent gradient orange background, and I haven't set up a separate icon for Dark Mode. Has anyone else experienced this issue, or does anyone have suggestions for how to ensure the app icon looks the same in Dark Mode as it does in Light Mode?
MacBook Air 2016 will not switch on; it just stays on the loading Apple screen.
Please assist me, anyone.
IO80211AWDLPeerManager::setAwdlSuspendedMode() Suspending AWDL, enterQuietMode(true)
IO80211AWDLPeerManager::setAwdlOperatingMode Setting the AWDL operation mode from SUSPENDED to AUTO
IO80211AWDLPeerManager::setAwdlAutoMode Resuming AWDL
IO80211AWDLPeerManager::setAwdlOperatingMode Setting the AWDL operation mode from AUTO to SUSPENDED
(this four-line cycle repeats continuously in the log)
I have two subscriptions, monthly and yearly.
First, DID_CHANGE_RENEWAL_PREF + DOWNGRADE: the customer downgrades a subscription within the same subscription group. My current subscription is yearly; if I change to monthly, then when the yearly subscription expires, automatic renewal switches to monthly.
Second, DID_CHANGE_RENEWAL_PREF + UPGRADE: the customer upgrades a subscription within the same subscription group. My current subscription is monthly; if I change to yearly, I pay for the annual subscription immediately and the subscription switches to annual immediately.
Third, DID_CHANGE_RENEWAL_PREF with subtype None: the customer reverts to the previous subscription, effectively cancelling their downgrade.
What does this mean? This is the test environment, where a month is five minutes and a year is one hour.
① My current subscription is annual: startTime 2024-10-02 15:04:58, expireTime 2024-10-02 16:04:58.
② First I DOWNGRADE to monthly, at 2024-10-02 15:14:16.
③ After 38 minutes, I change back to the annual subscription, at 2024-10-02 15:52:00. In the end, the notification's purchaseDate is 2024-10-02 15:52:00 and its expiresDate is 2024-10-02 16:52:00.
So when I receive NotificationType=DID_CHANGE_RENEWAL_PREF with NotificationSubType=None, do I need to create a new subscription for the user? Are the purchaseDate and expiresDate in the latest notification for a full year?
The Apple notification payload is as follows:
JWSTransactionDecodedPayload(originalTransactionId='2000000731045285', transactionId='2000000731088945', webOrderLineItemId='2000000076096676', bundleId='app.xxxx', productId='com.xxxx.365', subscriptionGroupIdentifier='21514251', purchaseDate=1727855520000 (2024-10-02 15:52:00), originalPurchaseDate=1727852699000 (2024-10-02 15:04:59), expiresDate=1727859120000 (2024-10-02 16:52:00), quantity=1, type=<Type.AUTO_RENEWABLE_SUBSCRIPTION: 'Auto-Renewable Subscription'>, rawType='Auto-Renewable Subscription', appAccountToken='fa37b7a2-2b0b-43cb-8fda-a1fb21168efe', inAppOwnershipType=<InAppOwnershipType.PURCHASED: 'PURCHASED'>, rawInAppOwnershipType='PURCHASED', signedDate=1727855526632 (2024-10-02 15:52:06), revocationReason=None, rawRevocationReason=None, revocationDate=None, isUpgraded=None, offerType=None, rawOfferType=None, offerIdentifier=None, environment=<Environment.SANDBOX: 'Sandbox'>, rawEnvironment='Sandbox', storefront='CAN', storefrontId='143455', transactionReason=<TransactionReason.PURCHASE: 'PURCHASE'>, rawTransactionReason='PURCHASE', currency='CAD', price=14990, offerDiscountType=None, rawOfferDiscountType=None)
JWSRenewalInfoDecodedPayload(expirationIntent=None, rawExpirationIntent=None, originalTransactionId='2000000731045285', autoRenewProductId='com.xxxx.365', productId='com.xxxx.365', autoRenewStatus=<AutoRenewStatus.ON: 1>, rawAutoRenewStatus=1, isInBillingRetryPeriod=None, priceIncreaseStatus=None, rawPriceIncreaseStatus=None, gracePeriodExpiresDate=None, offerType=None, rawOfferType=None, offerIdentifier=None, signedDate=1727855526632 (2024-10-02 15:52:06), environment=<Environment.SANDBOX: 'Sandbox'>, rawEnvironment='Sandbox', recentSubscriptionStartDate=1727852698000 (2024-10-02 15:04:58), renewalDate=1727859120000 (2024-10-02 16:52:00), currency='CAD', renewalPrice=14990, offerDiscountType=None, rawOfferDiscountType=None, eligibleWinBackOfferIds=None)
I have been mulling this over for many years: Uralic and Siberian language user interface support. Ainu of Japan is only supported by writing in roman letters and rendering into Katakana with a few small modified characters. There is no user interface, spell checker, grammar checker, dictionary, or translator. Of course, Ainu has few terms in modern vocabulary, but I am studying the language in order to find words and coin new ones, for example iPhone: hoomi-ye-p, "electric speaking thing". I am looking for other people who have the same idea.
Same problem as everyone else is getting after the iOS 18 update.
I'm developing a macOS app that interacts with Microsoft Teams using the Accessibility API. I've noticed inconsistent behavior when querying UI elements, particularly for the mute button. My queries often fail, while system tools like VoiceOver can consistently access these elements (which are visible on the screen).
In some cases it works well, but in others the UI elements are not visible from my code. When I try Accessibility Inspector, it also initially fails to inspect. However, the Inspector seems to have some "magical" power: when I run it, or run an AX audit, it appears to refresh the AX tree, and then my code occasionally works as well.
Given that VoiceOver can consistently read the screen, I assume the issue is not with the Microsoft Teams app itself (assuming it's based on Electron/React). I mention this because when I interact with the Zoom app, reading the mute status from the app's menu bar works 100% of the time.
What would you recommend I try or explore to improve reliability?
Can I refresh the app's AX tree from my end, from Swift?
Is this a bug in the AX API, or even in Microsoft Teams?
(I have an example and a demo video ready, but the forum does not let me upload them here.)
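For illustration, here is a minimal sketch of the kind of query that fails, assuming the Teams process ID is already known; the function name and the title-based search are illustrative, not the actual code:

import ApplicationServices

// Hypothetical helper: walks the top-level AX children of a running app
// looking for a button whose title mentions "mute".
func findMuteButton(pid: pid_t) -> AXUIElement? {
    let app = AXUIElementCreateApplication(pid)

    var children: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(app, kAXChildrenAttribute as CFString, &children)
    guard result == .success, let elements = children as? [AXUIElement] else {
        // This is the failure path described above: the query returns an error
        // even though VoiceOver can read the same elements on screen.
        return nil
    }

    for element in elements {
        var title: CFTypeRef?
        if AXUIElementCopyAttributeValue(element, kAXTitleAttribute as CFString, &title) == .success,
           let titleString = title as? String,
           titleString.localizedCaseInsensitiveContains("mute") {
            return element
        }
        // A full implementation would recurse into each child's kAXChildrenAttribute here.
    }
    return nil
}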
The react-native-date-picker is based on UIDatePicker on iOS. The UIDatePicker element is a wrapper containing other elements that the VoiceOver user can interact with (year, month, day and so on).
I'm using react-native-date-picker with the UK date format (day month year). If I change the day, for example from 1 July 2024 to 2 July 2024, VoiceOver says "2 of 12", but 12 is the total number of months. If I change the month, for example from 1 July 2024 to 1 August 2024, VoiceOver says "8 of 31", but 31 is the total number of days. It's weird.
Please help me out on how to solve this issue or let me know if I am missing anything.
iOS 16.3.1
react-native-date-picker 4.4.2
For an iOS app, I'm using a String Catalog for localization.
In Localizable.xcstrings, I have all the keys set with IDs, the default set to English, and other languages added. All of the English strings have an English text value assigned. The keys and the values are different, and the Swift code only references the keys.
As for the non-English languages, not all of the strings have been translated. Many of the fields are not set, i.e. it's in grey and showing the English text, and I can see the "new" tags for those untranslated strings in the String Catalog.
All of that is good and expected.
However, when I run the app in a non-English language, for the untranslated strings I'm seeing the string key instead of the English value. The screens in the app show key IDs all over the place, along with some translated text, but none of the default English.
Is that how it's supposed to work?
It seems like the "Default Localization (en)" value should be shown. That's the whole point of having a fallback default language, is it not?
I understand that pre-String Catalogs the fallback default was the key, but now I'd expect it to use the default language as the backup, since that's what's shown in the catalog.
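For reference, this is how the keys are referenced in the code; "settings.title" is a hypothetical key standing in for one of the actual catalog keys:

import SwiftUI

// "settings.title" is an illustrative key from Localizable.xcstrings, not a real one.
// The expectation described above is that when the current language has no translation,
// the value from the default (en) localization is shown rather than the raw key.
struct SettingsHeader: View {
    var body: some View {
        VStack {
            Text("settings.title")                    // LocalizedStringKey lookup in the catalog
            Text(String(localized: "settings.title")) // same key resolved from plain Swift code
        }
    }
}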
Hello team,
There is a flow where the app creates an order; when the order is fully paid, the app sends me credentials, which I accept and verify.
The verify_and_decode_signed_transaction function raises an error:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='ocsp.apple.com', port=80): Max retries exceeded with url: /ocsp03-applerootcag3 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1123c7d00>: Failed to establish a new connection: [Errno 61] Connection refused'))
The above exception was the direct cause of the following exception:
appstoreserverlibrary.signed_data_verifier.VerificationException: Verification failed with status VERIFICATION_FAILURE
def verify_purchase(self, request, *args, **kwargs):
    try:
        receipt_data = request.data.get('receipt_data')
        transaction_id = receiptUtilInstance.extract_transaction_id_from_app_receipt(receipt_data)
        sendResponse: TransactionInfoResponse = appleClientInstance.client.get_transaction_info(transaction_id)
        signedPayLoad: str = sendResponse.signedTransactionInfo
        time.sleep(3)
        # This is the call that raises VerificationException (VERIFICATION_FAILURE) when ocsp.apple.com is unreachable.
        payload = signedDataInstance.client.verify_and_decode_signed_transaction(signedPayLoad)
    except Exception:
        # The rest of the handler is not shown in the post.
        raise
please help me
So I got the iPhone 15, and I also have the beta update. I'm curious as to whether I will be able to get Apple Intelligence.
I am trying to get a notification when Guided Access is enabled or disabled on the Vision Pro.
For doing so you would normally just call:
NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { noti in
    print("guided access did change")
}
and this works fine on iOS devices.
But running the exact same code on visionOS results in no notification at all, even though Guided Access gets enabled or disabled.
For testing, I ran a simple default app that runs on both OS types.
import SwiftUI
import UIKit

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .onAppear {
            print("is appearing")
            NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { _ in
                print("guided access did change")
            }
        }
        .padding()
    }
}
But as I said, it prints "guided access did change" on iOS, but not on the Vision Pro.
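As a cross-check (illustrative only, not a fix), the flag that this notification tracks can also be read directly, which makes it possible to log whether the value itself changes on visionOS even though the notification never fires:

import UIKit

// Prints the current Guided Access state; UIAccessibility.isGuidedAccessEnabled is the
// value that guidedAccessStatusDidChangeNotification is supposed to report changes for.
func logGuidedAccessState() {
    print("Guided Access enabled:", UIAccessibility.isGuidedAccessEnabled)
}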
Hey!
Any plans on supporting Photogrammetry Sessions on visionOS?
https://developer.apple.com/documentation/realitykit/photogrammetrysession
Thanks!
I've just updated to iOS 18.1 Beta 5 and I'm not able to see FaceTime audio/video calls in the phone call log at all; it's as if they're nowhere. Even in FaceTime there is no summary like call duration or a recent-calls log, just the last call listed.
Please resolve this issue and have FaceTime call logs shown in the phone call log, as it was perfect and categorised before.
I am an iPhone 11 user and I got the iPhone 16 when it launched. With the iOS 18.1 beta software update on the iPhone 16, I get the same battery life as on my iPhone 11.
I'm trying to include Apple's Personal Voice feature in an app I'm working on, but I want to use a button or toggle to request access, rather than firing the request on first launch. The problem is that, if AVSpeechSynthesizer is used during the same session, before Personal Voice is authorized, the app has to be restarted to use the feature.
Here is a basic example that demonstrates the issue on my iPhone (running 18.1 beta, but the issue was present at least in 18.0, maybe before):
import AVFoundation
import SwiftUI

struct TestView: View {
    let synthesizer = AVSpeechSynthesizer()
    @State private var personalVoices: [AVSpeechSynthesisVoice] = []

    var body: some View {
        VStack(spacing: 100) {
            Text("Personal Voices Available: \(personalVoices.count)")
            Button {
                speakUtterance(string: "Hello, world!")
            } label: {
                Image(systemName: "hand.wave.fill")
                    .font(.system(size: 100))
            }
            Button("Fetch Personal Voices") {
                Task { await fetchPersonalVoices() }
            }
        }
    }

    func fetchPersonalVoices() async {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization() { status in
            if status == .authorized {
                personalVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.voiceTraits.contains(.isPersonalVoice) }
            }
        }
    }

    func speakUtterance(string: String) {
        let utterance = AVSpeechUtterance(string: string)
        if let voice = personalVoices.first {
            utterance.voice = voice
        } else {
            utterance.voice = AVSpeechSynthesisVoice(language: Locale.preferredLanguages[0])
        }
        synthesizer.speak(utterance)
    }
}
If you tap the hand symbol first (before authorizing Personal Voice), you'll probably notice that the Personal Voices Available number never increases. If you authorize Personal Voice before tapping the hand symbol, it should speak using your Personal Voice as expected.
The example code is mostly taken directly from this WWDC23 video (Personal Voice info begins around the 10-minute mark).
Does anyone have any idea what could be causing this?
Note: Personal Voice can't be tested in the Simulator. To test, the code needs to run on a physical device that has Personal Voice set up.
Steps:
Connect to the Mac through SSH and execute "screencapture abc.jpg".
It shows the attached popup, even though we have added "sshd-keygen-wrapper" under Privacy & Security > Accessibility and under Privacy & Security > Screen & System Audio Recording in System Settings.
Ideally, we would not see this popup, since we have granted sufficient permissions to take the screenshot as described above. The same works fine on macOS 14 and earlier.
Even after clicking "Allow for one week", a new SSH session shows the popup again.
I want to turn on the grayscale mode of the iPhone from an iOS app.
Is it possible to do?
I'm trying to use this feature with VoiceOver, but each time I try it uses my system voice.
How can I fix this?
I sent a report to Apple.
My Feedback ID is FB15265988.