I am developing an iOS application and trying to get the connected Wi-Fi information, but I am only getting the status of the connection. I need more info, e.g., the SSID. Please guide me on how I can get all the Wi-Fi information.
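For reference, a minimal sketch of the kind of call involved, assuming iOS 14+, the "Access Wi-Fi Information" entitlement, and location authorization (without these, the API hands back nil rather than the SSID):

import NetworkExtension

// Minimal sketch: fetch details of the currently joined Wi-Fi network.
// Assumes the "Access Wi-Fi Information" entitlement and location permission;
// otherwise fetchCurrent returns nil.
func printCurrentWiFiInfo() {
    NEHotspotNetwork.fetchCurrent { network in
        guard let network = network else {
            print("No Wi-Fi information available (check entitlement/permissions)")
            return
        }
        print("SSID: \(network.ssid), BSSID: \(network.bssid)")
        print("Signal strength: \(network.signalStrength), secure: \(network.isSecure)")
    }
}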
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Symbolicating Error Log:
RCTFormatError (in MyApp) (RCTAssert.m:167)
-[RCTExceptionsManager reportFatal:stack:exceptionId:extraDataAsJSON:] (in MyApp) (RCTExceptionsManager.mm:82)
-[RCTExceptionsManager reportException:] (in MyApp) (RCTExceptionsManager.mm:154)
-[RCTModuleMethod invokeWithBridge:module:arguments:] (in MyApp) (RCTModuleMethod.mm:587)
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext) (in MyApp) (RCTNativeModule.mm:197)
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int) (in MyApp) (RCTNativeModule.mm:105)
main (in MyApp) (main.m:8)
FIRCLSTerminateHandler() (in MyApp) (FIRCLSException.mm:0)
FIRCLSMachExceptionServer (in MyApp) (FIRCLSMachException.c:168)
+[RCTCxxBridge runRunLoop] (in MyApp) (RCTCxxBridge.mm:0)
facebook::hermes::inspector::detail::SerialExecutor::runLoop() (in MyApp) (SerialExecutor.cpp:41)
void* std::__1::__thread_proxy[abi:v160006]<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct>>, void (*)(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>), std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>>>(void*) (in MyApp) (thread:299)
facebook::hermes::inspector::detail::SerialExecutor::runLoop() (in MyApp) (SerialExecutor.cpp:41)
void* std::__1::__thread_proxy[abi:v160006]<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct>>, void (*)(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>), std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>>>(void*) (in MyApp) (thread:299)
facebook::hermes::inspector::detail::SerialExecutor::runLoop() (in MyApp) (SerialExecutor.cpp:41)
void* std::__1::__thread_proxy[abi:v160006]<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct>>, void (*)(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>), std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>>>(void*) (in MyApp) (thread:299)
facebook::hermes::inspector::detail::SerialExecutor::runLoop() (in MyApp) (SerialExecutor.cpp:41)
void* std::__1::__thread_proxy[abi:v160006]<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct>>, void (*)(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>), std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::function<void ()>>>(void*) (in MyApp) (thread:299)
STDERR:
Replacing matches in sample/spindump report: 2 MyApp 0x10531419c 0x104f54000 + 3932572
Replacing matches in sample/spindump report: 3 MyApp 0x104f6bcd4 0x104f54000 + 97492
Replacing matches in sample/spindump report: 4 MyApp 0x104f6c4e8 0x104f54000 + 99560
Replacing matches in sample/spindump report: 8 MyApp 0x105343b3c 0x104f54000 + 4127548
Replacing matches in sample/spindump report: 9 MyApp 0x105345b8c 0x104f54000 + 4135820
Replacing matches in sample/spindump report: 10 MyApp 0x1053457dc 0x104f54000 + 4134876
Replacing matches in sample/spindump report: 10 MyApp 0x104f5a8bc 0x104f54000 + 26812
Replacing matches in sample/spindump report: 7 MyApp 0x104fd6820 0x104f54000 + 534560
Replacing matches in sample/spindump report: 4 MyApp 0x104fdf198 0x104f54000 + 569752
Replacing matches in sample/spindump report: 7 MyApp 0x105326414 0x104f54000 + 4006932
Replacing matches in sample/spindump report: 3 MyApp 0x1057c64d8 0x104f54000 + 8856792
Replacing matches in sample/spindump report: 4 MyApp 0x105787e08 0x104f54000 + 8601096
Replacing matches in sample/spindump report: 3 MyApp 0x1057c64d8 0x104f54000 + 8856792
Replacing matches in sample/spindump report: 4 MyApp 0x105787e08 0x104f54000 + 8601096
Replacing matches in sample/spindump report: 3 MyApp 0x1057c64d8 0x104f54000 + 8856792
Replacing matches in sample/spindump report: 4 MyApp 0x105787e08 0x104f54000 + 8601096
Replacing matches in sample/spindump report: 3 MyApp 0x1057c64d8 0x104f54000 + 8856792
Replacing matches in sample/spindump report: 4 MyApp
**Please advise. Thank you, and have a good day.**
Hi, sorry if the sentences are wrong, as I am using a translation function.
I am looking to create an AR application using Nearby Interaction and ARKit.
The devices to be used are an iPhone 15 Pro and an Apple Watch Series 9.
The iPhone implementation will measure the distance to the Apple Watch while simultaneously using AR to track images.
The Apple Watch implementation will only measure the distance to the iPhone.
Is it possible to implement this feature?
If you know of any way to do this, I would like to know.
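For reference, a rough sketch of the iPhone-side ranging, under the assumption that the Apple Watch's discovery token is exchanged out of band (for example via WatchConnectivity), since Nearby Interaction does not transport tokens itself; the ARKit image-tracking session would run separately:

import NearbyInteraction

// Rough sketch of the iPhone side: ranging to the watch with NISession.
// Assumes the peer's discovery token was already exchanged out of band
// (e.g. via WatchConnectivity); ARKit image tracking would run in its own ARSession.
final class PhoneRangingController: NSObject, NISessionDelegate {
    private var session: NISession?

    func startRanging(withPeerTokenData tokenData: Data) {
        guard let peerToken = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: NIDiscoveryToken.self, from: tokenData
        ) else { return }

        let session = NISession()
        session.delegate = self
        self.session = session
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called whenever the framework has a fresh distance estimate for the peer.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Distance to watch: \(distance) m")
            }
        }
    }
}

The watchOS side would mirror this with its own NISession and the iPhone's token; as far as I know, only distance (not direction) is reported on the watch.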
if UIImagePickerController.isSourceTypeAvailable(.camera) {
    let imagePicker = UIImagePickerController()
    imagePicker.delegate = self
    imagePicker.allowsEditing = false
    imagePicker.sourceType = .camera
    self.present(imagePicker, animated: true, completion: nil)
}
This code crashes on an M2 Mac (Designed for iPad) with the following exception:
<<<< FigCaptureCameraParameters >>>> Fig assert: "success" at bail (FigCaptureCameraParameters.m:249) - (err=0)
An uncaught exception was raised
*** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[0]
(
0 CoreFoundation 0x0000000180b02800 __exceptionPreprocess + 176
1 libobjc.A.dylib 0x00000001805f9eb4 objc_exception_throw + 60
2 CoreFoundation 0x0000000180a1a724 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:] + 728
3 CoreFoundation 0x0000000180a1a420 +[NSDictionary dictionaryWithObjects:forKeys:count:] + 52
4 AVFCapture 0x000000019de90374 -[AVCaptureFigVideoDevice _cameraInfo] + 200
5 AVFCapture 0x000000019de90278 -[AVCaptureFigVideoDevice updateStreamingDeviceHistory] + 36
6 AVFCapture 0x000000019deec8c0 -[AVCaptureSession _startFigCaptureSession] + 464
7 AVFCapture 0x000000019def0980 -[AVCaptureSession _buildAndRunGraph:] + 1936
8 AVFCapture 0x000000019deecc00 -[AVCaptureSession _setRunning:] + 120
9 AVFCapture 0x000000019deec46c -[AVCaptureSession startRunning] + 452
10 libRPAC.dylib 0x00000001051c9024 _replacement_AVCaptureSession_startRunning + 104
11 libdispatch.dylib 0x000000010509cf14 _dispatch_call_block_and_release + 32
12 libdispatch.dylib 0x000000010509eb4c _dispatch_client_callout + 20
13 libdispatch.dylib 0x00000001050a7cd8 _dispatch_lane_serial_drain + 864
14 libdispatch.dylib 0x00000001050a8dcc _dispatch_lane_invoke + 416
15 libdispatch.dylib 0x00000001050b877c _dispatch_root_queue_drain_deferred_wlh + 652
16 libdispatch.dylib 0x00000001050b7a54 _dispatch_workloop_worker_thread + 444
17 libsystem_pthread.dylib 0x0000000105147d9c _pthread_wqthread + 288
18 libsystem_pthread.dylib 0x000000010514fab4 start_wqthread + 8
)
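Not a confirmed fix for this assert, but as a defensive sketch you could skip the camera path when the iOS app is running on a Mac and request camera authorization before presenting (the helper name and fallback branch here are illustrative):

import UIKit
import AVFoundation

// Hedged workaround sketch, not a confirmed fix for the FigCaptureCameraParameters assert:
// skip the camera picker when the iOS app runs on a Mac ("Designed for iPad"),
// and request camera authorization before presenting.
func presentCameraIfPossible(
    from host: UIViewController & UIImagePickerControllerDelegate & UINavigationControllerDelegate
) {
    guard !ProcessInfo.processInfo.isiOSAppOnMac,
          UIImagePickerController.isSourceTypeAvailable(.camera) else {
        return // fall back to .photoLibrary or PHPickerViewController here
    }

    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return }
        DispatchQueue.main.async {
            let picker = UIImagePickerController()
            picker.delegate = host
            picker.allowsEditing = false
            picker.sourceType = .camera
            host.present(picker, animated: true)
        }
    }
}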
Hey there 👋,
My team and I have implemented support for WAC devices in our app using EAWiFiUnconfiguredAccessoryBrowser.
When the device we want to support is factory reset, we are able to find it, just as the OS is.
The device also has a "setup Wi-Fi" button, which starts the WAC process, and iOS (as well as macOS) is able to find it then. Unfortunately, in that case we are not able to find it using EAWiFiUnconfiguredAccessoryBrowser.
I could not find any restrictions on this in the documentation. Any clues why we are not able to detect it in that situation?
I isolated the problem in a sample and used the following implementation to test this:
import Foundation
import ExternalAccessory
import os
import Combine

class WACWatcher: NSObject {
    private let browser: EAWiFiUnconfiguredAccessoryBrowser
    private let logger = Logger(subsystem: "WACWatcher", category: "networking")

    @Published var accessories: [EAWiFiUnconfiguredAccessory] = []

    override init() {
        self.browser = EAWiFiUnconfiguredAccessoryBrowser()
        super.init()
        self.browser.delegate = self
    }

    func start() {
        browser.startSearchingForUnconfiguredAccessories(matching: nil)
    }

    func stop() {
        browser.stopSearchingForUnconfiguredAccessories()
    }
}

extension WACWatcher: EAWiFiUnconfiguredAccessoryBrowserDelegate {
    func accessoryBrowser(
        _ browser: EAWiFiUnconfiguredAccessoryBrowser,
        didUpdate state: EAWiFiUnconfiguredAccessoryBrowserState
    ) {
        switch state {
        case .wiFiUnavailable:
            logger.debug("WAC Browser state changed to wifiUnavailable")
        case .stopped:
            logger.debug("WAC Browser state changed to stopped")
        case .searching:
            logger.debug("WAC Browser state changed to searching")
        case .configuring:
            logger.debug("WAC Browser state changed to configuring")
        @unknown default:
            logger.debug("WAC Browser state changed to an unknown value")
        }
    }

    func accessoryBrowser(
        _ browser: EAWiFiUnconfiguredAccessoryBrowser,
        didFindUnconfiguredAccessories accessories: Set<EAWiFiUnconfiguredAccessory>
    ) {
        logger.info("WACWatcher found accessories: \(accessories)")
        self.accessories.append(contentsOf: accessories)
    }

    func accessoryBrowser(
        _ browser: EAWiFiUnconfiguredAccessoryBrowser,
        didRemoveUnconfiguredAccessories accessories: Set<EAWiFiUnconfiguredAccessory>
    ) {
        logger.info("WACWatcher removed accessories: \(accessories)")
        // Drop the removed accessories from the published list. (The original filter
        // compared the removed set against itself, which never removed anything.)
        self.accessories.removeAll { existing in
            accessories.contains(where: { $0.name == existing.name })
        }
    }

    func accessoryBrowser(
        _ browser: EAWiFiUnconfiguredAccessoryBrowser,
        didFinishConfiguringAccessory accessory: EAWiFiUnconfiguredAccessory,
        with status: EAWiFiUnconfiguredAccessoryConfigurationStatus
    ) {}
}
This WACWatcher gets used in a ViewModel and a view with a start/stop button and a list showing the device names, roughly like the sketch below. If you need to see the actual project, I can zip it and attach it to this post.
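For context, a hypothetical sketch of that wiring (names are illustrative, not the actual project code):

import SwiftUI
import Combine
import ExternalAccessory

// Hypothetical sketch of how the WACWatcher above could drive a simple list
// with start/stop buttons.
final class WACListViewModel: ObservableObject {
    @Published private(set) var names: [String] = []
    private let watcher = WACWatcher()

    init() {
        // Mirror the watcher's published accessories into a list of display names.
        watcher.$accessories
            .map { $0.map(\.name) }
            .receive(on: DispatchQueue.main)
            .assign(to: &$names)
    }

    func start() { watcher.start() }
    func stop() { watcher.stop() }
}

struct WACListView: View {
    @StateObject private var viewModel = WACListViewModel()

    var body: some View {
        VStack {
            HStack {
                Button("Start") { viewModel.start() }
                Button("Stop") { viewModel.stop() }
            }
            List(viewModel.names, id: \.self) { name in
                Text(name)
            }
        }
    }
}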
We were still unable to review your app as it crashed on launch. We have attached detailed crash logs to help troubleshoot this issue.
While we appreciate that your app is intended primarily for use on iPhone, in order to bring your app into compliance with App Store guidelines, all apps designed for use on iPhone must still be formatted correctly and behave properly when run on iPad.
Review device details:
Device type: iPad
OS version: iOS 17.2
crashlog-82A3D8C6-0136-4C68-A006-6A4AE49B1D1C.txt
crashlog-DBA3D1B0-5760-4B47-8FE7-DE44C570271A.txt
crashlog-C30814D5-13BC-40FE-92EC-89E736FA7F1B.txt
My app crashes only on the iPad mini (6th generation); it runs fine on all other iPads.
Hi there!
Having gone through the "Accessory Design Guidelines for Apple Devices" (Release 21) for the material requirements, I have a question about the permittivity requirement. On page 28 it says that "Materials with high dielectric (permittivity >5 F/m)" should be avoided. Does this mean RELATIVE permittivity? The absolute permittivity of most materials, including metals, is on the order of 10^-12 F/m, which is already much less than 1, so it would make more sense for the 5 to refer to relative permittivity rather than absolute permittivity. Otherwise, I am afraid most metals could also be used, as they would always be below 5 F/m. Or am I missing something? I appreciate all kinds of help. Cheers~
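For reference, the relation in question is ε = ε_r · ε_0 with ε_0 ≈ 8.854 × 10⁻¹² F/m, so a relative permittivity ε_r > 5 corresponds to an absolute permittivity above roughly 4.4 × 10⁻¹¹ F/m, while a literal 5 F/m would be about 5 × 10¹¹ times the vacuum value, far beyond any ordinary material.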
I am receiving an error message when trying to connect an Apple Watch to my phone. The error message reads:
Could not sign in
Operation throttled by server. Retry after 3599 seconds (other operations may be allowed)
Has anyone experienced this error message before, and does anyone know how to resolve it?
Apple Watch SE
Affected platform: apple.com and Apple Services
Affected area: Authentication Bypass
What is required to reproduce the issue?
A thief steals your cell phone and also obtains your PIN against your will.
Requirements:
Enhanced Customer Support Protocols:
Establish a dedicated support channel for victims of theft and identity-related incidents, ensuring expedited and specialized assistance.
Implement stringent identity verification measures to safeguard user accounts during recovery processes.
Geolocation Data Preservation:
Modify policies to retain geolocation data even if "Find My iPhone" is disabled at the time of the incident, recognizing that initial settings may differ from the situation at the time of theft.
Wait 24 or 48 hours, or even a week, after a request to turn off "Find My iPhone" before actually turning it off, giving the victim time to act accordingly.
Strengthening 2FA Security:
Introduce additional layers of authentication for sensitive actions like changing passwords, disabling "Find My iPhone," or making significant account alterations.
Improved Reporting and Tracking:
Enhance reporting mechanisms for stolen devices, ensuring that user information is promptly relayed to relevant authorities. Use alternative ways to locate the phone if the theft is recognized by the authorities.
Enable a comprehensive tracking system for devices, allowing for efficient cooperation with law enforcement agencies.
Summary:
This feedback report addresses the critical need for Apple to enhance its support mechanisms for victims of stolen devices and identity theft. The proposed changes aim to provide users with swift and effective assistance during distressing situations while reinforcing the security of Apple accounts.
Steps to Reproduce:
User reports a stolen device, including relevant details such as the incident's time, location, and any available information about the perpetrator.
Apple's dedicated support channel verifies the user's identity through robust authentication procedures.
Apple collaborates with law enforcement, utilizing geolocation data and other available information to aid in the recovery of the stolen device and prevent unauthorized access to the user's accounts.
Additional authentication steps are implemented to thwart unauthorized changes to the account, especially in cases involving potential identity theft.
Expected Results:
Users in distress due to stolen devices or identity-related incidents receive prompt, empathetic, and specialized assistance from Apple's support team. Improved security measures deter unauthorized access and changes to user accounts.
Actual Results:
Current support mechanisms fall short in adequately addressing the needs of users facing theft and identity-related issues. Instances such as disabling "Find My iPhone" should not undermine Apple's commitment to user security.
This feedback report aims to underscore the urgency of implementing reforms that align with Apple's commitment to user security, empathy, and customer satisfaction. We trust that these proposed changes will contribute to a more secure and supportive environment for Apple users facing distressing situations. Your attention to this matter is greatly appreciated.
Sincerely,
Antonio, IT Engineer.
Hi,
Is it technically possible to stream an external BT device's microphone input to the iPhone's speaker output?
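A hedged sketch of what one might try (not a confirmed answer): configure the audio session so Bluetooth HFP input is allowed while output defaults to the built-in speaker, then pass the input through an AVAudioEngine. Whether iOS actually grants that route depends on the accessory and the current audio routes:

import AVFoundation

// Hedged sketch: allow a Bluetooth HFP microphone as input while forcing the
// built-in speaker as output, then pass the input straight through an engine.
func startBluetoothMicToSpeaker() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)

    let engine = AVAudioEngine()
    let input = engine.inputNode
    // Route whatever the active input is (the BT mic, if selected) to the main mixer,
    // which feeds the output node (the speaker, given .defaultToSpeaker).
    engine.connect(input, to: engine.mainMixerNode, format: input.outputFormat(forBus: 0))
    try engine.start()
    return engine // keep a strong reference to this for as long as audio should run
}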
Recently, I hid 800+ photos. Now I'm trying to put them back in an album, but every time I attempt it, they don't go back into the album. How do I fix this problem?
P.S. I was just watching Family Guy after not watching it for a month. This show is so good. I might shed a tear.
P.P.S. iPhone 13
I'm in desperate need of help here...
I updated my 2020 MacBook Air from Big Sur to Sonoma, and in the process it seems to have deleted hundreds of files from folders on my desktop. These files were seemingly stored in iCloud, as from memory they had the cloud icon beside them and I would often have to re-download them before use. After updating, the folders are still on my desktop, but the contents within them are entirely gone.
I must also mention that a new folder called 'Relocated Items' has popped up on my desktop, but it does not contain anything except for folders within folders.
A quick look at online suggestions, such as using the Finder application to locate them in iCloud Drive, was unsuccessful, as they are not visible there either. I don't have a Time Machine backup of these files, as they were seemingly being stored in iCloud prior to this update.
I can't stress enough how important it is that I recover these files as they are all related to my work and contain weeks upon weeks of labour. Any words of wisdom would be much appreciated!!
This issue has happened to me on 3 separate occasions now over the last few months. My phone will, out of the blue, freeze/glitch on the lock screen, and swiping up will not work properly. My lock screen and notification screen will mix together and glitch out. I am sometimes able to swipe up and unlock the phone; however, I will be stuck on the last app used and only be able to use that app. I cannot swipe up onto any home screen for other apps. My keyboard will also not appear, and my Control Center will not work properly, i.e. I cannot go on the camera, calculator, or screen record, turn off Low Power Mode, etc. Finally, my battery percentage will be frozen at the top of my screen.

The first time this happened I removed my screen protector, but this didn't work; it eventually fixed itself after about an hour. The second time this happened, it did not fix itself for a few hours. My phone is currently having the same issue and has not fixed itself after 2 hours. I cannot restart the phone, as the power-off option won't appear, and when I press the volume buttons quickly and hold the power button, nothing happens. I also receive 0 notifications when this happens, but when it fixes itself the battery % will display correctly and all my notifications will come through. Would love some help on how to fix this when it happens.
Post the iOS 17 version update on Dec 12, we are facing a weird issue. When accessing the https://infinitiusa.com/ website from the Chrome browser or the Facebook app, there is a dummy overlay with a 0-second video player (overlaying the full screen as blank). On closing the video player, we can see the website has loaded fine.
We cross-checked a few other websites and found that the websites using Modernizr (https://cdnjs.com/libraries/modernizr) have the issue post the iOS update.
Has any similar issue been reported?
Hello,
We sometimes get bad values when reading the absolute altitude on the Watch. I'm running watchOS 10.2, and it seems to be a new issue/bug.
When I print out the debugDescription of the received data, I get a statusInfo field at 0 when it works, and at 2 when it does not. But I didn't find any explanation of the value, nor how to access it.
// OK: 0
Optional(AbsoluteAltitude: 223.266052, Accuracy: 6.944067, Precision 0.500000, statusInfo: 0, timestamp :726142069.230969)
// NOT OK: 2
Optional(AbsoluteAltitude: 121.610824, Accuracy: 3.156758, Precision 5.000000, statusInfo: 2, timestamp :726152607.239057)
Has someone experienced similar issues, or does anyone know what this field is for?
Thanks and greetings!
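In case it helps to compare notes, a sketch of reading absolute altitude and discarding low-confidence samples. CMAbsoluteAltitudeData only exposes altitude, accuracy and precision publicly; the statusInfo value in debugDescription does not appear to be part of the public API, so the precision threshold below is an arbitrary stand-in for that check:

import CoreMotion

// Sketch: read absolute altitude and discard samples with very coarse precision,
// similar in spirit to the statusInfo == 2 case above (threshold is a guess).
final class AltitudeReader {
    private let altimeter = CMAltimeter()

    func start() {
        guard CMAltimeter.isAbsoluteAltitudeAvailable() else { return }
        altimeter.startAbsoluteAltitudeUpdates(to: .main) { data, error in
            guard let data = data, error == nil else { return }
            if data.precision > 1.0 {
                print("Discarding low-confidence altitude sample: \(data)")
                return
            }
            print("Altitude: \(data.altitude) m ± \(data.accuracy) m")
        }
    }

    func stop() {
        altimeter.stopAbsoluteAltitudeUpdates()
    }
}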
The printer now shows as a question mark in the menu bar. I updated the Canon driver I thought I needed.
Still nothing.
The printer is in Settings but is not visible or can't be selected. Help!!!
I have had this issue for many, many months and decided to reach out here to see if anybody else has had and fixed this problem. I can complete the purchase on the Mac. I can make a full-price purchase using the Apple Card in the Apple Store app.
I do not have Wi-Fi at home in Japan because my 2012 iMac is too old for Japanese 2023/24 routers, and NTT docomo does not support old Macs or Apple devices. Even SoftBank has cellular 5G call capture that goes into an encrypted network and locks it, bypassing my login password. This is criminal, but foreigners do not have any rights under Japanese law, and even the Japan Apple call center sends me to English support.

I also wish to say happy new year, and congratulations that Apple Developer is into apps, though my 2012 iMac on Catalina may not get full use of them. I am still learning the Ainu language, and I have heard that Ainu language researchers in Japan are developing localization: a bilingual iOS/macOS interface for Ainu and dictionary predictive text for iPhone (Huawei too, for smartphones), with Tungus and Siberian language page support patterned with AI (my "AI NU": nu = write, see, hear in Ainu). I am interested in using AI to make a voice model for Ainu, to teach and learn spoken Ainu, and to develop iOS/macOS support for the Ainu language and for Nivkh of Sakhalin, and to code in Ainu. In the future, AI will be able to write bilingual Ainu instruction code, so I could make my own OS in the Ainu language, along with a dictionary of computer terminology in Ainu and bilingual Ainu predictive text translated with voice, so I can write to you in Ainu and it will show an English translation word for word. This would be used in a bilingual Ainu predictive chat app and a dictionary and grammar, with 100 percent Ainu computer and software terminology.

Not only me, but millions of people with minority languages are demanding that devices be designed to support all languages that have ever existed, and computer-speed AI can do the mammoth task of collecting word data to develop computer terms in those languages, with users able to edit their own words, just as we can now name app titles, folders, and text files in Ainu. AI and quantum computing can do this. Huawei is doing this; they see that many people in the third world have no access to AI, power, Wi-Fi, or cell phones, and China is going to get into this market, because the West just wants to keep it for itself and choose its own languages. China will be able to serve this market, because Chinese is spoken by many and they are satisfied, and they never impose the Chinese language onto others. Chinese hanzi, 10,000 years old, can be used in code and in modern computing and AI; it is a writing system suited to computing, as though they knew of computers 10,000 years ago.

Apple had better serve the world in its languages before someone else gets there. There is an Ainu icon in the keyboard preferences, but it is useless because it renders roman input into katakana, and it destroys Ainu. I am not saying that Latin is the best, until we develop a script to fit the Ainu voice; AI can do that through phonetic voice recognition. Siri cannot even teach me Finnish when I ask in Finnish!
I am having a similar issue with charging on my new MBP M2. The computer charged a couple of times, and now on the fourth battery cycle it charged from approximately <25% to 50%, then quit charging. Blinking amber LED on the USB-C-to-MagSafe (T) cable connection with the 140W power adapter. Running Sonoma 14.2.1. For the first three battery cycles, it charged to 100% every time.
I will update this post with more information once I test adapter, cable and different charging scheme (to test computer)...
Hi,
We want to connect the iPhone 15 to the Accessory Test System (ATS) for traffic capture. The iPhone 15 is a USB-C iOS device. In the ATS User Guide, the iAP-over-USB (USB-C) setup section mentions:
For iAP-over-USB accessories which can connect to USB-C iOS devices, you only need to connect a Beagle USB 480 analyzer.
The Beagle USB 480 analyzer USB-A port should be connected to the USB device, the USB-B port should be connected to the USB host. Connect the equipment as shown.
We use the following setup with Beagle USB 480 analyzer:
[Beagle: USB-B port] <-- USB-B-to-TypeC cable --> [iPhone]
[Beagle: USB-A port] <-- USB-A-to-USB-A cable --> [Accessory]
But the Beagle USB analyzer in the ATS app cannot find the attached iPhone.
Is the above setup correct? How do we connect the iPhone 15 to the Beagle USB 480 analyzer?
Does anyone know the correct iAP-over-USB (USB-C) setup? Thanks for the help.