Hi everyone,
I have a MacBook Air M1 and the Mac runs fine.
For work, I need to run the new Sonoma operating system from an external drive.
Now here are the problems.
The operating system is installed correctly, so I don't think that is the problem.
If I attach the external disk to the Belkin hub, the new operating system starts without problems; if instead I attach the SSD directly to the MacBook Air's USB ports, the operating system does not start. So I am forced to use Sonoma only with the hub attached.
I thought I could reset the PRAM or the SMC, but according to the Apple forums, MacBooks with M1 processors have neither PRAM nor SMC reset procedures.
Is there anything I can do ?
Thank you all,
Vincent
Could not set environment: 150: Operation not permitted while System Integrity Protection is engaged
Hi.
After the recent updates of Ventura to 13.4 and Big Sur to 11.7.7,
"launchctl setenv" suddenly returns the following error:
Could not set environment: 150: Operation not permitted while System Integrity Protection is engaged
Is there any workaround to fix this?
I'm trying to do the Happy Beam demo. I was wondering if I could use my iPhone to handle hand tracking?
I have 3 immersive spaces, and I'm trying to "jump" between them. Whenever I go from a space to the next one I try to dismiss the current one by executing await dismissImmersiveSpace() and right after await openImmersiveSpace(value: id). This is being performed inside of a Task, run on the click of a button.
It seems like dismissImmersiveSpace returns before the actual space has been completely removed, so the next immersive space does not open.
On the other hand, if I add a manual waiting time between dismissing an immersive space and showing the next one, everything seems to work fine, which is why I suspect this is a lifecycle issue with dismissImmersiveSpace.
That being said, is there any way to listen to the actual state of the dismissed immersive space, so that I know when I can present the next one?
Is there any way around this without having to introduce a manual delay?
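A minimal sketch of the pattern being described, assuming a hypothetical space ID "SpaceB" registered as an ImmersiveSpace scene; checking the result of openImmersiveSpace is one hedged way to detect that the previous space has not finished tearing down, rather than a confirmed fix:
import SwiftUI

struct SpaceSwitcherView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Button("Jump to next space") {
            Task {
                await dismissImmersiveSpace()
                // Inspect the result instead of assuming the open succeeded.
                switch await openImmersiveSpace(id: "SpaceB") {
                case .opened:
                    break
                case .error, .userCancelled:
                    // The previous space may still be tearing down; retrying
                    // after the scene phase settles is one possible fallback.
                    break
                @unknown default:
                    break
                }
            }
        }
    }
}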
We activate our camera extension from the host application and wait for the user to allow access to it in System Settings. Once our host application receives the notification that the camera extension is ready to be used, we want to communicate with the extension.
When we enumerate AVCaptureDevices, or try to find the newly added device using CMIOObjectGetPropertyData for the property kCMIOHardwarePropertyDevices, our camera extension is not shown. Once we stop and restart the host application, the camera extension is shown as expected; the issue only happens right after activating the extension.
It looks like capture devices are not refreshed for the host application after the camera extension is activated and approved. Is there a way to force the system to refresh the cameras? Or any other ideas to make the extension immediately visible to the host application without relaunching it?
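One hedged workaround sketch: instead of enumerating once right after activation, observe device-connection notifications and re-run discovery when a device appears. The .externalUnknown device type is an assumption for an extension-provided camera (macOS 14 adds the non-deprecated .external):
import AVFoundation

final class CameraWatcher {
    private var observer: NSObjectProtocol?

    func start() {
        // Fires when a capture device (including extension-provided cameras)
        // becomes available to this process.
        observer = NotificationCenter.default.addObserver(
            forName: .AVCaptureDeviceWasConnected,
            object: nil,
            queue: .main
        ) { note in
            if let device = note.object as? AVCaptureDevice {
                print("Camera became available:", device.localizedName)
            }
        }
    }

    func currentCameras() -> [AVCaptureDevice] {
        AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
            mediaType: .video,
            position: .unspecified
        ).devices
    }
}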
I'm working on an app that has the following structure:
MyApp
-- MyWidgetExtension
-- MyWatchKitApp
---- MyWatchKitAppExtension
------ MyWatchKitAppWidgetExtension
Both MyWidgetExtension and MyWatchKitAppWidgetExtension were developed using a shared MyWidgetBundle defined as follows:
@main
struct MyWidgetBundle : WidgetBundle
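For reference, a minimal hedged sketch of what such a shared bundle can look like; the placeholder widget, provider, and entry types below are illustrative only, not the poster's actual code:
import SwiftUI
import WidgetKit

@main
struct MyWidgetBundle: WidgetBundle {
    var body: some Widget {
        PlaceholderWidget()
    }
}

// Placeholder widget so the sketch compiles on its own.
struct PlaceholderEntry: TimelineEntry {
    let date: Date
}

struct PlaceholderProvider: TimelineProvider {
    func placeholder(in context: Context) -> PlaceholderEntry {
        PlaceholderEntry(date: Date())
    }
    func getSnapshot(in context: Context, completion: @escaping (PlaceholderEntry) -> Void) {
        completion(PlaceholderEntry(date: Date()))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<PlaceholderEntry>) -> Void) {
        completion(Timeline(entries: [PlaceholderEntry(date: Date())], policy: .never))
    }
}

struct PlaceholderWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "PlaceholderWidget", provider: PlaceholderProvider()) { entry in
            Text(entry.date, style: .time)
        }
    }
}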
However, I'm running into an issue when attempting to run this on devices with iOS 14. I get an error stating "App extensions must define either NSExtensionMainStoryboard or NSExtensionPrincipalClass keys in the NSExtension dictionary in their Info.plist."
Interestingly, if I remove MyWatchKitAppWidgetExtension, the app installs just fine. But, if I add NSExtensionPrincipalClass or NSExtensionMainStoryboard, when I try to distribute the app to TestFlight, I receive an error stating "Unexpected key NSExtensionPrincipalClass found in extension Info.plist".
I'm at a loss as to how to resolve this issue. Does anyone have any suggestions or insights?
I'm getting a weird crash from the template VisionOS app (the one that appears upon creating a new VisionOS project). I went and modified the code to use two additional C++ files, one of which throws an exception upon a specific circumstance, and another one that catches and handles the exception.
If I attempt to run the app with the new code, instead of the code catching the exception, I get a SIGABRT signal, as if C++ exceptions were not enabled at project level.
The following gist contains a minimal example with the weird behavior:
https://gist.github.com/Izhido/100a92f45aaf8bacffe73893d6109077
Replace the contents of the template VisionOS project with these files, run the project, and press the "Do sum" button. Xcode 15 beta will report a SIGABRT signal at sumofnumbers_impl.cpp, line 8.
What am I missing here?
(Incidentally, the same code in a macOS project runs just fine - I can share the project upon request. Also, for some reason I cannot share screenshots or files in this forum - that's why I provided the gist.)
I would like to make the HoverEffectComponent visibly react, as shown in "Build spatial experiences with RealityKit" at around 19:50:
https://developer.apple.com/videos/play/wwdc2023/10080
Is this feature supported yet?
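For what it's worth, a hedged sketch of the setup generally needed for the hover highlight; the sphere mesh, material, and function name are placeholders, and the entity needs input-target and collision components alongside HoverEffectComponent:
import RealityKit

func makeHoverableEntity() -> Entity {
    let entity = ModelEntity(
        mesh: .generateSphere(radius: 0.1),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )
    // Hover effects only apply to entities the system can target.
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    entity.components.set(HoverEffectComponent())
    return entity
}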
FB12755685
I sent feedback through Feedback Assistant and wanted to elaborate more over here.
Our application configures and connects to wireless networks using CoreWLAN. We started seeing crashes while connecting on the latest beta versions of Sonoma. The crashes showed EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0), which led me to believe it might be a CPU-architecture issue because of the mention of i386, but that was completely wrong.
Turns out the error is due to:
*** CFRetain() called with NULL ***
CFRetain.cold.1
[CWInterface associateToEnterpriseNetwork:identity:username:password:error:] + 127
Sample code to consistently reproduce the crash on Sonoma:
CWInterface *interface = [CWInterface interfaceWithName:@"en0"];
NSError *scanError = nil;
NSSet<CWNetwork *> *testNetworks = [interface scanForNetworksWithName:@"SSIDName" error:&scanError];
CWNetwork *network = [testNetworks anyObject];
NSError *connectionError = nil;
// identityRef is the SecIdentityRef for the enterprise credentials, obtained elsewhere.
BOOL connected = [interface associateToEnterpriseNetwork:network identity:identityRef username:nil password:nil error:&connectionError]; // <-- crash here
The associateToEnterpriseNetwork: method expects a CWNetwork object. In the beta versions of macOS Sonoma, the CWNetwork objects returned from a scan have (null) in the ssid field, which results in CFRetain() being called on NULL when trying to associate to the network.
If we can detect a broadcasting "SSIDName" SSID and try to associate to it, we always crash on the last line.
The crash therefore seems to come from a difference in how CWNetwork objects are populated in Sonoma.
Sonoma:
<CWNetwork: 0x6000036cb590> [ssid=(null), bssid=(null), security=WPA2 Enterprise, rssi=-53, channel=<CWChannel: 0x6000036fce90> [channelNumber=1(2GHz), channelWidth={20MHz}], ibss=0]
Ventura:
<CWNetwork: 0x6000010ffa60> [ssid=Chris640, bssid=(null), security=WPA2 Enterprise, rssi=-45, channel=<CWChannel: 0x6000010ffca0> [channelNumber=1(2GHz), channelWidth={20MHz}], ibss=0]
Is my assumption that the crash is due to the (null) in the SSID field correct?
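One hedged mitigation sketch (in Swift for brevity): guard against a nil ssid before associating, and re-scan or surface an error instead of letting CFRetain hit NULL. The identity parameter is assumed to be supplied by the caller:
import CoreWLAN
import Security

func associateIfPossible(interface: CWInterface, network: CWNetwork,
                         identity: SecIdentity) throws {
    // On the Sonoma betas, scan results can come back with ssid == nil;
    // skip those instead of crashing inside associateToEnterpriseNetwork.
    guard network.ssid != nil else { return }
    try interface.associate(toEnterpriseNetwork: network,
                            identity: identity,
                            username: nil,
                            password: nil)
}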
In the session around 19:15, the ornament for the app is displayed with fade animation. How can I achieve this? I can control the visibility using
ornament(visibility:attachmentAnchor:contentAlignment:ornament:)
with .visible or .hidden, but no animation.
Adding .transition to the content view also has no effect.
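One hedged workaround sketch: keep the ornament itself visible and animate the opacity of its content instead. The showOrnament flag and the bottom anchor are assumptions for illustration:
import SwiftUI

struct ContentView: View {
    @State private var showOrnament = false

    var body: some View {
        Text("Main content")
            .ornament(attachmentAnchor: .scene(.bottom)) {
                Text("Ornament content")
                    .padding()
                    .glassBackgroundEffect()
                    // Fade the content rather than toggling the ornament's visibility.
                    .opacity(showOrnament ? 1 : 0)
                    .animation(.easeInOut(duration: 0.3), value: showOrnament)
            }
            .onTapGesture { showOrnament.toggle() }
    }
}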
In my app I get a UIImage for a PHAsset via PHImageManager.requestImage(for:targetSize:contentMode:options:resultHandler:). I directly display that image in a UIImageView that has preferredImageDynamicRange set to .high. The problem is I do not see the high dynamic range.
I see the HDRDemo23 sample code uses PhotosPicker to get a UIImage from Data through UIImageReader whose config enables prefersHighDynamicRange.
Is there a way to support HDR when using the Photos APIs to request display images?
And is there support for PHLivePhoto displayed in PHLivePhotoView retrieved via PHImageManager.requestLivePhoto?
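A hedged sketch of one possible path using iOS 17 APIs: request the asset's data rather than a decoded UIImage, then decode it with UIImageReader configured for high dynamic range, mirroring the HDRDemo23 PhotosPicker path. Whether this covers every asset type, or PHLivePhoto at all, remains the open question:
import Photos
import UIKit

func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else {
            completion(nil)
            return
        }
        // Decode with HDR preferred instead of using requestImage(for:...).
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}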
To be able to pair a Matter device on iOS, I need to install the Matter client developer profile on my iPhone, as indicated in the Apple documentation:
Adding Matter support to your ecosystem | Apple Developer Documentation
(To test your app on the iOS or macOS device that initiates the pairing, download the developer profile now, then install it.)
When I do that it works perfectly.
However, the documentation says that the profile is only needed for development. Yet when I use my app from the App Store (validated by Apple) and remove the profile, it doesn't work anymore.
What do I have to do to pair a Matter device on iPhone without the Matter client developer profile?
What is the correct way to increase sysv shared memory limits which survives a system reboot? Currently we create this file:
/Library/LaunchDaemons/com.gemtalksystems.shared-memory.plist
File attributes:
normg@oink>xattr -l com.gemtalksystems.shared-memory.plist
com.apple.provenance:
This used to work but no longer does.
Now I have to manually execute:
sysctl kern.sysv.shmmax=12884901888
after each reboot to increase the limit, which is not ideal. Is there a better way?
System info:
normg@oink>sw_vers
ProductName: macOS
ProductVersion: 13.5
BuildVersion: 22G74
normg@oink>uname -a
Darwin oink.gemtalksystems.com 22.6.0 Darwin Kernel Version 22.6.0: Wed Jul 5 22:17:35 PDT 2023; root:xnu-8796.141.3~6/RELEASE_ARM64_T8112 arm64
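For reference, a hedged sketch of a LaunchDaemon plist that applies the limit at boot via RunAtLoad; the label, sysctl path, and single shmmax value are assumptions based on the post, not the poster's original file:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.gemtalksystems.shared-memory</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/sbin/sysctl</string>
        <string>kern.sysv.shmmax=12884901888</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>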
Hello!
I tried to upload my app to App Store Connect with Xcode.
But when I try to distribute my app and upload it to App Store Connect, Xcode crashes and terminates with the log below:
Translated Report (Full Report Below)
Process: Xcode [3748]
Path: /Applications/Xcode-beta.app/Contents/MacOS/Xcode
Identifier: com.apple.dt.Xcode
Version: 15.0 (22237.2)
Build Info: IDEFrameworks-22237002000000000~5 (15A5209g)
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2023-08-05 20:59:13.8571 +0900
OS Version: macOS 14.0 (23A5301h)
Report Version: 12
Anonymous UUID: 7A1F0101-7EFB-9A4F-78F1-6B0EC0B15400
Time Awake Since Boot: 2100 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: archive info plist lock
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 6 Abort trap: 6
Terminating Process: Xcode [3748]
Application Specific Information:
com.apple.main-thread
abort() called
I tried:
Rebooting my Mac
Changing my bundle identifier
How can I solve this problem? Please help.
Hi.
I'm trying to develop a passkey app connected with a Webauthn server.
There is a problem in the process of creating the Attestation Object.
Since I am assigned port 8445, I have to serve the /.well-known/ directory on that port.
In the WebAuthn specification, the RPID should not include the port number.
(https://www.w3.org/TR/webauthn-2/#relying-party-identifier)
When initializing an ASAuthorizationPlatformPublicKeyCredentialProvider object, if I add the port number to the RP ID, the passkey UI works and an ASAuthorization object is returned.
But I don't get authenticated by the WebAuthn server, because the attestation object is generated with an RP ID that contains the port.
Is there any way to specify the port number to check the "well-known" directory? Or is it only possible on port 443?
I checked the post below, but there has been no further feedback, so I'm asking here.
(https://developer.apple.com/forums/thread/730028)
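For context, a hedged sketch of how the RP ID is typically passed; the "example.com" domain is a placeholder. The RP ID is just the registrable domain, with no scheme or port, while the port only appears in the URLs used to reach the server, and the associated well-known files are generally expected to be reachable on the default HTTPS port:
import AuthenticationServices
import Foundation

func makeRegistrationRequest(challenge: Data, userName: String, userID: Data)
    -> ASAuthorizationPlatformPublicKeyCredentialRegistrationRequest {
    // RP ID: domain only, no port, matching what the server reports as rp.id.
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    return provider.createCredentialRegistrationRequest(
        challenge: challenge, name: userName, userID: userID)
}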
Hi! I'm working on an iOS Safari extension that has a ServiceWorker. Lately we've noticed that this ServiceWorker seems to get killed seemingly at random, and there are no logs or crash reports to tell us what happened.
I'm hypothesizing that iOS might be shutting down Safari ServiceWorkers when the ProcessInfo.thermalState approaches .serious. I have circumstantial evidence that our ServiceWorker tends to get killed more often at higher levels of thermalState but can't yet say conclusively that this is the case. I can't find any direct evidence of this on internet searches either.
Is anyone able to shed light onto this topic? The specific symptoms are:
ServiceWorker stops, and the menu entry for its console window no longer appears on macOS Safari.
No crash logs via Xcode or Sentry, and no Console messages as far as we could tell (caveat: MobileSafari generates a LOT of messages! We might have missed it.)
If attached via debugger, the native part of our extension just disappears and the debugger loses connection with no error message.
ServiceWorker no longer works for the lifetime of the Safari process. Sometimes, when we kill Safari and restart, we can get the ServiceWorker back. This usually requires toggling our extension's "enabled" state in system settings.
In some cases, even killing/relaunching Safari and toggling the system setting doesn't bring our ServiceWorker back. I'm hypothesizing right now that this happens when the thermal state is high.
I've tried simulating a serious/critical thermal state in the Xcode Devices window, but couldn't repro the ServiceWorker problem. I don't know if that setting affects the whole system, though, or just our own apps.
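One hedged way to gather evidence for the thermal hypothesis: log thermal-state transitions from the extension's native side and correlate the timestamps with the moments the ServiceWorker disappears:
import Foundation

final class ThermalLogger {
    private var observer: NSObjectProtocol?

    func start() {
        observer = NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // 0 = nominal, 1 = fair, 2 = serious, 3 = critical
            let state = ProcessInfo.processInfo.thermalState
            NSLog("Thermal state changed: %ld", state.rawValue)
        }
    }
}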
Help appreciated!
Yuna
I'm facing crashes with a daemon I'm working on that has an Info.plist embedded within it. The plist has CFBundleVersion and CFBundleShortVersionString properties with appropriate values, but somehow macOS is not picking up the version when generating crash reports.
Crash reports just show ??? in the version field.
Here is the Info.plist within the binary:
I'm using macOS Monterey 12.2.1
I have updated my iPhone 14 Pro Max to iOS 17 beta 4 and since then I can't connect to our Wi-Fi network(s) anymore. A second device with the same iOS version works fine. The faulty device takes a while to open the Wi-Fi settings screen and doesn't show any Wi-Fi networks. Sometimes it shows nearby mobile devices as hotspots, but I can't connect to those either. In the console you see a kind of endless loop from the wifid process. After a while the device reboots by itself, and as long as you don't touch the Wi-Fi settings it keeps working, although the device is very slow whenever network activity is involved.
I already did a network reset and have since updated to beta 5. I did another network reset, but the issue remains.
Tell me what logging you need to be able to trace the issue.
I don't really intend to fully reset the device, since it is a beta device, and I want to see whether the issue can be solved by updates or by manually changing some settings.
Another strange behaviour: when I go into the list of my known networks, it is still populated even after a network reset, and when I manually delete an entry it stays there. That makes me think it could be related to the keychain.
If that is the case, how can I check it?
I want to draw a mesh generated by SceneReconstructionProvider with a material applied, but how should I draw the ShapeResource?
I want to give a material to the mesh of a recognized wall or object for presentation purposes.
print("Part of the scene has been updated: ", update.anchor)
Task(priority: .low) {
    // Generate a static collision shape from the updated mesh anchor.
    let shape = try await ShapeResource.generateStaticMesh(from: update.anchor)
    let entity = Entity()
    // What do I need to add here to render the mesh with a material?
    entity.components[CollisionComponent.self] = .init(shapes: [shape])
    entity.components[PhysicsBodyComponent.self] = .init(
        massProperties: .default,
        material: nil,
        mode: .static)
}
After installing the iOS 17 beta 6 image on my iPhone 14 Pro Max, the "Developer Mode" option disappeared from system settings. Now Xcode doesn't recognize my device because Developer Mode is off. Is there any way to turn it on?
The previous iOS 17 beta 3 image worked fine.