I needed to install the GoogleSignIn package for my iOS application, and I did so.
The issue is that the watchOS app inside the project stops working because of build failures caused by the GoogleSignIn package. I can't figure out how to fix this, because in the project settings the package appears in the list of Frameworks for the iOS target and is not present in the list of Frameworks for the watchOS target, as you can see in the first image below.
I've had a little personal utility running for several versions of macOS that uses
import CoreWLAN
let client = CWWiFiClient.shared()
if let ssid_name = client.interface()?.ssid() { print(ssid_name) }
to get the current SSID name and print it (along with a bunch of other active network details).
With the most recent Sonoma Beta 2 and Xcode beta 2, this always returns nil.
Doing the same thing in a playground works as expected.
Is this a purposeful change or a bug I should file?
We are working on a new iOS application utilizing the new iOS 17 APIs, and I have updated Xcode to Xcode 15 Beta, and my iPhone 12 Pro to iOS 17 Beta 2, though this issue was also present on iOS 17 Beta 1.
In Xcode, for "Signing and Capabilities" I have my Team set to my personal team, utilizing the "Automatically manage signing" tick.
While the app will build and install on my phone, I immediately receive this error, with no popup to trust the developer.
Going to Settings > General > VPN and Device Management, I can see my Development Team, and I am able to Trust my team.
When trying to then Verify App(s), it tells me it will use my internet connection to verify the application.
However, it will then do nothing, with no error, regardless of how many times I attempt to verify.
Trying to open the app from my Home Screen results in the repeated "Unable to Verify" error.
Trying to reset network settings does not result in any change in this behavior, nor does a reset of the phone.
I have tried 4 different high-quality Wi-Fi networks, as well as a fully connected AT&T cellular LTE connection, and still receive this error.
I am running out of diagnostic scenarios, and I'm curious if anyone has found a resolution to this?
I'm getting this error several times when presenting a modal window over my split view window, running on my Mac using Swift/Mac Catalyst in Xcode 14.2. When I click the Cancel button in the window, I get "Scene destruction request failed with error: (null)" right after an unwind segue.
2023-07-04 16:50:45.488538-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.488972-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.496702-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.496800-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.994147-0500 Recipes[27836:1295134] Unbalanced calls to begin/end appearance transitions for <UINavigationController: 0x7f7fdf068a00>.
bleep
2023-07-04 16:51:00.655233-0500 Recipes[27836:1297298] Scene destruction request failed with error: (null)
I don't quite understand what all this means. (The "bleep" was a debugging print statement I put in the unwind segue.) I'm working through Apple's Mac Catalyst tutorial, but it seems to be riddled with bugs and coding issues, even in the final part of the completed app, which I downloaded and ran. I don't see these problems in the iPad simulator.
I don't know if it's because Catalyst has problems itself or there's something else going on that I can fix myself. Any insight into these errors would be very much appreciated!
PS: The app seems to run OK on the Mac without crashing, despite the multiple issues.
Hi,
Is there any reasonable way to use MusicKit on the Vision Pro simulator? I have been unable to get anything working, and I was wondering whether the situation is the same as on iOS etc., where you need a physical device to test the app.
I implemented Sign in with Apple, but the button is always black in all cases. I would like to show it in light/dark mode depending on the phone settings.
This is my code:
import UIKit
import AuthenticationServices

class MyAuthorizationAppleIDButton: UIButton {
    private var authorizationButton: ASAuthorizationAppleIDButton!

    @IBInspectable
    var cornerRadius: CGFloat = 3.0
    @IBInspectable
    var authButtonType: Int = ASAuthorizationAppleIDButton.ButtonType.default.rawValue
    @IBInspectable
    var authButtonStyle: Int = ASAuthorizationAppleIDButton.Style.black.rawValue

    override public init(frame: CGRect) {
        super.init(frame: frame)
    }

    required public init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override public func draw(_ rect: CGRect) {
        super.draw(rect)
        // Create the ASAuthorizationAppleIDButton from the inspectable values
        let type = ASAuthorizationAppleIDButton.ButtonType(rawValue: authButtonType) ?? .default
        let style = ASAuthorizationAppleIDButton.Style(rawValue: authButtonStyle) ?? .black
        authorizationButton = ASAuthorizationAppleIDButton(authorizationButtonType: type, authorizationButtonStyle: style)
        authorizationButton.cornerRadius = cornerRadius
        // Show authorizationButton
        addSubview(authorizationButton)
        // Use Auto Layout to make authorizationButton follow this view's dimensions
        authorizationButton.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            authorizationButton.topAnchor.constraint(equalTo: topAnchor),
            authorizationButton.leadingAnchor.constraint(equalTo: leadingAnchor),
            authorizationButton.trailingAnchor.constraint(equalTo: trailingAnchor),
            authorizationButton.bottomAnchor.constraint(equalTo: bottomAnchor),
        ])
    }
}
So basically, with the code above I can set the button's style in the Storyboard, but even if I change the default value in my code, the result is based on what I chose in the Storyboard's inspectable variable.
Is there any solution that would let me show the button in light/dark mode depending on the phone settings?
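For what it's worth, the direction I've been considering looks like this (a sketch only, since ASAuthorizationAppleIDButton offers just the fixed .black, .white, and .whiteOutline styles): recreate the button whenever the system appearance changes.
override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
    super.traitCollectionDidChange(previousTraitCollection)
    // Rebuild only when the light/dark appearance actually changed
    guard traitCollection.hasDifferentColorAppearance(comparedTo: previousTraitCollection) else { return }
    let style: ASAuthorizationAppleIDButton.Style =
        traitCollection.userInterfaceStyle == .dark ? .white : .black
    authButtonStyle = style.rawValue
    authorizationButton?.removeFromSuperview()
    setNeedsDisplay()  // triggers draw(_:) above, which recreates the button
}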
It is stated that
From Fall 2023 you’ll receive an email from Apple if you upload an app to App Store Connect that uses required reason API without describing the reason in its privacy manifest file. From Spring 2024, apps that don’t describe their use of required reason API in their privacy manifest file won’t be accepted by App Store Connect.
There are some answers here: https://developer.apple.com/videos/play/wwdc2023/10060/ but they are far from answering all questions.
I have questions on how to implement this:
Where exactly is the privacy manifest? How do I create it, and from which file template in Xcode? WWDC speaks of a PrivacyInfo.xcprivacy file (does it require a more recent version of Xcode than 14.2?).
WWDC describes a framework case. Is it the same for a "final" app?
Is there a specific format for describing the reason, or is it just plain text?
Is this text visible to the user, or only to the reviewer?
Does it apply retroactively to apps already in the App Store (do they need to be resubmitted)? It seems not.
So I tried, in an iOS app, to declare the PrivacyInfo.xcprivacy as explained, with Xcode 14.2, using the plist template, to no avail.
It's really not clear how to proceed or even where to start… We would need a clear step-by-step tutorial with all prerequisites (the Xcode or macOS versions needed, for instance).
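From what I gather from the WWDC session, the manifest is an ordinary property list whose top-level NSPrivacyAccessedAPITypes array lists each required reason API category with its reason codes. A sketch of what I believe it should look like (the category and reason code below are illustrative examples from the session, not necessarily the ones a given app needs):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- One entry per required reason API category the app uses -->
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>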
Please submit a bug report (https://swift.org/contributing/#reporting-bugs) and include the crash backtrace.
Stack dump:
0. Program arguments: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/swift-frontend -frontend -c /Users/lawrey/Documents/GitHub/IPPTPLUS-IOS/IPPT/Common/Toast/Toast.swift /Users/lawrey/Documents/GitHub/IPPTPLUS-IOS/IPPT/Shop/CheckOutViewController.swift /Users/lawrey/Documents/GitHub/IPPTPLUS-IOS/IPPT/Common/UITextField/NLPaddedTextFieldView.swift
...
...
/IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/SettingsVC.o -index-unit-output-path /IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/DGElasticPullToRefreshConstants.o -index-unit-output-path /IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/DGElasticPullToRefreshView.o -index-unit-output-path /IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/CountDownRouterVC.o -index-unit-output-path /IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/DatePickerTableViewCell.o -index-unit-output-path /IPPT.build/Release-iphoneos/IPPT.build/Objects-normal/arm64/CircularProgress.o
1. Apple Swift version 5.8 (swiftlang-5.8.0.124.2 clang-1403.0.22.11.100)
2. Compiling with the current language version
3. While evaluating request ExecuteSILPipelineRequest(Run pipelines { PrepareOptimizationPasses, EarlyModulePasses, HighLevel,Function+EarlyLoopOpt, HighLevel,Module+StackPromote, MidLevel,Function, ClosureSpecialize, LowLevel,Function, LateLoopOpt, SIL Debug Info Generator } on SIL for IPPT)
4. While running pass #670343 SILModuleTransform "DeadFunctionAndGlobalElimination".
Stack dump without symbol names (ensure you have llvm-symbolizer in your PATH or set the environment var `LLVM_SYMBOLIZER_PATH` to point to it):
0 swift-frontend 0x0000000109b7f300 llvm::sys::PrintStackTrace(llvm::raw_ostream&, int) + 56
1 swift-frontend 0x0000000109b7e2e4 llvm::sys::RunSignalHandlers() + 112
2 swift-frontend 0x0000000109b7f910 SignalHandler(int) + 344
3 libsystem_platform.dylib 0x000000019418aa24 _sigtramp + 56
4 swift-frontend 0x00000001052bb65c (anonymous namespace)::DeadFunctionAndGlobalEliminationPass::run() + 424
5 swift-frontend 0x00000001052bb65c (anonymous namespace)::DeadFunctionAndGlobalEliminationPass::run() + 424
6 swift-frontend 0x00000001054214a0 swift::SILPassManager::executePassPipelinePlan(swift::SILPassPipelinePlan const&) + 15312
7 swift-frontend 0x0000000105442a04 swift::SimpleRequest<swift::ExecuteSILPipelineRequest, std::__1::tuple<> (swift::SILPipelineExecutionDescriptor), (swift::RequestFlags)1>::evaluateRequest(swift::ExecuteSILPipelineRequest const&, swift::Evaluator&) + 56
8 swift-frontend 0x0000000105428dd4 llvm::Expected<swift::ExecuteSILPipelineRequest::OutputType> swift::Evaluator::getResultUncached<swift::ExecuteSILPipelineRequest>(swift::ExecuteSILPipelineRequest const&) + 484
9 swift-frontend 0x000000010542b714 swift::runSILOptimizationPasses(swift::SILModule&) + 400
10 swift-frontend 0x0000000104c09c10 swift::CompilerInstance::performSILProcessing(swift::SILModule*) + 524
11 swift-frontend 0x0000000104a6d51c performCompileStepsPostSILGen(swift::CompilerInstance&, std::__1::unique_ptr<swift::SILModule, std::__1::default_delete<swift::SILModule> >, llvm::PointerUnion<swift::ModuleDecl*, swift::SourceFile*>, swift::PrimarySpecificPaths const&, int&, swift::FrontendObserver*) + 1040
12 swift-frontend 0x0000000104a70ab8 performCompile(swift::CompilerInstance&, int&, swift::FrontendObserver*) + 3288
13 swift-frontend 0x0000000104a6e944 swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 4308
14 swift-frontend 0x0000000104a3368c swift::mainEntry(int, char const**) + 4116
15 dyld 0x0000000193e03f28 start + 2236
Command SwiftCompile failed with a nonzero exit code
I need help understanding the root cause so I can resolve this. I'm overwhelmed by the log :(
Hello, I am trying to debug Swift concurrency code using the Swift Concurrency instrument in Instruments.
But in our project, which has been maintained for a long time, the Swift Concurrency instrument doesn't seem to work.
Task {
print("async code")
await someAsyncFunction()
}
When I run the Swift Concurrency instrument against the code above in our project, I can confirm the print works (via the stdout/stderr instrument), but there are no records in the Swift Concurrency instrument.
But in a small, simple sample project, I verified that the Swift Concurrency instrument works well.
Are there any settings I need to change in the legacy project to debug Swift Concurrency?
Recently, after updating iPad devices to 16.6, our devices give the following error on all of our enterprise applications:
Unable to verify app. An internet connection is required to verify the trust of the developer.
iOS version of iPad devices -> 16.6
Xcode version -> 14.2 (14C18)
Certificates and provisioning profiles are valid; we have cross-checked that, and it is occurring on only a few of our devices.
Below are the screenshots attached for info:
Is there any way to use metal-cpp in a Swift project? I have a platform layer written in Swift that handles window/view creation, event handling, etc. I've been trying to bridge this layer with my C++ layer as you normally would, using a pure C interface, but Metal instances that cross this boundary just don't seem to work.
e.g. Currently I initialize a CAMetalLayer for my NSView and set it as the view's layer. I've tried passing this Metal layer into my C++ code as a void* pointer through a C interface and then casting it to a CA::MetalLayer to be used. When this didn't work, I tried creating the CA::MetalLayer in C++ and passing it back to the Swift layer as a void* pointer, then binding it to a CAMetalLayer type. And of course, this didn't work either.
So are the options for metal-cpp only Objective-C or pure C++ (using AppKit.hpp)? Or am I missing something about how to integrate with Swift?
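For concreteness, the Swift side of the bridging I've been attempting looks roughly like this (a sketch; engine_set_layer is a placeholder for a function in my C interface, not a real API):
import QuartzCore

// Sketch: hand the CAMetalLayer to the C++ side as an opaque pointer,
// retaining it so ARC doesn't release it while C++ still holds the raw pointer.
// engine_set_layer is a hypothetical C function declared in the bridging header:
//   void engine_set_layer(void *layer);
let metalLayer = CAMetalLayer()
engine_set_layer(Unmanaged.passRetained(metalLayer).toOpaque())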
There is something I've been spending hours trying to solve, and I'm starting to get stuck.
On one of my app projects, I'm experiencing random fatal errors while accessing and updating SwiftData models; these do not occur when using CoreData.
This mainly happens when setting values on newly created entities: SwiftData randomly crashes the application with the following error: EXC_BAD_ACCESS (code=1, address=0x11).
I made sure to update the code to move operations to the background and to use a ModelActor. It works properly, but one time out of three it crashes with the error quoted above.
I also looked at previous posts and made sure to persist the context when creating new entities, before updating them.
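For reference, the ModelActor setup I'm using looks roughly like this (a simplified sketch; Item stands in for my real model types):
import SwiftData

// Placeholder model standing in for my real @Model types.
@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

@ModelActor
actor BackgroundDataHandler {
    // Create and persist an entity before doing any further updates on it.
    func createItem(named name: String) throws {
        let item = Item(name: name)
        modelContext.insert(item)
        try modelContext.save()
    }
}
It's then called from the UI as try await BackgroundDataHandler(modelContainer: container).createItem(named: "Test").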
The last information in my console before “(lldb)” is:
CoreData: debug: CoreData+CloudKit: -[NSCloudKitMirroringDelegate managedObjectContextSaved:](2996): <NSCloudKitMirroringDelegate: 0x2806342a0>: Observed context save: <NSPersistentStoreCoordinator: 0x281434c40> - <NSManagedObjectContext: 0x280450820>
CoreData: debug: CoreData+CloudKit: -[NSCloudKitMirroringDelegate remoteStoreDidChange:](3039): <NSCloudKitMirroringDelegate: 0x2806342a0>: Observed remote store notification: <NSPersistentStoreCoordinator: 0x281434c40> - FF2D0015-7121-4C30-9EE3-2A51A76C303B - <NSPersistentHistoryToken - {
"FF2D0015-7121-4C30-9EE3-2A51A76C303B" = 1316;
}> - file:///private/var/mobile/Containers/Shared/AppGroup/ED0B229A-F5BC-47B7-B7BC-88EEFB6E6CA8/Library/Application%20Support/default.store
CoreData: debug: CoreData+CloudKit: -[NSCloudKitMirroringDelegate managedObjectContextSaved:](2996): <NSCloudKitMirroringDelegate: 0x2806342a0>: Observed context save: <NSPersistentStoreCoordinator: 0x281434c40> - <NSManagedObjectContext: 0x280450820>
On the rare occasions where it provides more details, Xcode shows an issue in the @Model macro, as shown in the attached screenshot.
I was able to reproduce this issue on 4 different physical devices and in the simulator.
It also happens, less frequently, when trying to fetch a model entity from the SwiftData context. It's not specific to particular models, though; it can happen with any of them, randomly.
I experienced this random problem on the iOS 17.0 developer beta, the RC, and the official iOS 17.0 release; at the time with the Xcode 15 beta, and now with the latest stable Xcode.
Am I the only one having this kind of issue? Or is it a known issue with SwiftData?
Does anybody have an idea how I could solve this?
The Problem: The App Store listing of my app does not show all the languages the app is localised in:
The Question: How do I fix this? A potential reason I could imagine: the App Store looks at the localised files, and Xcode shows 0 localised files in English for my app, because English is my development language and all strings are written in English, like this:
Text("Reload", comment: "Menu action to reload feed")
I really don't want to translate every English word to English, like "Reload" = "Reload". What am I getting wrong here? How can I get the App Store to show both supported languages?
The current implementation:
My app supports two languages,
and I am using the new String Catalog for localising.
Both languages work while testing on device, and they are available to switch between in the app's section of the system Settings app:
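One thing I've been wondering (purely an assumption on my part, not something the docs told me to do): whether explicitly declaring the supported localisations in the Info.plist would make the listing complete, e.g.:
<key>CFBundleLocalizations</key>
<array>
    <string>en</string>
    <string>de</string><!-- placeholder for my second language -->
</array>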
Hi, I'm developing an app like Opal which uses the Screen Time API. The user should be able to create modes to activate later, like an alarm. There's a list of durations and mode names, and there should be selected apps, so that whenever the user turns the switch on, app restriction starts. How can I save the selected apps in Firebase Firestore and then get that data back to restrict the apps whenever the user turns the switch on? I've tried saving them as Strings, but I can't convert those back to ApplicationToken later to insert them into the Set.
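For concreteness, a sketch of the direction I've been trying (assuming the user's picks come from a FamilyActivitySelection, which is Codable):
import Foundation
import FamilyControls

// Sketch: since FamilyActivitySelection (and the tokens inside it) are Codable,
// encode the selection to Data and store it base64-encoded in Firestore,
// then decode it back before applying restrictions.
func encodeSelection(_ selection: FamilyActivitySelection) throws -> String {
    try JSONEncoder().encode(selection).base64EncodedString()
}

func decodeSelection(from base64: String) throws -> FamilyActivitySelection {
    guard let data = Data(base64Encoded: base64) else {
        throw CocoaError(.coderReadCorrupt)
    }
    return try JSONDecoder().decode(FamilyActivitySelection.self, from: data)
}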
Hi,
Are there math functions built into Swift, such as sin, cos, log, etc.?
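For example, I'd like to be able to write something like this (the values are just placeholders):
import Foundation

let angle = 45.0 * Double.pi / 180  // degrees to radians
let x = sin(angle)
let y = cos(angle)
let z = log(10.0)  // natural logarithm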
Kindest Regards
I am working on an app where I need to orient a custom view depending on the device heading. I am using ARKit and ARSCNView with the ARGeoTrackingConfiguration in order to overlay my custom view in real world geographic coordinates. I've got a lot of it working, but the heading of my custom view is off.
Once the ARSession gets an ARGeoTrackingStatus.State of .localized, I need to be able to get the device's heading (0-360) so that I can orient my view. I'm having trouble figuring out this missing piece. Any help is appreciated.
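For reference, a sketch of the kind of heading source I've been considering (plain Core Location running alongside the ARSession; I'm not sure this is the intended approach):
import CoreLocation

// Sketch: deliver the compass heading (0-360 degrees) via Core Location.
final class HeadingProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    var onHeading: ((CLLocationDirection) -> Void)?

    override init() {
        super.init()
        manager.delegate = self
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // trueHeading is relative to true north and is negative when invalid,
        // in which case fall back to magneticHeading.
        let heading = newHeading.trueHeading >= 0 ? newHeading.trueHeading : newHeading.magneticHeading
        onHeading?(heading)
    }
}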
Xcode 15 and Sonoma expose the property NSView.clipsToBounds. On Sonoma I need to set this property to true, and this is working well.
However, for reasons, I also need to compile my macOS project with Xcode 13 and Xcode 14. I am having difficulty figuring out the Swift #available and @available directives to allow this.
No matter how I wrap the self.clipsToBounds = true statement, on these earlier versions of Xcode I get a Swift error telling me that clipsToBounds cannot be found.
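One direction I've been experimenting with (an assumption on my part: gating at compile time on the Swift version, since #available alone is only a runtime check) looks like this:
#if compiler(>=5.9)  // assumption: Swift 5.9 corresponds to Xcode 15's toolchain
if #available(macOS 14.0, *) {
    self.clipsToBounds = true
}
#endif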
Any help would be appreciated.
On Sonoma, the new alert shows its actions horizontally when there are more than 8, and they go off the screen.
Is there any way to bring back the old alert?
let alert = UIAlertController(title: "Alert", message: "Message", preferredStyle: UIAlertController.Style.alert)
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.destructive, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.destructive, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.destructive, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.destructive, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.destructive, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
alert.addAction(UIAlertAction(title: "Click", style: UIAlertAction.Style.default, handler: nil))
self.present(alert, animated: true, completion: nil)
I'm trying to create an Xcode project with C++/Swift interoperability (introduced in Xcode 15) using the CMake Xcode generator. I'm able to invoke C++ code from Swift and vice versa, but when opening the project in Xcode, the build setting for 'C++ and Objective-C Interoperability' is still set to 'C/Objective-C', as in the image below. I want to set it to 'C++/Objective-C++'.
I'm using the below CMake for this:
.
.
add_library(cxx-support ./Sources/CxxSupport/Student1.cpp
./Sources/CxxSupport/Teacher.swift
)
# Include the directory to access the module map content
target_include_directories(cxx-support PUBLIC
${CMAKE_SOURCE_DIR}/Sources/CxxSupport)
target_compile_options(cxx-support PUBLIC
"$<$<COMPILE_LANGUAGE:Swift>:-cxx-interoperability-mode=default>"
)
.
.
I have also tried the below in CMake, but it didn't work:
#set(SWIFT_OBJC_INTEROP_MODE "objcxx" CACHE STRING "")
#target_compile_options(cxx-support PUBLIC
#"SWIFT_OBJC_INTEROP_MODE=objcxx")
Any help on how this can be achieved?
I am trying to implement Hand Tracking in a visionOS application I am developing. Initially, I encountered errors such as "Cannot find type 'ARAnchor' in scope." Upon investigation, I realized the error arose because "import ARKit" is required, which was already in the code.
Digging a bit deeper, I discovered that ARKit is not compatible with the simulator, or at least that's what I believe. My ultimate understanding is: "ARKit requires the specific hardware of an actual iOS device to operate, as it relies on features like the device's camera and motion sensors."
Having reached this point, I noticed in the "HappyBeam" sample application documentation for visionOS there was a check: #if !targetEnvironment(simulator). Upon researching, it turns out this compilation condition is used so that the application can compile both in the simulator (without ARKit functionality) and on a real device.
After applying #if !targetEnvironment(simulator), the app compiles, but now when I run it in the Simulator I receive a message in the main window stating: "AR not supported in Simulator."
My current question is: Can I simulate Hand Tracking in any way without using ARKit, or is it essential to run the application on a physical device?
And if Hand Tracking can be simulated in "Simulator", how is it done? I haven't been able to find a way to do it.