Explore the power of machine learning within apps. Discuss integrating machine learning features, share best practices, and explore the possibilities for your app.

Post

Replies

Boosts

Views

Activity

Handling App Intents Behind Authentication/Paywall
My app has several resources that I'd like to spring open through App Intents, for example a series of dictionaries. In the app, however, these resources are behind a login (for security) and are entitlements that are purchased; a user may own, say, 4 of 7 dictionaries. If I want to have an intent that says, "Open Dictionary: (Dict Name)", how do I best handle situations where the user may no longer be logged in, or may not have the entitlement for that specific dictionary? Thanks
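One way this is sometimes handled (a minimal sketch, not code from the app in question; OpenDictionaryIntent, DictionaryEntity, AuthSession, Entitlements, and AppNavigator are hypothetical placeholder names) is to verify login state and the purchased entitlement at the top of perform() and throw a descriptive error, so Shortcuts/Siri can tell the user why the dictionary didn't open:

import AppIntents
import Foundation

enum DictionaryIntentError: Error, CustomLocalizedStringResourceConvertible {
    case notLoggedIn
    case notPurchased(String)

    var localizedStringResource: LocalizedStringResource {
        switch self {
        case .notLoggedIn:
            return "Please sign in to the app before opening a dictionary."
        case .notPurchased(let name):
            return "You haven't purchased \(name) yet."
        }
    }
}

struct OpenDictionaryIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Dictionary"
    // Open the app so the user actually lands in the dictionary UI after the intent runs.
    static var openAppWhenRun: Bool = true

    // DictionaryEntity stands in for the app's existing AppEntity for a dictionary.
    @Parameter(title: "Dictionary")
    var dictionary: DictionaryEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // AuthSession and Entitlements are hypothetical stand-ins for the app's own
        // login-session and purchase-tracking objects.
        guard AuthSession.shared.isLoggedIn else {
            throw DictionaryIntentError.notLoggedIn
        }
        guard Entitlements.shared.owns(dictionary.id) else {
            throw DictionaryIntentError.notPurchased(dictionary.name)
        }
        // AppNavigator is a hypothetical router that shows the dictionary in the app.
        AppNavigator.shared.open(dictionaryID: dictionary.id)
        return .result()
    }
}

With openAppWhenRun set, a successful run still lands the user inside the app, so the normal in-app session checks remain the source of truth.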
1
0
452
Jun ’24
Shortcuts App Intent Only for Active Subscribers
I have a Shortcuts action via an App Intent that I want only active subscribers to be able to use. I have a shared class that handles all the subscription-related things. But for some reason my code only works if the app is active in the background. Once the app is quit and the user performs the Shortcut, the "not subscribed" error is thrown, even though the user is subscribed. How can I ensure that my subscription check is done correctly if the app isn't open in the background?

My Code

App Intent excerpt:

@MainActor
func perform() async throws -> some IntentResult & ReturnsValue<MeterIntentEntity> {
    // Validate that the user is subscribed.
    // Cancels action with error message if not subscribed.
    if SubscriptionManager.shared.userIsSubscribed == false {
        throw IntentError.notSubscribed
    }

    // More Code …

    // Finish and pass created value as result.
    return .result(value: something)
}

Subscription Manager excerpt:

class SubscriptionManager: ObservableObject {
    // A singleton for our entire app to use
    static let shared = SubscriptionManager()

    let productIds = ["my_sub1", "my_sub2"]

    @Published private(set) var availableSubscriptions: [Product]
    @Published private(set) var purchasedSubscriptions: [Product] = []

    public var userIsSubscribed: Bool {
        return !self.purchasedSubscriptions.isEmpty
    }

    init() {
        // Initialize empty products, and then do a product request asynchronously to fill them in.
        availableSubscriptions = []
        Task {
            await updatePurchasedProducts()
        }
    }

    @MainActor
    func updatePurchasedProducts() async {
        for await result in Transaction.currentEntitlements {
            do {
                let transaction = try checkVerified(result)
                if let subscription = availableSubscriptions.first(where: { $0.id == transaction.productID }) {
                    purchasedSubscriptions.append(subscription)
                }
            } catch {
                Logger.subscription.error("Error loading user's purchased products.")
            }
        }
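One possible cause (a guess, not a confirmed diagnosis): when the Shortcut launches the app process just to run the intent, perform() can execute before the singleton's async init task has finished, so purchasedSubscriptions is still empty; updatePurchasedProducts() also only appends products that already exist in availableSubscriptions, which is empty at that point. A minimal sketch of an alternative is to await the StoreKit entitlements directly inside the intent (MeterIntentEntity and IntentError are the types from the post; the product IDs are the ones shown above):

import StoreKit

// Checks the user's current entitlements directly, without relying on a singleton
// that may not have finished loading when the intent runs in a fresh process.
func hasActiveSubscription(productIDs: Set<String>) async -> Bool {
    for await result in Transaction.currentEntitlements {
        // Only trust transactions that pass StoreKit's verification.
        if case .verified(let transaction) = result,
           productIDs.contains(transaction.productID) {
            return true
        }
    }
    return false
}

perform() could then start with guard await hasActiveSubscription(productIDs: ["my_sub1", "my_sub2"]) else { throw IntentError.notSubscribed } before doing the rest of its work.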
1
0
567
Jun ’24
Hundreds of AI models mining and indexing data on macOS.
Hi, this is the third time I'm trying to post this on the forum; the Apple moderators keep ignoring it. I'm a deep learning expert with a specialization in image processing. I want to know why I have hundreds of AI models on my Mac that are indexing everything on my computer while it is idle, using programs like neuralhash that I can't find any information about. I can understand if they are being used to enhance the user experience in Spotlight, Siri, Photos, and other applications, but I couldn't find the necessary information on the web. Usually, (spyware) software like this uses them to classify files in an X/Y coordinate system. This feels like a more advanced version of Stuxnet.

find / -type f -name "*.weights" > ai_models.txt
find / -type f -name "*labels*.txt" > ai_model_labels.txt

Some of the classes from the files:

file_name: SCL_v0.3.1_9c7zcipfrc_558001-labels-v3.txt
document_boarding_pass
document_check_or_checkbook
document_currency_or_bill
document_driving_license
document_office_badge
document_passport
document_receipt
document_social_security_number
hier_curation
hier_document
hier_negative
curation_meme

file_name: SceneNet5_detection_labels-v8d.txt
CVML_UNKNOWN_999999
aircraft
automobile
bicycle
bird
bottle
bus
canine
consumer_electronics
feline
fruit
furniture
headgear
kite
fish
computer_monitor
motorcycle
musical_instrument
document
people
food
sign
watersport
train
ungulates
watercraft
flower
appliance
sports_equipment
tool
4
2
1.6k
Jun ’24
AppIntentVocabulary (INPlayMediaIntent) is unstable.
I am developing an iOS app that supports INPlayMediaIntent. We are trying to increase the recognition rate of content names, which are song titles, using AppIntentVocabulary. As a sample, some extracts are shown below.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>IntentPhrases</key>
    <array>
        <dict>
            <key>IntentName</key>
            <string>INPlayMediaIntent</string>
            <key>IntentExamples</key>
            <array>
                <string>Mezamashi Appで湖畔の朝を再生</string>
                <string>湖畔の朝をMezamashi Appで再生して</string>
            </array>
        </dict>
    </array>
    <key>ParameterVocabularies</key>
    <array>
        <dict>
            <key>ParameterNames</key>
            <array>
                <string>INPlayMediaIntent.playlistTitle</string>
            </array>
            <key>ParameterVocabulary</key>
            <array>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID1</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>aogamagaeru</string>
                            <key>VocabularyItemPhrase</key>
                            <string>青ガマガエル</string>
                        </dict>
                    </array>
                </dict>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID2</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>kohon no asa</string>
                            <key>VocabularyItemPhrase</key>
                            <string>湖畔の朝</string>
                        </dict>
                    </array>
                </dict>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID3</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>kumageratachi no uta</string>
                            <key>VocabularyItemPhrase</key>
                            <string>クマゲラたちの歌</string>
                        </dict>
                    </array>
                </dict>
            </array>
        </dict>
    </array>
</dict>
</plist>

When running on the iOS 17.5 simulator in Xcode 15.4, the results are as follows.

mediaName = VocabularyItemIdentifier
mediaIdentifier = nil

<INMediaSearch: 0x6000026212c0> {
    reference = 0;
    mediaType = 0;
    sortOrder = 0;
    albumName = <null>;
    mediaName = ID1;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}

However, when running on an iOS 17.5 device, the following applies.

mediaName = VocabularyItemPhrase
mediaIdentifier = VocabularyItemIdentifier

<INMediaSearch: 0x301efd9e0> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = 青ガマガエル;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = ID1;
}

The results are not stable, for example, sometimes everything else returns null. I have tried everything, but it is just taking a long time. Does anyone have any advice on this?
1
0
493
Jun ’24
How to add support for Siri / Apple Intelligence to my existing AppEntity?
iOS 18 adds a specific macro for exposing your search app intent, app entities, etc. to Siri, but how are you meant to adopt it on your existing objects without removing them entirely for users on versions earlier than iOS 18? For example, I get the following error:

AssistantIntent(schema:) is only available in iOS 18 or newer. Add @available attribute to enclosing struct.

I don't want to do that, since I still want to support iOS 17 users with my existing shortcuts. Do I need to duplicate my entire shortcuts model to add the new macro?
1
0
984
Jun ’24
Siri enters loop of requesting parameter when running AppIntent
I want to add Shortcuts and Siri support using the new AppIntents framework. Running my intent from Shortcuts or from Spotlight works fine, as the touch-based UI for the disambiguation is shown. However, when I ask Siri to perform this action, she gets into a loop of asking me the question to set the parameter.

My AppIntent is implemented as follows:

struct StartSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "start_recording"

    @Parameter(title: "activity", requestValueDialog: IntentDialog("which_activity"))
    var activity: ActivityEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        let activityToSelect: ActivityEntity = self.activity
        guard let selectedActivity = Activity[activityToSelect.name] else {
            return .result(dialog: "activity_not_found")
        }
        ...
        return .result(dialog: "recording_started \(selectedActivity.name.localized())")
    }
}

The ActivityEntity is implemented like this:

struct ActivityEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "activity")

    typealias DefaultQuery = MobilityActivityQuery
    static var defaultQuery: MobilityActivityQuery = MobilityActivityQuery()

    var id: String
    var name: String
    var icon: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(self.name.localized())", image: .init(systemName: self.icon))
    }
}

struct MobilityActivityQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [ActivityEntity] {
        Activity.all()?.compactMap({ activity in
            identifiers.contains(where: { $0 == activity.name }) ? ActivityEntity(id: activity.name, name: activity.name, icon: activity.icon) : nil
        }) ?? []
    }

    func suggestedEntities() async throws -> [ActivityEntity] {
        Activity.all()?.compactMap({ activity in
            ActivityEntity(id: activity.name, name: activity.name, icon: activity.icon)
        }) ?? []
    }
}

Does anyone have an idea what might be causing this and how I can fix this behavior? Thanks in advance.
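One thing that sometimes helps here (a hedged suggestion, not a confirmed fix): when the parameter is filled by voice, Siri has to map the spoken phrase to an ActivityEntity, and adopting EntityStringQuery on the existing query type gives it an explicit string-matching path. A sketch, reusing the types from the post:

import AppIntents

// The existing entities(for:) and suggestedEntities() implementations stay as they are;
// EntityStringQuery refines EntityQuery and only adds the string-matching requirement.
extension MobilityActivityQuery: EntityStringQuery {
    func entities(matching string: String) async throws -> [ActivityEntity] {
        Activity.all()?.compactMap { activity in
            activity.name.localizedCaseInsensitiveContains(string)
                ? ActivityEntity(id: activity.name, name: activity.name, icon: activity.icon)
                : nil
        } ?? []
    }
}

This is a guess based on the behavior described; logging inside suggestedEntities() and entities(for:) when invoking via voice should also show which resolution path Siri is actually taking.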
3
3
1.4k
Jun ’23
Image Playground
Are there going to be any sessions on the Image Playground API for iOS? "Explore machine learning on Apple platforms" mentions the writing features and points to sessions, but only mentions Image Playground without pointing to any sessions.
0
6
671
Jun ’24
"Error: Intent of type INStartCallIntent is not supported for this app category"
I am trying to make a VoIP CarPlay app using Siri.

let assistant = CPAssistantCellConfiguration(position: .top, visibility: .always, assistantAction: .startCall)
let siriTmeplate = CPListTemplate(title: "Siri", sections: [sectionItems, loadingSection], assistantCellConfiguration: assistant)
siriTmeplate.tabSystemItem = .recents
siriTmeplate.showsTabBadge = false

Using the above code gives me the error "Error: Intent of type INStartCallIntent is not supported for this app category" on app launch. I have INStartCallIntent in my app's Info.plist, I have all the entitlements, and I have "business" as the app category, but I can find zero help online with this. What does this error really mean, and how can I fix it, please?
2
0
629
Jun ’24
"accelerate everyday tasks" in apps without intents?
From https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/: Powered by Apple Intelligence, Siri becomes more deeply integrated into the system experience. With richer language-understanding capabilities, Siri is more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks. From https://developer.apple.com/apple-intelligence/: Siri is more natural, more personal, and more deeply integrated into the system. Apple Intelligence provides Siri with enhanced action capabilities, and developers can take advantage of pre-defined and pre-trained App Intents across a range of domains to not only give Siri the ability to take actions in your app, but to make your app’s actions more discoverable in places like Spotlight, the Shortcuts app, Control Center, and more. SiriKit adopters will benefit from Siri’s enhanced conversational capabilities with no additional work. And with App Entities, Siri can understand content from your app and provide users with information from your app from anywhere in the system. Based on this, as well as the video at https://developer.apple.com/videos/play/wwdc2024/10133/ , my understanding is that in order for Siri to be able to execute tasks in applications, those applications must implement the Siri Intents API. Can someone at Apple please clarify: will it be possible for Siri or some other aspect of Apple Intelligence / Core ML / Create ML to take actions in applications which do not support these APIs (e.g. web apps, Citrix apps, legacy apps)? Thank you!
2
1
727
Jun ’24
Custom Dataset Training With YOLO model on Mac M1 Pro
Hi, I have only recently started working on ML on my Mac M1 Pro; previously I was working on a Windows platform. I am having difficulties getting my machine set up so that it's ready for the super-fast training I was hoping for when I got it. Please help me with this and let me know if and where I am going wrong.

I tried custom-dataset training using the YOLOv8 model, and I want to train for 100 epochs. The same dataset and hyperparameters take about 2.5 hours on a T4 GPU on Google Colab, whereas I was only at around 60 epochs after 24 hours on my M1 Pro. I have Homebrew, Miniconda, and PyTorch nightly for Mac installed, and I set the device to mps when training the YOLO model. This feels really slow. What should I be doing differently? Thank you, Lakshmi
0
1
669
May ’24
jax installation
I followed the instructions at https://developer.apple.com/metal/jax/ and got:

Successfully installed importlib-metadata-7.1.0 jax-0.4.28 jax-metal-0.0.7 jaxlib-0.4.28 opt-einsum-3.3.0 scipy-1.13.0 six-1.16.0 zipp-3.18.2

but the test failed:

python -c 'import jax; print(jax.numpy.arange(10))'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/__init__.py", line 37, in <module>
    import jax.core as _core
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/core.py", line 18, in <module>
    from jax._src.core import (
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/_src/core.py", line 39, in <module>
    from jax._src import dtypes
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/_src/dtypes.py", line 33, in <module>
    from jax._src import config
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/_src/config.py", line 27, in <module>
    from jax._src import lib
  File "/Users/erivas/jax-metal/lib/python3.9/site-packages/jax/_src/lib/__init__.py", line 84, in <module>
    cpu_feature_guard.check_cpu_features()
RuntimeError: This version of jaxlib was built using AVX instructions, which your CPU and/or operating system do not support. You may be able work around this issue by building jaxlib from source.
0
0
872
May ’24
Crash due to SNMLModelFactory "try!" expression
Hello there,

We currently have a crash in prod when executing the following line:

let classificationRequest = try SNClassifySoundRequest(classifierIdentifier: .version1)

It appears to only happen on iOS 17+, and only when regaining audio focus after an interruption while in a background state. We are aware this call probably fails because it is happening from a background state; however, I would then expect SNClassifySoundRequest to throw some kind of error, since its initializer already throws. If it is possible for the call to fail under certain circumstances, could SNMLModelFactory throw an error instead of using try!?

Full trace below:

SoundAnalysis/SNMLModelFactory.swift:112: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.CoreML Code=0 "Failed to build the model execution plan using a model architecture file '/System/Library/Frameworks/SoundAnalysis.framework/SNSoundClassifierVersion1Model.mlmodelc/model1/model.espresso.net' with error code: -1." UserInfo={NSLocalizedDescription=Failed to build the model execution plan using a model architecture file '/System/Library/Frameworks/SoundAnalysis.framework/SNSoundClassifierVersion1Model.mlmodelc/model1/model.espresso.net' with error code: -1.}
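For what it's worth, one possible mitigation while waiting on a framework-level fix (a sketch under the assumption that the trap only occurs when the request is built from the background) is to build the request lazily and only while the app is active, retrying on the next foreground transition:

import SoundAnalysis
import UIKit

final class SoundClassifierProvider {
    private var request: SNClassifySoundRequest?

    // Returns the cached request, or tries to build it only while the app is active.
    // Callers should retry (e.g. on didBecomeActiveNotification) when this returns nil.
    @MainActor
    func requestIfAvailable() -> SNClassifySoundRequest? {
        if let request { return request }
        guard UIApplication.shared.applicationState == .active else { return nil }
        request = try? SNClassifySoundRequest(classifierIdentifier: .version1)
        return request
    }
}

This only sidesteps the background code path described above; it does not explain why the framework traps instead of throwing.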
1
3
702
May ’24
Cannot assign a device for operation RandomUniform (M3 MacBook Pro, 14.4.1)
Cannot assign a device for operation encoder/down1/downs_0/conv1/weight/Initializer/random_uniform/RandomUniform: Could not satisfy explicit device specification '' because the node {{colocation_node encoder/down1/downs_0/conv1/weight/Initializer/random_uniform/RandomUniform}} was colocated with a group of nodes that required incompatible device '/device:GPU:0'. All available devices [/job:localhost/replica:0/task:0/device:CPU:0, /job:localhost/replica:0/task:0/device:GPU:0].

Colocation Debug Info:
Colocation group had the following types and supported devices:
Root Member(assigned_device_name_index_=-1 requested_device_name_='/device:GPU:0' assigned_device_name_='' resource_device_name_='/device:GPU:0' supported_device_types_=[CPU] possible_devices_=[]
Identity: GPU CPU
Mul: GPU CPU
AddV2: GPU CPU
Sub: GPU CPU
RandomUniform: GPU CPU
Assign: CPU
VariableV2: GPU CPU
Const: GPU CPU
0
0
493
May ’24
jax-metal fails to install in a clean M1 environment
Hi all, I'm having trouble even getting the latest version of jax-metal to install on my M1 MacBook Pro. In a clean conda environment, I pip install jax-metal and get:

In [1]: import jax; print(jax.numpy.arange(10))
Platform 'METAL' is experimental and not all JAX functionality may be correctly supported!
---------------------------------------------------------------------------
XlaRuntimeError                           Traceback (most recent call last)
    [... skipping hidden 1 frame]
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/xla_bridge.py:977, in _init_backend(platform)
    976 logger.debug("Initializing backend '%s'", platform)
--> 977 backend = registration.factory()
    978 # TODO(skye): consider raising more descriptive errors directly from backend
    979 # factories instead of returning None.
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/xla_bridge.py:666, in register_plugin.<locals>.factory()
    665 if not xla_client.pjrt_plugin_initialized(plugin_name):
--> 666     xla_client.initialize_pjrt_plugin(plugin_name)
    667 updated_options = {}
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jaxlib/xla_client.py:176, in initialize_pjrt_plugin(plugin_name)
    169 """Initializes a PJRT plugin.
    170
    171 The plugin needs to be loaded first (through load_pjrt_plugin_dynamically or
    (...)
    174   plugin_name: the name of the PJRT plugin.
    175 """
--> 176 _xla.initialize_pjrt_plugin(plugin_name)
XlaRuntimeError: INVALID_ARGUMENT: Mismatched PJRT plugin PJRT API version (0.47) and framework PJRT API version 0.51).

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
Cell In[1], line 1
----> 1 import jax; print(jax.numpy.arange(10))
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/numpy/lax_numpy.py:2952, in arange(start, stop, step, dtype)
   2950 ceil_ = ufuncs.ceil if isinstance(start, core.Tracer) else np.ceil
   2951 start = ceil_(start).astype(int)  # type: ignore
-> 2952 return lax.iota(dtype, start)
   2953 else:
   2954 if step is None and start == 0 and stop is not None:
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/lax/lax.py:1282, in iota(dtype, size)
   1277 def iota(dtype: DTypeLike, size: int) -> Array:
   1278   """Wraps XLA's `Iota
   1279   <https://www.tensorflow.org/xla/operation_semantics#iota>`_
   1280   operator.
   1281   """
-> 1282   return broadcasted_iota(dtype, (size,), 0)
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/lax/lax.py:1292, in broadcasted_iota(dtype, shape, dimension)
   1289 static_shape = [None if isinstance(d, core.Tracer) else d for d in shape]
   1290 dimension = core.concrete_or_error(
   1291     int, dimension, "dimension argument of lax.broadcasted_iota")
-> 1292 return iota_p.bind(*dynamic_shape, dtype=dtype, shape=tuple(static_shape),
   1293     dimension=dimension)
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/core.py:387, in Primitive.bind(self, *args, **params)
    384 def bind(self, *args, **params):
    385   assert (not config.enable_checks.value or
    386       all(isinstance(arg, Tracer) or valid_jaxtype(arg) for arg in args)), args
--> 387   return self.bind_with_trace(find_top_trace(args), args, params)
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/core.py:391, in Primitive.bind_with_trace(self, trace, args, params)
    389 def bind_with_trace(self, trace, args, params):
    390   with pop_level(trace.level):
--> 391     out = trace.process_primitive(self, map(trace.full_raise, args), params)
    392   return map(full_lower, out) if self.multiple_results else full_lower(out)
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/core.py:879, in EvalTrace.process_primitive(self, primitive, tracers, params)
    877   return call_impl_with_key_reuse_checks(primitive, primitive.impl, *tracers, **params)
    878 else:
--> 879   return primitive.impl(*tracers, **params)
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/dispatch.py:86, in apply_primitive(prim, *args, **params)
    84 prev = lib.jax_jit.swap_thread_local_state_disable_jit(False)
    85 try:
---> 86   outs = fun(*args)
    87 finally:
    88   lib.jax_jit.swap_thread_local_state_disable_jit(prev)
    [... skipping hidden 17 frame]
File ~/opt/anaconda3/envs/metal/lib/python3.11/site-packages/jax/_src/xla_bridge.py:902, in backends()
    900 else:
    901   err_msg += " (you may need to uninstall the failing plugin package, or set JAX_PLATFORMS=cpu to skip this backend.)"
--> 902   raise RuntimeError(err_msg)
    904 assert _default_backend is not None
    905 if not config.jax_platforms.value:
RuntimeError: Unable to initialize backend 'METAL': INVALID_ARGUMENT: Mismatched PJRT plugin PJRT API version (0.47) and framework PJRT API version 0.51). (you may need to uninstall the failing plugin package, or set JAX_PLATFORMS=cpu to skip this backend.)

jax.__version__ is 0.4.27.
4
0
1.2k
May ’24
Mismatch between CPU inference and GPU inference using metal-trained GRU
Regardless of the installation version combinations of tensorflow & metal (2.14, 2.15, 2.16), I find a metal/non-metal incompatibility for some layer types. For the GRU layer, for example, metal-trained weights (model.save_weights()/load_weights()) are not compatible with inference using the CPU. That is, train a model using metal, run inference using metal, then run inference again after uninstalling metal, and the results differ -- sometimes a night and day difference. This essentially eliminates the usefulness of tensorflow-metal for me. From my limited testing, models using other, simple combinations of layer types including Dense and LSTM do not show this problem. Just the GRU. And by "testing" I mean really simple models, like one GRU layer. Apple Framework Metal Team: You are doing very useful work, and I kindly ask, please address this bug :)
1
1
561
May ’24
Tensor Metal Plugin installation error
I am using a MacBook Pro with an M2 Pro chip. I was trying to work with TensorFlow, but I encountered an "illegal hardware instruction" error. To resolve it, I initiated the installation of the Metal plugin, which is throwing the following error:

    or semicolon (after version specifier)
    awscli>=1.16.100boto3>=1.9.100
               ~~~~~~~~~~~^
    Unable to locate awscli
    [end of output]
0
0
607
Apr ’24
Pose Estimation
Hello! I am developing an app that leverages Apple's 2D pose estimation model, and I would love to speak with someone about whether my mobile app should leverage Apple's 3D pose estimation model. I would also love to know whether Apple is considering adding more points on the body, as this would be incredibly helpful, or whether it is possible for me to train the model to add more body points. Thanks so much, and please let me know if anyone is available to discuss.
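In case it helps the comparison, here is a minimal sketch of running Apple's 3D body-pose request from the Vision framework (iOS 17+/macOS 14+); the function name and the CGImage input are just placeholders, not anything from the original post:

import Vision
import CoreGraphics

// Runs the built-in 3D human body pose request on a single image and returns
// the first detected person's observation (3D joint positions), if any.
@available(iOS 17.0, macOS 14.0, *)
func detect3DBodyPose(in image: CGImage) throws -> VNHumanBodyPose3DObservation? {
    let request = VNDetectHumanBodyPose3DRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.first
}

As far as I know, the joint set returned by these observations is fixed by Apple's model, so extra body points would likely mean layering a custom model on top rather than extending the built-in one.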
0
0
202
Apr ’24