Hi everyone,
We have a user experiencing a display issue. Here's a screenshot they shared with us. Unlike in the simulator, where we see three icons, their display shows two buttons abbreviated with ellipses. The device is an iPhone 12 mini running iOS 17.6.1.
The user isn't using any accessibility settings or large text size. Does anyone know what setting might be causing this? Any advice would be appreciated!
Thanks!
I'd like an effect similar to the iOS 18 Siri, with a border/stroke extending around the view including the corner radius of the screen.
This is challenging as "logically" the view is rectangular and those radii don't really exist.
I've seen some posts around the web about _displayCornerRadius, which appears to be what I need, but it is private and, as far as I can tell, not documented anywhere.
Does anyone know of a way to achieve a border around a view that correctly wraps around the corners of the rounded screen?
Currently using SwiftUI but open to any solutions.
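To illustrate, here's a minimal sketch of the hard-coded approach I'd like to avoid; the radius value is just a guess, since there's no public API for the real display corner radius:
import SwiftUI

// Sketch only: overlays a stroke that follows an assumed screen corner radius.
// assumedCornerRadius is a per-device guess, not a value obtained from the system.
struct ScreenEdgeBorder: View {
    var assumedCornerRadius: CGFloat = 55 // placeholder guess

    var body: some View {
        RoundedRectangle(cornerRadius: assumedCornerRadius, style: .continuous)
            .strokeBorder(.blue, lineWidth: 4)
            .ignoresSafeArea()
    }
}

// Usage: attach as a full-screen overlay, e.g. SomeRootView().overlay(ScreenEdgeBorder())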
Third-party WidgetKit complications on watchOS 11 beta 5 are not appearing in the list of available complications. They have also disappeared from watch faces where they were installed. The exact same complications were working fine on earlier betas. This is happening on device, but not in simulator.
This issue may be related to FB14684253, which was fixed with the release of Xcode beta 5. However, Xcode beta 5 does not fix the issue on Apple Watch.
As a sanity check, I also tried with the Backyard Birds sample project, and the complications for that app aren't appearing on device either.
Filed as FB14689021.
My app exports a custom file type (an exported UTI) for files it shares between users of the same app. However, in the UIActivityViewController share sheet, my app is listed far down the list, off screen. How can I make my app, which owns the file type, appear first in the list instead of the many irrelevant apps that can't actually open my file?
Plist snippets:
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>My Custom App File</string>
        <key>LSHandlerRank</key>
        <string>Owner</string>
        <key>LSItemContentTypes</key>
        <array>
            <string>abc.myapp.myextension</string>
        </array>
    </dict>
</array>
<key>UTExportedTypeDeclarations</key>
<array>
    <dict>
        <key>UTTypeConformsTo</key>
        <array>
            <string>public.content</string>
            <string>public.data</string>
        </array>
        <key>UTTypeDescription</key>
        <string>My Custom App File</string>
        <key>UTTypeIconFiles</key>
        <array/>
        <key>UTTypeIdentifier</key>
        <string>abc.myapp.myextension</string>
        <key>UTTypeTagSpecification</key>
        <dict>
            <key>public.filename-extension</key>
            <array>
                <string>myextension</string>
            </array>
            <key>public.mime-type</key>
            <array>
                <string>application/octet-stream</string>
            </array>
        </dict>
    </dict>
</array>
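For context, the share sheet is presented roughly like this (a simplified sketch; the file URL is a placeholder for the real exported file):
import UIKit
import UniformTypeIdentifiers

// Sketch: share a file whose extension maps to the exported type declared above.
// fileURL is a placeholder for the ".myextension" file the app writes out.
func presentShareSheet(for fileURL: URL, from viewController: UIViewController) {
    // Sanity check: the extension should resolve to the exported identifier.
    let resolvedType = UTType(filenameExtension: "myextension")
    print("Resolved type: \(resolvedType?.identifier ?? "none")") // expecting abc.myapp.myextension

    let activityController = UIActivityViewController(activityItems: [fileURL], applicationActivities: nil)
    viewController.present(activityController, animated: true)
}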
How can we performantly scroll to a target location using TextKit 2?
Hi everyone,
I'm building a custom text editor using TextKit 2 and would like to scroll to a target location efficiently. For instance, I would like to move to the end of a document seamlessly, similar to how users can do in standard text editors by using CMD + Down.
Background:
NSTextView and TextEdit on macOS can navigate to the end of large documents in milliseconds. However, after reading the documentation and experimenting with various ideas using TextKit 2's APIs, it's not clear how third-party developers are supposed to achieve this.
My Code:
Here's the code I use to move the selection to the end of the document and scroll the viewport to reveal the selection.
override func moveToEndOfDocument(_ sender: Any?) {
    textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)
    let targetLocation = textLayoutManager.documentRange.endLocation
    let beforeTargetLocation = textLayoutManager.location(targetLocation, offsetBy: -1)!
    textLayoutManager.textViewportLayoutController.layoutViewport()
    guard let textLayoutFragment = textLayoutManager.textLayoutFragment(for: beforeTargetLocation) else {
        return
    }
    guard let textLineFragment = textLayoutFragment.textLineFragment(for: targetLocation, isUpstreamAffinity: true) else {
        return
    }
    let lineFrame = textLayoutFragment.layoutFragmentFrame
    let lineFragmentFrame = textLineFragment.typographicBounds.offsetBy(dx: 0, dy: lineFrame.minY)
    scrollToVisible(lineFragmentFrame)
}
While this code works as intended, it is very inefficient because ensureLayout(_:) is incredibly expensive and can take seconds for large documents.
Issues Encountered:
In my attempts, I have come across the following two issues.
Estimated Frames: The frames of NSTextLayoutFragment and NSTextLineFragment are approximate and not precise enough for scrolling unless the text layout fragment has been fully laid out.
Laying out all text is expensive: The frames become accurate once NSTextLayoutManager's ensureLayout(for:) method has been called with a range covering the entire document. However, ensureLayout(for:) is resource-intensive and can take seconds for large documents. NSTextView, on the other hand, accomplishes the same scrolling to the end of a document in milliseconds.
I've tried using NSTextViewportLayoutController's relocateViewport(to:) without success. It's unclear to me whether this function is intended for a use case like mine. If it is, I would appreciate some guidance on its proper usage.
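For reference, my attempt looked roughly like this (a simplified sketch; I may well be misusing the API):
// Sketch of the relocateViewport(to:) attempt. Assumes the same textLayoutManager
// as in moveToEndOfDocument(_:) above.
func relocateViewportToEnd() {
    let targetLocation = textLayoutManager.documentRange.endLocation
    let viewportController = textLayoutManager.textViewportLayoutController
    // Move the viewport near the target location instead of laying out the whole document.
    viewportController.relocateViewport(to: targetLocation)
    viewportController.layoutViewport()
    // The scroll position still has to be updated manually, and I don't know how to get
    // an accurate y-offset for that without ensuring layout for the full document.
}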
Configuration:
I'm testing on macOS Sonoma 14.5 (23F79), Swift (AppKit), Xcode 15.4 (15F31d).
I'm working on a multi-platform project written in AppKit and UIKit, so I'm looking for either a single solution that works in both AppKit and UIKit or two solutions, one for each UI framework.
Question:
How can third-party developers scroll to a target location, specifically the end of a document, performantly using TextKit 2?
Steps to Reproduce:
The issue can be reproduced using the example project (download from link below) by following these steps:
Open the example project.
Run the example app on a Mac. The example app shows an uneditable text view in a scroll view. The text view displays a long text.
Press the "Move to End of Document" toolbar item.
Notice that the text view has scrolled to the bottom, but this took several seconds (~3 seconds on my MacBook Pro 16-inch, 2021). The duration will be shown in Xcode's log.
You can open the ExampleTextView.swift file and find the implementation of moveToEndOfDocument(_:). Comment out line 84 where the ensureLayout(_:) is called, rerun the app, and then select "Move to End of Document" again. This time, you will notice that the text view moves fast but does not end up at the bottom of the document.
You may also open the large-file.json in the project, the same file that the example app displays, in TextEdit, and press CMD+Down to move to the end of the document. Notice that TextEdit does this in mere milliseconds.
Example Project:
The example project is located on GitHub:
https://github.com/simonbs/apple-developer-forums/tree/main/how-can-we-performantly-scroll-to-a-target-location-using-textkit-2
Any advice or guidance on how to achieve this with TextKit 2 would be greatly appreciated.
Thanks in advance!
Best regards,
Simon
It seems that when an entity has an ordered to-many relationship to the same entity, inserting an object into the ordered set causes other objects in the set to turn into faults during the next save of the managed object context.
I verified it with several applications.
For the sake of example, the entity will be called Folder and the ordered to-many relationship subfolders (an NSOrderedSet), with a cascade delete rule. The reciprocal to-one relationship is called parent.
Assuming you have a Folder object with two subfolders, removing the last subfolder from the set (setting its parent to nil) and reinserting it at index 0 with insertObject:inSubfoldersAtIndex: will turn the other subfolder into a fault at the next save.
Now assuming that other subfolder has a name attribute (NSString) that is bound to a text field in your UI, the name of that subfolder will disappear when the context saves, since it becomes nil while the subfolder is turned into a fault.
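In Swift terms, the sequence is roughly this (a sketch for illustration; my project is Objective-C and the generated accessor name is an assumption):
import CoreData

// Sketch: Folder, parent, and subfolders are the entities/relationships described above;
// insertIntoSubfolders(_:at:) stands in for the generated ordered-set accessor.
func moveLastSubfolderToFront(of folder: Folder, in context: NSManagedObjectContext) throws {
    guard let last = folder.subfolders?.lastObject as? Folder else { return }
    last.parent = nil                        // remove it from the ordered set
    folder.insertIntoSubfolders(last, at: 0) // reinsert it at index 0
    try context.save()                       // after this save, the other subfolder becomes a fault
}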
Is this expected behavior?
Note: I'm using Objective-C, Xcode 15, and macOS Sonoma, but I've seen this issue occur on previous macOS versions.
My app started crashing a ton with Xcode 16 beta 1 / iOS 18 because of "Thread 1: Fatal error: Never access a full future backing data". I was hoping this would be resolved with beta 2, but unfortunately this is not the case. I'm having a tough time reproducing this bug in a small sample project – I'd appreciate any hints as to what might be causing this.
Full error:
Thread 1: Fatal error: Never access a full future backing data - PersistentIdentifier(id: SwiftData.PersistentIdentifier.ID(url: x-coredata://10A5A93C-DC7F-40F3-92DB-F4125E1C7A73/MyType/p2), implementation: SwiftData.PersistentIdentifierImplementation) with Optional(3BF44A2D-256B-4C40-AF40-9B7518FD9FE6)
My app uses Core Data and has enabled App Groups for data sharing between the App and Widget.
In my app, there's a Core Data entity called Task. As per the documentation's suggestion, I've separately implemented a TaskData struct that conforms to AppEntity. I've also implemented TaskDataQuery: EntityQuery, which includes a method called suggestedEntities. This method fetches all Tasks from the main context and uses Task.toTaskData.
Following the documentation, I've implemented the corresponding WidgetConfigurationIntent, which holds:
@Parameter(title: "Task")
var task: TaskData
as well as the corresponding AppIntentTimelineProvider to implement the provider.
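For reference, the query looks roughly like this (a simplified sketch; the shared-container helper and fetch details are placeholders for my real code):
import AppIntents
import CoreData

// Sketch of the query. The important detail for the widget is that the Core Data store
// lives in the shared App Group container so the widget process can read it;
// SharedPersistence is a placeholder for however the container is set up.
struct TaskDataQuery: EntityQuery {
    func entities(for identifiers: [TaskData.ID]) async throws -> [TaskData] {
        try await suggestedEntities().filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [TaskData] {
        let context = SharedPersistence.container.viewContext
        let request = NSFetchRequest<Task>(entityName: "Task")
        let tasks = try context.fetch(request)
        return tasks.map { $0.toTaskData }
    }
}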
I haven't encountered any retrieval issues on the simulator; everything works perfectly.
However, the problem arises when I deploy to a physical device. Users report that their widgets can't retrieve any data. Specifically, when users long-press the widget to set up a task, it remains in a Loading state, unable to fetch any Core Data.
I've looked through some resources, and it seems this might be a common issue with iOS 17?
How can I resolve this issue? Has anyone encountered this or can offer any suggestions? This has been troubling me for several days now, and it's causing my product to continually lose users. I'm really upset about it.
Any advice is welcome.
In a music streaming app, when using Activity.request to activate the Dynamic Island, the system’s Now Playing interface appears correctly. However, the app's live activities, lock screen, and other related features fail to display properly.
During debugging, the following code is used:
do {
    activity = try Activity.request(attributes: attributes, contentState: contentState, pushType: .token)
    if !NMABTestManager.default().is(inTest: "FH-NewLiveActivityPush") {
        // Listen to push token updates
        if activity != nil {
            tokenUpdatesTask?.cancel()
            tokenUpdatesTask = Task.detached {
                for await tokenData in self.activity!.pushTokenUpdates {
                    let mytoken = tokenData.map { String(format: "%02x", $0) }.joined().uppercased()
                    // pushToken is Data and needs to be converted to a String as above before being passed to the server
                    self.pushToken = mytoken
                }
            }
        }
    }
} catch (let error) {
    print("Error Starting Live Activity: \(error.localizedDescription)")
}
In this scenario, the push token is returned correctly, and no errors are triggered.
This issue did not occur in iOS 17 but appears sporadically in iOS 18. Once it occurs, it cannot be resolved through restarting or other means.
Feedback ID: FB14763873; I've uploaded my sysdiagnose.
I am writing to inquire if there is any way to programmatically check whether a user has enabled the “Large App Icon” mode in iOS 18. Our development team is working on optimizing our app’s user interface, and it would be beneficial to adapt the design based on this setting.
Any guidance on how to access this information, or if it’s even possible within the current iOS APIs, would be greatly appreciated.
Thank you for your time and assistance.
Hello,
I've noticed a few rare crashes with the following stacktrace reported on AppStore connect:
Hardware Model: iPhone16,2
AppStoreTools: 15F31e
AppVariant: 1:iPhone16,2:17.4
OS Version: iPhone OS 17.5.1 (21F90)
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: <App> [15575]
Triggered by Thread: 0
Last Exception Backtrace:
0 CoreFoundation 0x1a3d38f20 __exceptionPreprocess + 164 (NSException.m:249)
1 libobjc.A.dylib 0x19bbbe018 objc_exception_throw + 60 (objc-exception.mm:356)
2 Foundation 0x1a323f868 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:] + 188 (NSException.m:252)
3 CoreAutoLayout 0x1c4eabcc8 -[NSLayoutConstraint _setSymbolicConstant:constant:symbolicConstantMultiplier:] + 552 (NSLayoutConstraint.m:669)
4 CoreAutoLayout 0x1c4eab674 -[NSLayoutConstraint setConstant:] + 96 (NSLayoutConstraint.m:750)
5 <App> 0x10486d578 closure #1 in BaseChatTableViewCell.requestPreview(for:with:) + 540 (BaseChatTableViewCell+File.swift:162)
6 <App> 0x10486d73c thunk for @escaping @callee_guaranteed (@in_guaranteed URLRequest, @guaranteed NSHTTPURLResponse?, @guaranteed UIImage) -> () + 164 (<compiler-generated>:0)
7 <App> 0x104c4f814 __85-[UIImageView(AFNetworking) setImageWithURLRequest:placeholderImage:success:failure:]_block_invoke + 176 (UIImageView+AFNetworking.m:118)
8 <App> 0x104c3cc74 __78-[AFImageDownloader downloadImageForURLRequest:withReceiptID:success:failure:]_block_invoke.88 + 52 (AFImageDownloader.m:276)
9 libdispatch.dylib 0x1abbdc13c _dispatch_call_block_and_release + 32 (init.c:1530)
10 libdispatch.dylib 0x1abbdddd4 _dispatch_client_callout + 20 (object.m:576)
11 libdispatch.dylib 0x1abbec5a4 _dispatch_main_queue_drain + 988 (queue.c:7898)
12 libdispatch.dylib 0x1abbec1b8 _dispatch_main_queue_callback_4CF + 44 (queue.c:8058)
13 CoreFoundation 0x1a3d0b710 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16 (CFRunLoop.c:1780)
14 CoreFoundation 0x1a3d08914 __CFRunLoopRun + 1996 (CFRunLoop.c:3149)
15 CoreFoundation 0x1a3d07cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
16 GraphicsServices 0x1e8bb81a8 GSEventRunModal + 164 (GSEvent.c:2196)
17 UIKitCore 0x1a634090c -[UIApplication _run] + 888 (UIApplication.m:3713)
18 UIKitCore 0x1a63f49d0 UIApplicationMain + 340 (UIApplication.m:5303)
19 <App> 0x1047c6c20 main + 80 (main.m:11)
20 dyld 0x1c73b9e4c start + 2240 (dyldMain.cpp:1298)
It crashes on the last line of
let previewSize = BaseChatTableViewCell.getPreviewSize(from: imageSize, isMediaFile)
self.filePreviewImageViewHeightConstraint?.constant = previewSize.height
self.filePreviewImageViewWidthConstraint?.constant = previewSize.width
where previewSize is a CGSize. I am unable to reproduce the crash, nor am I able to understand why it crashes there. Anyone got an idea what could cause a crash on setting a constant?
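In case it helps, this is a purely diagnostic sketch I'm considering, on the guess (and it is only a guess) that a non-finite value might be reaching the constraint:
// Diagnostic sketch: trap a non-finite size before it hits the constraints.
let previewSize = BaseChatTableViewCell.getPreviewSize(from: imageSize, isMediaFile)
guard previewSize.height.isFinite, previewSize.width.isFinite else {
    assertionFailure("Unexpected preview size \(previewSize) for image size \(imageSize)")
    return
}
self.filePreviewImageViewHeightConstraint?.constant = previewSize.height
self.filePreviewImageViewWidthConstraint?.constant = previewSize.width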
Why does PDFKit delete signature widgets that have already been digitally signed?
This should not happen.
Is there an undocumented flag that needs to be set so that PDFKit doesn't remove them when loading or saving the PDF?
It's difficult to tell if it is happening at
PDFDocument(url: fileURL)
or
document.write(to: outputURL)
If a document is signed and still allows annotations, form filling, comments, etc., we should be able to load the PDF into a PDFDocument and save it without losing the certs.
Instead, the certs are gone and only the signature annotation widgets remain.
Here is a simple example of loading and then saving the PDF without any changes, and it shows that the data is actually being changed:
import UIKit
import PDFKit

class ViewController: UIViewController {
    var pdfView: PDFView!

    @IBOutlet weak var myButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        pdfView = PDFView(frame: self.view.bounds)
        pdfView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view.addSubview(pdfView)
        self.view.bringSubviewToFront(myButton)

        // Load and compare the PDF data
        if let originalData = loadPDF() {
            if let loadedData = getRawDataFromLoadedPDF() {
                let isDataEqual = comparePDFData(originalData, loadedData)
                print("Is original data equal to loaded data? \(isDataEqual)")
            }
        }
    }

    @IBAction func onSave(_ sender: Any) {
        if let savedData = savePDF() {
            if let originalData = loadPDF() {
                let isDataEqual = comparePDFData(originalData, savedData)
                print("Is original data equal to saved data? \(isDataEqual)")
            }
        }
    }

    func loadPDF() -> Data? {
        guard let fileURL = Bundle.main.url(forResource: "document", withExtension: "pdf") else {
            print("Error: document.pdf not found in bundle.")
            return nil
        }
        do {
            let originalData = try Data(contentsOf: fileURL)
            if let document = PDFDocument(url: fileURL) {
                pdfView.document = document
                print("PDF loaded successfully.")
                return originalData
            } else {
                print("Error: Unable to load PDF document.")
                return nil
            }
        } catch {
            print("Error reading PDF data: \(error)")
            return nil
        }
    }

    func getRawDataFromLoadedPDF() -> Data? {
        guard let document = pdfView.document else {
            print("Error: No document is currently loaded in pdfView.")
            return nil
        }
        if let data = document.dataRepresentation() {
            return data
        } else {
            print("Error: Unable to get raw data from loaded PDF document.")
            return nil
        }
    }

    func comparePDFData(_ data1: Data, _ data2: Data) -> Bool {
        return data1 == data2
    }

    func savePDF() -> Data? {
        guard let document = pdfView.document else {
            print("Error: No document is currently loaded in pdfView.")
            return nil
        }
        let fileManager = FileManager.default
        let urls = fileManager.urls(for: .documentDirectory, in: .userDomainMask)
        guard let documentsURL = urls.first else {
            print("Error: Could not find the documents directory.")
            return nil
        }
        let outputURL = documentsURL.appendingPathComponent("document_out.pdf")
        if document.write(to: outputURL) {
            print("PDF saved successfully to \(outputURL.path)")
            do {
                let savedData = try Data(contentsOf: outputURL)
                return savedData
            } catch {
                print("Error reading saved PDF data: \(error)")
                return nil
            }
        } else {
            print("Error: Unable to save PDF document.")
            return nil
        }
    }
}
I went to update to Apple Intelligence Beta 15.1 on Sequoia yesterday when it dropped, and I’m still waitlisted. The weird thing is, I had it on my Mac within a few minutes after downloading the beta, but then restarted the Mac and it now says “joined waitlist” and still does today.
My app’s WidgetKit widgets are all crashing on iOS 18 beta 5. They were working just fine on earlier betas. This is happening across both Home and Lock Screen widgets. It's an EXC_BAD_ACCESS crash that seems to be happening deep within WidgetKit.
I've seen other developers posting about this on social media, so it's not just me. Wanted to get this flagged ASAP as it's very late in the beta cycle now...
Filed as FB14684253.
It's the same as the title, and when I checked the log, there were hundreds to thousands of lines with the following content.
The larger the page index of the PDF page from which text is selected, the more log lines are recorded.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.
.
.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
I don't know why this is happening. Is there a solution?
I am currently running iOS 18 Beta 3 and am working on enabling users to paste (and copy) custom emojis (AdaptiveImageGlyph, such as Memoji, Stickers, and soon GenMoji) into a text field.
I am looking for the UTI for AdaptiveImageGlyph—something similar to "public.adaptive-image-glyph". Does anyone know if such a UTI exists?
Here's my situation: When typing AdaptiveImageGlyph using the system keyboard, everything functions correctly. However, if I copy text containing AdaptiveImageGlyph from the Notes app and paste it into my playground app, only the plain text is pasted. The reverse is also true: if I copy AdaptiveImageGlyph from the playground app and paste it elsewhere, only the text is pasted.
Interestingly, copying AdaptiveImageGlyph from the Notes app and pasting it into iMessage works flawlessly, and vice versa. I am trying to achieve the same seamless functionality in my app.
Given that this feature works in iMessage and Notes, I am inclined to believe the issue might be on my side, though I recognize these are system apps and not third-party.
Example Code:
import SwiftUI
import UIKit

struct AdaptiveImageGlyphTextView: UIViewRepresentable {
    class Coordinator: NSObject, UITextViewDelegate {
        var parent: AdaptiveImageGlyphTextView

        init(parent: AdaptiveImageGlyphTextView) {
            self.parent = parent
        }

        func textViewDidChange(_ textView: UITextView) {
            parent.text = textView.text
        }

        func textView(_ textView: UITextView, shouldChangeTextIn range: NSRange, replacementText text: String) -> Bool {
            // Handle insertion of adaptive image glyphs here if needed
            return true
        }
    }

    @Binding var text: String

    func makeCoordinator() -> Coordinator {
        Coordinator(parent: self)
    }

    func makeUIView(context: Context) -> UITextView {
        let textView = UITextView()
        textView.delegate = context.coordinator
        textView.supportsAdaptiveImageGlyph = true
        textView.isEditable = true
        textView.isSelectable = true
        textView.font = UIFont.systemFont(ofSize: 17)

        // Enable paste with NSAdaptiveImageGlyphs
        textView.pasteConfiguration = UIPasteConfiguration(acceptableTypeIdentifiers: [
            "public.text",
            "public.image",
            "public.adaptive-image-glyph" // Replace with the correct UTI if different
        ])
        return textView
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        if uiView.text != text {
            uiView.text = text
        }
    }
}

struct ContentView: View {
    @State private var text: String = ""

    var body: some View {
        AdaptiveImageGlyphTextView(text: $text)
            .frame(height: 200)
            .padding()
    }
}

#Preview {
    ContentView()
}
I'm trying to develop a Live Activity Extension. The problem is, I can't get pushToStartToken. I'm able to get it when I start a Live Activity, but I can't when I don't start a Live Activity.
This function successfully generates the token:
private func startNewLiveActivity() async {
    guard #available(iOS 16.2, *) else { return }

    let attributes = MyWidgetAttributes(
        homeTeam: "Badger",
        awayTeam: "Lion",
        date: "12/09/2023"
    )
    let initialContentState = ActivityContent(
        state: MyWidgetAttributes.ContentState(
            homeTeamScore: 0,
            awayTeamScore: 0,
            lastEvent: "Match Start"
        ),
        staleDate: nil
    )

    guard let activity = try? Activity.request(
        attributes: attributes,
        content: initialContentState,
        pushType: .token
    ) else { return }

    if #available(iOS 17.2, *) {
        Task {
            for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
                let token = data.map { String(format: "%02x", $0) }.joined()
                // THE DESIRED pushToStartToken TOKEN IS GENERATED HERE
            }
        }
    }

    for await data in activity.pushTokenUpdates {
        let token = data.map { String(format: "%02x", $0) }.joined()
        // I send token to server here.
    }
}
But when I try to get pushToStartToken separately, without creating a live activity, it doesn't return any value:
private func getPushToStartToken() async {
    guard #available(iOS 17.2, *) else { return }

    Task {
        for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
            let token = data.map { String(format: "%02x", $0) }.joined()
            // THIS DOESN'T GENERATE ANY TOKENS SINCE THE ACTIVITY IS NOT CREATED
        }
    }
}
I'm trying to build a Live Activity extension. I can successfully start my Live Activity via push notification. The problem is this: when I start a Live Activity from my app, I can observe pushTokenUpdates, since I control everything and run the for loop that reads them. But when the system starts the Live Activity from a push notification, that code isn't called (or if it is, I don't know when or where).
Where am I supposed to get pushTokenUpdates when I start Live Activity using push notification to send them to my server?
The relevant code is below:
for await data in activity.pushTokenUpdates {
    let token = data.map { String(format: "%02x", $0) }.joined()
    Logger().log("Activity token: \(token)")
    // send token to the server
}
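One idea I'm considering is to observe Activity.activityUpdates so the app can pick up activities that the system started and then read their pushTokenUpdates. A rough sketch of that (assuming activityUpdates also delivers push-started activities):
import ActivityKit
import OSLog

// Sketch: watch for activities as they appear (including, I assume, ones the system
// started from a push-to-start notification) and observe each one's token updates.
func observeActivityTokens() {
    Task {
        for await activity in Activity<MyWidgetAttributes>.activityUpdates {
            Task {
                for await data in activity.pushTokenUpdates {
                    let token = data.map { String(format: "%02x", $0) }.joined()
                    Logger().log("Activity token: \(token)")
                    // send token to the server
                }
            }
        }
    }
}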
Hello, my App Clip is showing as unavailable in the App Clip card. Until earlier this afternoon my App Clip experience was showing correctly with an Open action. All of a sudden it started to fail. I did not submit a new version or update the association file.
To reproduce, scan qr.netflix.com/C/123
Good news bad news.
Good news- I have built my first App Clip! After getting our app submission accepted, we got a working "default App Clip URL" which successfully launches our App Clip card and App Clip.
Bad news- All this work was done to associate our App Clip link with our website so we could have a very clean URL, but that URL is not launching our App Clip card or the clip. Everything points to it looking good:
Diagnostics in Apple Developer settings are all green checkmarks: associated domains, App Clip published on the App Store, smart app banner.
My associated domain URL is "validated" on App Store Connect.
My website has a smart app banner meta tag with the bundle identifier, and an Open Graph photo configured.
My App Clip has the domain in its entitlements file.
What I'm expecting: sending a text with the website's URL should show my App Clip card and open the App Clip, not the website.
I shouldn't need to configure an advanced App Clip experience because it's just via Messages, right? According to the documentation, advanced experiences are for Maps, QR codes, etc., right?
From what I can tell, everything seems set up completely right... so how come when I send myself a text message with the website's URL, the App Clip card isn't popping up?