Hello @slufter, you can create a haptic event from an audio file by registering the audio resource with registerAudioResource(_:options:), then using the resulting resource ID to initialize a CHHapticEvent with init(audioResourceID:parameters:relativeTime:).
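For instance, here is a minimal sketch of that flow. The playAudioAsHapticEvent(engine:audioURL:) helper is hypothetical; it assumes you already have a created CHHapticEngine and a short audio file to register:

```swift
import CoreHaptics

// A minimal sketch: play an audio file as a custom audio haptic event.
// "engine" is an already-created CHHapticEngine and "audioURL" points at
// a short audio file bundled with the app (both are assumptions here).
func playAudioAsHapticEvent(engine: CHHapticEngine, audioURL: URL) throws {
    // Register the audio file with the engine to obtain a resource ID.
    let resourceID = try engine.registerAudioResource(audioURL)

    // Wrap the registered resource in an audio event starting at time 0.
    let audioEvent = CHHapticEvent(audioResourceID: resourceID,
                                   parameters: [],
                                   relativeTime: 0)

    // Build a pattern around the event and play it.
    let pattern = try CHHapticPattern(events: [audioEvent], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try engine.start()
    try player.start(atTime: CHHapticTimeImmediate)
}
```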
Going from a haptic to an audio representation is a more nuanced problem, in the sense that a single haptic representation can yield an arbitrary number of audio representations, just as the same rhythmic pattern can appear in completely different melodies.
You mentioned going from the Apple Haptic and Audio Pattern (AHAP) file format to MIDI, then using the MIDI messages to drive synthesizers. The missing piece is pitch: an AHAP file describes timing, intensity, and sharpness, but has no concept of pitch, so any MIDI representation derived from it has to invent one.
Consider, for example, the two haptic events below, taken from the Playing a Custom Haptic Pattern from a File sample code project:
```json
{
    "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [
            { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
            { "ParameterID": "HapticSharpness", "ParameterValue": 0.4 }
        ]
    }
},
{
    "Event": {
        "Time": 0.015,
        "EventType": "HapticContinuous",
        "EventDuration": 0.25,
        "EventParameters": [
            { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
            { "ParameterID": "HapticSharpness", "ParameterValue": 0.4 }
        ]
    }
}
```
Since AHAP is a JSON-compliant file format, you can read an AHAP file with JSONDecoder or JSONSerialization. You can then use the time, event type, duration, and intensity information of each event to create instances of AVMIDINoteEvent, which you can add to an AVMusicTrack using addEvent(_:at:), as sketched below.
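As a rough illustration of that approach, the sketch below decodes the pattern with JSONDecoder and turns each event into an AVMIDINoteEvent. The Codable types and the addNotes(from:to:) helper are hypothetical, modeling only the AHAP keys shown above, and the fixed key number and velocity scaling are arbitrary choices made for this example:

```swift
import AVFAudio

// Hypothetical Codable types modeling only the AHAP keys used here;
// property names mirror the JSON keys to avoid CodingKeys boilerplate.
struct AHAPFile: Decodable {
    let Pattern: [AHAPPatternEntry]
}

struct AHAPPatternEntry: Decodable {
    let Event: AHAPEvent?   // entries can also hold parameter curves
}

struct AHAPEvent: Decodable {
    let Time: Double
    let EventType: String
    let EventDuration: Double?
    let EventParameters: [AHAPEventParameter]?
}

struct AHAPEventParameter: Decodable {
    let ParameterID: String
    let ParameterValue: Double
}

// Convert each haptic event into a MIDI note event on the given track.
func addNotes(from ahapURL: URL, to track: AVMusicTrack) throws {
    let data = try Data(contentsOf: ahapURL)
    let file = try JSONDecoder().decode(AHAPFile.self, from: data)

    for entry in file.Pattern {
        guard let event = entry.Event else { continue }

        // Map HapticIntensity (0...1) to a MIDI velocity (0...127).
        let intensity = event.EventParameters?
            .first(where: { $0.ParameterID == "HapticIntensity" })?
            .ParameterValue ?? 1.0
        let velocity = UInt32((intensity * 127).rounded())

        // Transient events carry no duration; give them a short note.
        let duration = event.EventDuration ?? 0.05

        let note = AVMIDINoteEvent(channel: 0,
                                   key: 60, // arbitrary; see below
                                   velocity: velocity,
                                   duration: AVMusicTimeStamp(duration))
        track.addEvent(note, at: AVMusicTimeStamp(event.Time))
    }
}
```

One caveat: AHAP times and durations are expressed in seconds, whereas AVMusicTimeStamp is measured in beats, so the sketch implicitly assumes a tempo of 60 BPM. A real implementation would scale the AHAP times to the destination track's tempo.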
The initializer of AVMIDINoteEvent expects a MIDI key number between 0 and 127. This is information you cannot gather from the AHAP file, so it is something your implementation has to provide. You are not restricted to a single MIDI event either, as one haptic event can map to multiple musical events.
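For instance, one deliberately arbitrary strategy is to derive the key number from the event's HapticSharpness value, so that sharper events map to higher pitches. The keyNumber(forSharpness:) helper below is hypothetical:

```swift
// Spread HapticSharpness (0...1) across the two octaves above
// middle C (key 60), clamping to the valid MIDI range 0...127.
func keyNumber(forSharpness sharpness: Double) -> UInt32 {
    let key = (60.0 + sharpness * 24.0).rounded()
    return UInt32(min(max(key, 0), 127))
}
```

You could just as well map intensity, event type, or event order to pitch; the point is that the choice lives entirely in your code, not in the AHAP file.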
To answer your original question, there is no mechanism in Core Haptics to go from a haptic to an audio or MIDI representation. If you have any suggestions for improving Core Haptics, including requests for sample code or documentation, please use Feedback Assistant to submit an enhancement request. From an API standpoint, you can drive a synthesizer with an AHAP file by converting the JSON information in the file to AVMIDINoteEvent instances, as outlined above, but the MIDI key number you choose for each event is an implementation detail.