I've experimented quite a bit with the new API designed to neutralize image colors using the iPhone flash, and I think the concept is brilliant. The flash could potentially serve as a substitute for a color checker, given our full control over it. However, I believe there are several areas where this API could be improved.
Firstly, the resulting images often appear "unattractive": colors tend to look faded, and the images themselves can be overly bright and washed out, losing the natural ambiance and shadows and introducing unwanted flash reflections. Color rendering is also inconsistent; yellows, for example, sometimes look unnatural, possibly because of reflections. In some cases, the whole image ends up completely desaturated, effectively black and white, if another light source doesn't fully illuminate the scene. Additionally, areas left in shadow by the flash aren't color-corrected properly, since they fall outside the flash's reach.
I think many of these issues could be resolved if we had access to ProRAW images of both the ambient scene (without flash) and the flash-illuminated scene. With these, we could use specific colors in the image as references, much like a color checker, to build an ICC profile or a color transformation matrix and adjust the image colors globally. This approach could retain the shadows from the ambient light while still correcting colors to a neutral tone.
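To make the matrix idea concrete, here is a minimal sketch in Swift (using simd) of fitting a 3x3 correction matrix by least squares from paired patch colors. Everything here is hypothetical: the function name fitColorMatrix, and the assumption that patches have already been sampled as linear RGB triples from the flash-lit capture with matching reference values, are just for illustration.

```swift
import simd

/// Hypothetical helper: fits a 3x3 matrix M that maps colors measured in the
/// flash-lit capture onto known reference values, in a least-squares sense:
/// M ≈ R Xᵀ (X Xᵀ)⁻¹, where X holds the measured patches and R the references.
func fitColorMatrix(measured: [SIMD3<Double>],
                    reference: [SIMD3<Double>]) -> simd_double3x3? {
    guard measured.count == reference.count, measured.count >= 3 else { return nil }

    // Accumulate X Xᵀ and R Xᵀ over all patch pairs, starting from zero matrices.
    var xxT = simd_double3x3()
    var rxT = simd_double3x3()
    for (x, r) in zip(measured, reference) {
        // Outer products x xᵀ and r xᵀ, built column by column
        // (column k of x xᵀ is x scaled by x's k-th component).
        xxT = xxT + simd_double3x3(columns: (x * x.x, x * x.y, x * x.z))
        rxT = rxT + simd_double3x3(columns: (r * x.x, r * x.y, r * x.z))
    }
    guard xxT.determinant != 0 else { return nil }
    return rxT * xxT.inverse
}

// Example: apply the fitted matrix to a pixel from the ambient (no-flash) capture.
// let m = fitColorMatrix(measured: flashPatches, reference: checkerValues)
// let corrected = m! * ambientPixel
```

Applying the fitted matrix to the ambient capture, rather than the flash-lit one, is how the ambient shadows would be preserved while the colors are still pulled toward neutral.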
Access to ProRAW data is crucial for this, as it would provide images with a linear tone curve and without the saturation issues that can affect some colors. I hope this suggestion makes sense and could help improve the API's effectiveness.
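For completeness, approximating this two-shot workflow manually today would look roughly like the sketch below: an AVCapturePhotoOutput with Apple ProRAW enabled, and one capture each with the flash off and on. Session setup, device configuration, and the photo delegate are omitted, and whether flash is permitted together with RAW capture depends on the device and format, so treat this purely as an illustration of the pair of captures the API could expose directly.

```swift
import AVFoundation

/// Hypothetical two-shot workflow: one ProRAW capture of the ambient scene
/// (flash off) and one flash-lit capture of the same scene.
func captureAmbientAndFlashPair(output: AVCapturePhotoOutput,
                                delegate: any AVCapturePhotoCaptureDelegate) {
    // Opt in to Apple ProRAW when the current device/format supports it.
    if output.isAppleProRAWSupported {
        output.isAppleProRAWEnabled = true
    }

    func settings(flash: AVCaptureDevice.FlashMode) -> AVCapturePhotoSettings? {
        // Pick a ProRAW pixel format from the formats the output offers.
        guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first(where: {
            AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
        }) else { return nil }
        let s = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        // Only request flash modes the output actually supports.
        if output.supportedFlashModes.contains(flash) {
            s.flashMode = flash
        }
        return s
    }

    if let ambient = settings(flash: .off), let lit = settings(flash: .on) {
        output.capturePhoto(with: ambient, delegate: delegate)
        output.capturePhoto(with: lit, delegate: delegate)
    }
}
```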