Use Vision framework to detect a graph in Swift

I would like to offer functionality where the user aims the camera at a graph (including its axes and scales), the app detects the graph, and then replicates it from the captured image.

I have the whole camera setup finished, with an AVCaptureSession, VNDetectContoursRequest, VNImageRequestHandler, etc.
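
For context, here is a simplified sketch of the kind of pipeline I mean (the GraphScanner class, the delegate wiring and the .right orientation are placeholders, not my exact code):

```swift
import AVFoundation
import Vision

// Every camera frame is handed to a VNImageRequestHandler that runs a
// VNDetectContoursRequest; the result is a VNContoursObservation.
final class GraphScanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let contoursRequest = VNDetectContoursRequest()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Orientation depends on the session setup; .right is what I use for
        // portrait with the back camera.
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                            orientation: .right,
                                            options: [:])
        do {
            try handler.perform([contoursRequest])
            if let observation = contoursRequest.results?.first as? VNContoursObservation {
                // This observation contains far more contours than just the graph.
                handle(observation)
            }
        } catch {
            print("Contour detection failed: \(error)")
        }
    }

    private func handle(_ observation: VNContoursObservation) {
        // Filtering happens here, which is what my questions below are about.
    }
}
```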

However, I now get a very large number of results, so I suspect I need to tell the image-processing pipeline what I am looking for, i.e. filter the VNContoursObservation results.
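
To make that concrete, this is the kind of filtering I was imagining, keeping only large top-level contours; the point-count and size thresholds are arbitrary values I would still have to tune:

```swift
import Vision
import simd

// Keep only top-level contours that span a noticeable part of the frame.
func candidateContours(in observation: VNContoursObservation) -> [VNContour] {
    observation.topLevelContours.filter { contour in
        let points = contour.normalizedPoints          // normalized [0, 1] image coordinates
        guard points.count > 20 else { return false }  // drop tiny speckles
        let xs = points.map { $0.x }
        let ys = points.map { $0.y }
        let width  = (xs.max() ?? 0) - (xs.min() ?? 0)
        let height = (ys.max() ?? 0) - (ys.min() ?? 0)
        // An axis or a plotted curve should be reasonably large in the frame.
        return max(width, height) > 0.2
    }
}
```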

I 'think' I first need to detect two perpendicular lines (the two axes). How do I do that? If I do not see them, I can just ignore that input and wait for the next VNContoursObservation.
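
This is the kind of perpendicularity check I have in mind; it is not an existing Vision API, just geometry on the contour points, and the thresholds are guesses:

```swift
import Vision
import simd

// Returns the unit direction of a contour if it is long and nearly straight,
// i.e. a candidate axis; otherwise nil.
func dominantDirection(of contour: VNContour) -> simd_float2? {
    let points = contour.normalizedPoints
    guard let first = points.first, let last = points.last else { return nil }
    let direction = last - first
    guard simd_length(direction) > 0.1 else { return nil }   // too short to be an axis
    let unit = simd_normalize(direction)
    // Reject contours that wander far from the straight line between the endpoints.
    let maxDeviation = points.map { point -> Float in
        let offset = point - first
        let along = simd_dot(offset, unit)
        return simd_length(offset - along * unit)
    }.max() ?? 0
    return maxDeviation < 0.02 ? unit : nil
}

// Looks for two straight contours that are roughly 90 degrees apart.
func findAxes(in contours: [VNContour]) -> (xAxis: VNContour, yAxis: VNContour)? {
    let candidates = contours.compactMap { contour in
        dominantDirection(of: contour).map { (contour, $0) }
    }
    for (a, directionA) in candidates {
        for (b, directionB) in candidates where a !== b {
            // Two unit vectors are roughly perpendicular when their dot product is near zero.
            if abs(simd_dot(directionA, directionB)) < 0.1 {
                // Call the more horizontal one the x axis.
                return abs(directionA.x) > abs(directionB.x) ? (a, b) : (b, a)
            }
        }
    }
    return nil   // no axes in this frame, so ignore it and wait for the next observation
}
```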

Once I have found the axes of the graph, I will need to find the curve (the graph itself) that I want to scan. Any tips on how I can find that curve and turn it into a set of coordinates?
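
To show what I mean by "a set of coordinates", here is a sketch of the mapping I imagine. It assumes the graph is roughly axis-aligned in the frame, that origin / xAxisEnd / yAxisEnd come from the detected axes, and that I already know each axis's value range (Vision will not read the scale labels for me; I expect I would need VNRecognizeTextRequest for that):

```swift
import Vision
import CoreGraphics
import simd

// Maps the points of the curve contour from Vision's normalized image space
// (origin at the lower left, which conveniently matches a graph) into the
// value ranges of the two axes.
func samples(from curve: VNContour,
             origin: simd_float2,          // where the two axes meet
             xAxisEnd: simd_float2,        // far end of the x axis
             yAxisEnd: simd_float2,        // far end of the y axis
             xRange: ClosedRange<Double>,  // e.g. 0...10, read from the scale labels
             yRange: ClosedRange<Double>) -> [CGPoint] {
    let xLength = xAxisEnd.x - origin.x
    let yLength = yAxisEnd.y - origin.y
    guard xLength != 0, yLength != 0 else { return [] }

    return curve.normalizedPoints.map { point in
        // Fraction of the way along each axis (0...1), then scaled to the axis values.
        let fx = Double((point.x - origin.x) / xLength)
        let fy = Double((point.y - origin.y) / yLength)
        let x = xRange.lowerBound + fx * (xRange.upperBound - xRange.lowerBound)
        let y = yRange.lowerBound + fy * (yRange.upperBound - yRange.lowerBound)
        return CGPoint(x: x, y: y)
    }
}
```

I could presumably also thin the result first with VNContour's polygonApproximation(epsilon:) so I do not end up with thousands of nearly identical points.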

Thanks! Wouter

Answered by DTS Engineer in 808203022

Please see the following WWDC video starting at 11:36 where the speaker talks about the Contour Detection APIs.

WWDC20 - Explore Computer Vision APIs

There, they talk about setting up VNDetectContoursRequest objects and working with the resulting VNContoursObservation objects.
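
For reference, a minimal sketch of that kind of setup (not taken verbatim from the session; the property values are only starting points you would tune for a printed graph):

```swift
import Vision
import CoreGraphics

// Runs contour detection on a still image and returns the single
// VNContoursObservation that the request produces.
func detectContours(in image: CGImage) throws -> VNContoursObservation? {
    let request = VNDetectContoursRequest()
    request.detectsDarkOnLight = true      // dark curve on a light background
    request.contrastAdjustment = 2.0       // boost contrast before edge detection
    request.maximumImageDimension = 512    // downscale the input for speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // The observation's normalizedPath can be drawn directly with Core Graphics
    // or a CAShapeLayer to visualize what was detected.
    return request.results?.first as? VNContoursObservation
}
```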
