Black frames in resulting AVComposition.

I have multiple AVAssets that I am trying to merge together into a single video track using AVComposition.

What I'm doing is iterating over my AVAssets and inserting them into a single AVCompositionTrack like so:

- (AVAsset *)combineAssets
{
  // Create a mutable composition
  AVMutableComposition *composition = [AVMutableComposition composition];
  AVMutableCompositionTrack *compositionVideoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  AVMutableCompositionTrack *compositionAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
  
  // Keep track of time offset
  CMTime currentOffset = kCMTimeZero;

  for (AVAsset *audioAsset in _audioSegments) {
    // Get the first audio track from the asset
    AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];

    // Add the audio track to the composition audio track
    NSError *audioError = nil;
    BOOL success = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                                  ofTrack:audioTrack
                                                   atTime:currentOffset
                                                    error:&audioError];
    // Check the return value rather than the error pointer, which is only
    // guaranteed to be set when the call fails.
    if (!success) {
      NSLog(@"Error combining audio track: %@", audioError.localizedDescription);
      return nil;
    }

    currentOffset = CMTimeAdd(currentOffset, audioAsset.duration);
  }

  // Reset offset to do the same with videos.
  currentOffset = kCMTimeZero;

  for (AVAsset *videoAsset in _videoSegments) {
    // Get the first video track from the asset
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    // Add the video track to the composition video track
    NSError *videoError = nil;
    BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration)
                                                  ofTrack:videoTrack
                                                   atTime:currentOffset
                                                    error:&videoError];
    if (!success) {
      NSLog(@"Error combining video track: %@", videoError.localizedDescription);
      return nil;
    }

    // Increment current offset by the video track's duration
    currentOffset = CMTimeAdd(currentOffset, videoTrack.timeRange.duration);
  }

  return composition;
}

The issue is that when I export the composition using an AVAssetExportSession, I notice that there's a black frame between the merged segments in the track.

In other words, if two 30-second AVAssets are merged into the composition track to create a 60-second video, you see a black frame for a split second at the 30-second mark where the two assets join.

I don't really want to re-encode the assets; I just want to stitch them together. How can I fix the black frame issue?

It's possible that the duration of the video track's timeRange is greater than the duration of the actual video frames in the movie. (One reason this can happen is that the audio in the asset is slightly longer than the video.)

You may have to do some exploration of the start and end times of the audio and video frames in each videoAsset, and come up with a different strategy for choosing a time range to insert for each video segment. (Also, if the audio composition is coming from the same assets as the video, you may have to do more work to ensure the audio and video timing matches.)
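As a starting point, a sketch of that exploration might look like the following. This is an untested illustration, not a drop-in fix: the helper name `SegmentInsertRange` is made up, and the strategy shown (clamping each insertion to the shorter of the audio and video track durations, starting at the video track's own start time) is one of several you could choose.

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: compute a time range for one segment that covers only
// the frames the video track actually contains, so no empty tail (rendered as
// black frames) is inserted into the composition.
static CMTimeRange SegmentInsertRange(AVAsset *asset)
{
  AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
  AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];

  CMTime videoDuration = videoTrack.timeRange.duration;
  CMTime audioDuration = audioTrack ? audioTrack.timeRange.duration : videoDuration;

  // Log when the asset's nominal duration exceeds the video frames; this gap
  // is what shows up as black frames when the next segment is appended.
  if (CMTIME_COMPARE_INLINE(asset.duration, >, videoDuration)) {
    NSLog(@"Asset duration %.3fs exceeds video track duration %.3fs",
          CMTimeGetSeconds(asset.duration), CMTimeGetSeconds(videoDuration));
  }

  // Use the shorter of the two track durations for both the audio and video
  // insertions so the composition tracks stay aligned.
  CMTime duration = CMTimeMinimum(videoDuration, audioDuration);
  return CMTimeRangeMake(videoTrack.timeRange.start, duration);
}
```

You would then pass the same range to both `insertTimeRange:ofTrack:atTime:error:` calls for that segment, and advance `currentOffset` by `duration` rather than by `asset.duration`.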
