How do I add a text watermark to a video that changes every second in Swift 3?

I created a watermark that displays the current time, but it does not update every second. How would I make the text change each second? I want to stamp the running time onto videos, the way security-camera footage does. Here is code that works well for adding static text to a video:
func waterMark() {
    let filePath: String = Bundle.main.path(forResource: "Zombie", ofType: "mp4")!
    let videoAsset = AVURLAsset(url: URL(fileURLWithPath: filePath), options: nil)
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack: AVMutableCompositionTrack? = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let clipVideoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
    compositionVideoTrack?.preferredTransform = clipVideoTrack.preferredTransform
    let videoSize: CGSize = clipVideoTrack.naturalSize

    // Image overlay layer. Assign a CGImage here; the original
    // "(Any).self" was a placeholder and displays nothing.
    let aLayer = CALayer()
    // aLayer.contents = watermarkImage.cgImage
    aLayer.frame = CGRect(x: videoSize.width - 65, y: videoSize.height - 75, width: 57, height: 57)
    aLayer.opacity = 0.65

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(aLayer)

    // Text overlay: this string is rendered once, at export-setup time,
    // which is why the displayed time never updates.
    let titleLayer = CATextLayer()
    let dateFormatter = DateFormatter()
    dateFormatter.timeStyle = .medium
    // titleLayer.backgroundColor = UIColor.black.cgColor
    titleLayer.string = dateFormatter.string(from: Date())
    titleLayer.font = UIFont.systemFont(ofSize: 100)
    titleLayer.fontSize = 100 // CATextLayer ignores the UIFont's point size
    titleLayer.shadowOpacity = 0.5
    titleLayer.frame = parentLayer.frame
    titleLayer.display()
    // You may need to adjust this for proper display
    parentLayer.addSublayer(titleLayer) // CATextLayer is already a CALayer; no cast needed

    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, 30) // 30 fps
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videoTrack = mixComposition.tracks(withMediaType: AVMediaTypeVideo)[0]
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]

    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    // Note: AVAssetExportPresetPassthrough would ignore the video composition
    assetExport?.videoComposition = videoComp

    let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
    let documentsDirectory = paths[0]
    let videoName = "\(documentsDirectory)/mynewwatermarkedvideo.mp4"
    let exportUrl = URL(fileURLWithPath: videoName)
    if FileManager.default.fileExists(atPath: videoName) {
        // Remove any previous export; AVAssetExportSession will not overwrite.
        try? FileManager.default.removeItem(atPath: videoName)
    }
    assetExport?.outputFileType = AVFileTypeQuickTimeMovie
    assetExport?.outputURL = exportUrl
    assetExport?.shouldOptimizeForNetworkUse = true
    assetExport?.exportAsynchronously(completionHandler: {() -> Void in
        DispatchQueue.main.async(execute: {() -> Void in
            // exportAsynchronously returns immediately, so report completion
            // here rather than after the call.
            print("Export finished with status: \(assetExport?.status.rawValue ?? -1)")
        })
    })
}
// call
waterMark()
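During an export there is no run loop driving the layer tree, so nothing can mutate `titleLayer.string` while the file renders. A first step is to precompute one timestamp string per whole second of the video before building the composition. The helper below is a sketch with a hypothetical name (`timestampStrings`); it pins the format, locale, and time zone so the output is deterministic, which a real app may not want:

```swift
import Foundation

// Hypothetical helper: one formatted timestamp per whole second of video,
// starting from `start`. Each entry is what the watermark should read
// during that second.
func timestampStrings(start: Date, durationSeconds: Int) -> [String] {
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss" // fixed format, no locale surprises
    formatter.locale = Locale(identifier: "en_US_POSIX")
    formatter.timeZone = TimeZone(identifier: "UTC")
    return (0..<durationSeconds).map { second in
        formatter.string(from: start.addingTimeInterval(TimeInterval(second)))
    }
}

// Example: a 3-second clip starting at the Unix epoch.
let labels = timestampStrings(start: Date(timeIntervalSince1970: 0), durationSeconds: 3)
print(labels) // ["00:00:00", "00:00:01", "00:00:02"]
```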
Hi MwcsMac, thanks for your answer. That approach would work if I were doing the processing in real time. Unfortunately, as far as I know, NSTimers don't fire during offline rendering. –
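One common workaround (a sketch under assumptions, not code from this thread) is to avoid timers entirely: create one `CATextLayer` per second of video, and give each an animation on `"opacity"` (or `"hidden"`) whose `beginTime` is `AVCoreAnimationBeginTimeAtZero` plus that layer's offset into the video. `AVVideoCompositionCoreAnimationTool` evaluates those animations against the video timeline during offline rendering, so each layer is visible for exactly its one-second window. The scheduling part can be sketched as pure data (names here are hypothetical):

```swift
import Foundation

// Hypothetical schedule entry: when a given watermark string should be
// visible, in seconds from the start of the video.
struct WatermarkEntry {
    let beginTime: TimeInterval
    let duration: TimeInterval
    let text: String
}

// One entry per whole second of video. In the composition, each entry
// would become its own CATextLayer, initially hidden, with an animation
// whose beginTime is AVCoreAnimationBeginTimeAtZero + entry.beginTime
// and whose duration is entry.duration, added to parentLayer alongside
// videoLayer before the export starts.
func watermarkSchedule(start: Date, durationSeconds: Int) -> [WatermarkEntry] {
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss"
    formatter.locale = Locale(identifier: "en_US_POSIX")
    formatter.timeZone = TimeZone(identifier: "UTC")
    return (0..<durationSeconds).map { s in
        WatermarkEntry(beginTime: TimeInterval(s),
                       duration: 1.0,
                       text: formatter.string(from: start.addingTimeInterval(TimeInterval(s))))
    }
}

let schedule = watermarkSchedule(start: Date(timeIntervalSince1970: 0), durationSeconds: 2)
for entry in schedule {
    print("\(entry.beginTime)s for \(entry.duration)s: \(entry.text)")
}
```

The key point is that the per-second "updates" are baked into the layer tree as animations before export, rather than driven by an NSTimer at render time.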