
How to use AVCapturePhotoOutput

I have been working on using a custom camera, and I recently upgraded to the Xcode 8 beta along with Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput? 

However, I now get the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

Since this is fairly new, I have not seen much information on it. Here is my current code:

var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

func clickPicture() {
    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {
        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
            }
        })
    }
}

I tried looking at AVCapturePhotoCaptureDelegate, but I am not quite sure how to use it. Does anyone know how? Thanks.


You should watch WWDC 2016 Session 511. –


Okay! I'll watch the video and post an answer if I can. Thanks! – penatheboss


Looking at [the docs](http://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput) might also help. – rickster

Answers


Updated to Swift 4. Hi, it's really easy to use AVCapturePhotoOutput. You need the AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you tell AVCapturePhotoSettings the previewFormat:

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any) // Your image
        }
    }
}

For more information, see https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
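
For reference, a minimal sketch of that ordering (a hedged example, not the answer author's code; it assumes a camera AVCaptureDeviceInput has also been added, as in the full implementation further down, and that self conforms to AVCapturePhotoCaptureDelegate):

let session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetPhoto

let output = AVCapturePhotoOutput()
if session.canAddOutput(output) {
    session.addOutput(output)          // 1. attach the output to the session
}
session.startRunning()                 // 2. start the session

// 3. only then trigger the capture
output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)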


Gives the error: "[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection". Could you please provide a full example for iOS 10 / Swift 3? –


@TuomasLaatikainen you probably need to set the capture session preset to AVCaptureSessionPresetPhoto –


I've watched the video, searched the whole web, rewritten the code, switched iPhones, and still can't get past the "No active and enabled video connection" exception. The Apple documentation is classically vague and short on detail. Help! Is there a working project to share?? – mobibob
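
Regarding the "No active and enabled video connection" error discussed in the comments above: the photo output only gets a video connection once it has been added to a session that also has a video input, so one way to guard the capture call is to check the connection first. A sketch, assuming a captureSession and cameraOutput configured as in the answers here:

captureSession.sessionPreset = AVCaptureSessionPresetPhoto   // preset suggested above

// Capture only if the output actually has an enabled video connection.
if let connection = cameraOutput.connection(withMediaType: AVMediaTypeVideo), connection.isEnabled {
    cameraOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
} else {
    print("No active and enabled video connection yet")
}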


Here is my full implementation:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    @IBOutlet weak var capturedImage: UIImageView!
    @IBOutlet weak var previewView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        cameraOutput = AVCapturePhotoOutput()

        let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        if let input = try? AVCaptureDeviceInput(device: device) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                if captureSession.canAddOutput(cameraOutput) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = previewView.bounds
                    previewView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here: captureSession.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    // Take picture button
    @IBAction func didPressTakePhoto(_ sender: UIButton) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // Callback from the take-picture call
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print("error occurred: \(error.localizedDescription)")
        }

        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

            self.capturedImage.image = image
        } else {
            print("some error here")
        }
    }

    // You can call this method wherever you need to know the camera permission state
    func askPermission() {
        print("here")
        let cameraPermissionStatus = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)

        switch cameraPermissionStatus {
        case .authorized:
            print("Already Authorized")

        case .denied:
            print("denied")

            let alert = UIAlertController(title: "Sorry :(", message: "Could you please grant camera permission in the device settings?", preferredStyle: .alert)
            let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
            alert.addAction(action)
            present(alert, animated: true, completion: nil)

        case .restricted:
            print("restricted")

        default:
            AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: { [weak self] (granted: Bool) -> Void in
                if granted {
                    // User granted access
                    print("User granted")
                    DispatchQueue.main.async {
                        // Do whatever needs the main thread here
                    }
                } else {
                    // User rejected access
                    print("User Rejected")
                    DispatchQueue.main.async {
                        let alert = UIAlertController(title: "WHY?", message: "The camera is the main feature of our application", preferredStyle: .alert)
                        let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                        alert.addAction(action)
                        self?.present(alert, animated: true, completion: nil)
                    }
                }
            })
        }
    }
}

Works perfectly for me (tested on an iPhone 7 Plus). – aBikis


How did you set flashMode? – coolly


Works on iOS 10.0.2. To turn the flash on: 'settings.flashMode = .on' –
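
To show how that flash setting fits into the capture call from the answers above (a sketch assuming the cameraOutput used there and a device that actually has a flash):

let settings = AVCapturePhotoSettings()
settings.flashMode = .on               // as in the comment above
cameraOutput.capturePhoto(with: settings, delegate: self)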


The capture delegate function has been renamed to photoOutput. Here is the updated function for Swift 4:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        print("image: \(String(describing: UIImage(data: dataImage)?.size))") // Your image
    }
}