iOS - Using VTCompressionSession as in WWDC 2014
The documentation on this library is practically non-existent, so I need some help here.
Goal: I need H.264 encoding (preferably both audio and video, but video alone is fine and I'll play around for a few days to get audio working too) so that I can pass it into an MPEG transport stream.
What I have: a camera that records and outputs sample buffers. The inputs are the camera and the built-in mic.
A few questions:

a. Is it possible for the camera to output CMSampleBuffers already in H.264 format? I mean, the WWDC 2014 session shows them being produced by a VTCompressionSession, yet when writing captureOutput I just see a CMSampleBuffer...

b. How do I set up a VTCompressionSession? How is the session used? Some overarching top-level discussion about this might help people understand what's actually going on in this barely documented library. I've put my best guess at the setup in the sketch below.
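To show where I'm at with question b, here is what I've pieced together from the WWDC session for creating and configuring the session. Treat it as a rough, untested sketch: the 1280x720 dimensions and the property choices are just my guesses, and the argument labels may differ depending on the Swift/SDK version.

import VideoToolbox
import CoreMedia

// Called by VideoToolbox every time it finishes encoding a frame.
// The sample buffer handed back here should contain H.264 data
// (a CMBlockBuffer) instead of raw pixels.
let outputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    // Presumably this is where I would pull out the SPS/PPS and NAL units
    // and push them into the MPEG transport stream.
}

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280,                      // guessing at my capture dimensions
    height: 720,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: outputCallback,
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Real-time mode seems right since the frames come from a live camera.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Baseline_AutoLevel)
    VTCompressionSessionPrepareToEncodeFrames(session)
}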
My actual code is below (please ask for more if you need it; I'm only including captureOutput because I don't know how relevant the rest of the code is):
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    println(CMSampleBufferGetFormatDescription(sampleBuffer))
    var imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    if imageBuffer != nil {
        var pixelBuffer = imageBuffer as CVPixelBufferRef
        var timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer as CMSampleBufferRef)
        // do VTCompressionSession stuff
    }
}
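And this is what I imagine the "do VTCompressionSession stuff" placeholder should become: hand each pixel buffer to the session and let the encoded output come back asynchronously through the callback given at session creation. Again only a sketch I haven't been able to verify, where `session` is assumed to be the compression session created earlier and the argument labels follow the current Swift spelling of the API.

if let session = session,
   let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let duration = CMSampleBufferGetDuration(sampleBuffer)
    // The encoded H.264 frame is not returned by this call; it is delivered
    // later to the output callback passed to VTCompressionSessionCreate.
    let encodeStatus = VTCompressionSessionEncodeFrame(
        session,
        imageBuffer: pixelBuffer,
        presentationTimeStamp: timestamp,
        duration: duration,
        frameProperties: nil,
        sourceFrameRefcon: nil,
        infoFlagsOut: nil)
    if encodeStatus != noErr {
        println("encode failed: \(encodeStatus)")
    }
}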
Thanks all!