AVAssetExportSession: add metadata after recording ends

The use case is like this: a video is recorded and saved at a temporary location using AVCaptureFileOutput. After the recording is completed, some metadata is to be added to the video, which is then saved under a new filename at a new location.
The recording part is working, with the file getting stored at the temporary location. Now I have to rename it, add the metadata, and save it to a different location.
1) Can I edit the metadata within the:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error;
delegate method?
2) My second approach was to use AVAssetExportSession to do this.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.metadata = NEW ARRAY OF METADATA;
NSString* outputPath = [[PLFileManager sharedFileManager] pathForAsset:_newAsset];
NSURL* url = [NSURL URLWithString:outputPath];
exportSession.outputURL = url;
[[NSFileManager defaultManager] removeItemAtURL:url error:nil];
[exportSession exportAsynchronouslyWithCompletionHandler:^(void){
NSLog(#"Exported to [%#] %#", exportSession.outputURL, exportSession.error);
}];
However, with this approach I am getting the following error:
Exported to [///var/mobile/Applications/7F9BC121-6F58-436E-8DBE-33D8BC1A4D79/Documents/Temp/final.mov] Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x1555f440 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x1555c7a0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780)}
Can someone tell me what I am doing wrong here? Or is there a better way to do this?

OK, I figured out what I was doing wrong. Instead of:
NSURL* url = [NSURL URLWithString:outputPath];
I had to use:
NSURL* outputUrl = [NSURL fileURLWithPath:outputPath];
Explanation: AVAssetExportSession's outputURL must be a file URL. +URLWithString: builds a URL straight from the string, so a plain filesystem path ends up without a file:// scheme and the exporter fails with the opaque -11800 / -12780 error. +fileURLWithPath: produces the file URL the export session can actually write to.
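For reference, here is a minimal sketch of the whole export with that fix applied. It reuses the PLFileManager helper and _newAsset from the question; newMetadataArray is a placeholder for an array of AVMutableMetadataItem objects built elsewhere, and the passthrough preset is an assumption that only the metadata changes (swap in a quality preset if you also want re-encoding).

AVAsset *asset = [AVAsset assetWithURL:outputFileURL]; // URL handed to the capture delegate
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.metadata = newMetadataArray; // placeholder: your new metadata items

NSString *outputPath = [[PLFileManager sharedFileManager] pathForAsset:_newAsset];
NSURL *outputUrl = [NSURL fileURLWithPath:outputPath]; // file URL, not URLWithString:
[[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];
exportSession.outputURL = outputUrl;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Exported with metadata to %@", exportSession.outputURL);
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];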

Related

Firebase FQuery: how do you detect when you are at the end of a list of nodes?

How do I detect when I have finished processing all found nodes when doing a query? In the following example, I do some processing on each encountered node. When I reach the "end" of the list I would like to be able to detect this so I know it's finished.
FQuery* messageListQuery = [m_firebaseRef queryLimitedToNumberOfChildren:100];
[messageListQuery observeEventType:FEventTypeChildAdded andPreviousSiblingNameWithBlock:^(FDataSnapshot *snapshot, NSString *prevNodeName) {
// 1. Do interesting stuff with the snapshot data
// 2. I want to detect when I'm at the end of the list so I know when I'm done processing the list.
}];
Here is the example use case: I would like to load the latest 100 messages in the background. Once the messages have been loaded, I would like to update the UI. However, I'm not sure how I know that all the messages have been loaded, given there might be fewer than 100 messages in the list.
I figured out how to read all the messages up front by using observeSingleEventOfType and then iterating over the children:
[m_firebaseRef observeSingleEventOfType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
NSLog( #"Name %# with %d children.", snapshot.name, snapshot.childrenCount );
for( FDataSnapshot *child in snapshot.children )
{
NSDictionary *msgData = child.value;
NSString *message = msgData[kFirebaseLiveChatFieldMessage];
NSString *gamerTag = msgData[kFirebaseLiveChatFieldGamerTag];
NSString *gameCenterId = msgData[kFirebaseLiveChatFieldGameCenterId];
NSLog( #"Preload = %# (%#): %#", gamerTag, gameCenterId, message );
}
}];
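Another way to know when the initial batch is done, without reading everything twice: Firebase documents that the value event for a location fires only after all of the initial child-added events for that location have been delivered, so a one-shot value observer on the same query can act as the "end of list" signal. A rough, untested sketch against the same old FQuery API:

FQuery *messageListQuery = [m_firebaseRef queryLimitedToNumberOfChildren:100];

[messageListQuery observeEventType:FEventTypeChildAdded withBlock:^(FDataSnapshot *snapshot) {
    // Process each message here; this also keeps firing for messages added later.
}];

// The value event arrives after the initial child-added events, so this fires
// once the latest 100 (or fewer) messages have all been delivered.
[messageListQuery observeSingleEventOfType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
    // Update the UI here: the initial batch is loaded.
}];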

How can I change (modify) video frame rate and bit rate?

I need to re-encode a video file from the photo library for a web site service.
I tried the code below, but it fails with an error like 'video composition must have composition instructions'.
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
self.exportSession = [[AVAssetExportSession alloc]
initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, anAsset.duration) ofTrack:[[anAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction,nil];
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.frameDuration = CMTimeMake(1, 30); // frame rate: 30 fps
MainCompositionInst.renderSize = CGSizeMake(640, 480); // output resolution
[self.exportSession setVideoComposition:MainCompositionInst];
NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
self.exportSession.outputURL = furl;
self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
CMTime duration = CMTimeMakeWithSeconds(self.stopTime-self.startTime, anAsset.duration.timescale);
CMTimeRange range = CMTimeRangeMake(start, duration);
self.exportSession.timeRange = range;
self.trimBtn.hidden = YES;
self.myActivityIndicator.hidden = NO;
[self.myActivityIndicator startAnimating];
[self.exportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([self.exportSession status]) {
case AVAssetExportSessionStatusFailed:
NSLog(#"Export failed: %#", [[self.exportSession error] localizedDescription]);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export canceled");
break;
default:
NSLog(#"NONE");
dispatch_async(dispatch_get_main_queue(), ^{
[self.myActivityIndicator stopAnimating];
self.myActivityIndicator.hidden = YES;
self.trimBtn.hidden = NO;
[self playMovie:self.tmpVideoPath];
});
break;
}
}];
}
}
Without changing the frame rate and bit rate, it works perfectly.
Please give me any advice.
Thanks.
Frame rate I'm still looking for, but bit rate has been solved by this drop-in replacement for AVAssetExportSession: https://github.com/rs/SDAVAssetExportSession
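Usage looks roughly like this; the property and key names below are taken from the SDAVAssetExportSession README as I remember it, so treat them as assumptions and check the header of the version you pull in. The 800 kbps video and 128 kbps audio bit rates are arbitrary example values.

SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:anAsset];
encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = [NSURL fileURLWithPath:self.tmpVideoPath];
encoder.videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @640,
    AVVideoHeightKey: @480,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @800000, // target video bit rate, bits per second
    },
};
encoder.audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000,
};
[encoder exportAsynchronouslyWithCompletionHandler:^{
    if (encoder.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Re-encode finished: %@", encoder.outputURL);
    } else {
        NSLog(@"Re-encode failed: %@", encoder.error);
    }
}];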
Not sure, but here are some things to look at. Unfortunately the error messages don't tell you much in these cases in AVFoundation.
In general, I would keep things simple and slowly add functionality. To start, make sure all layers start at zero and end at the final duration. Do the same for the main composition; invalid times may give you an error like this. For your instruction, make sure that it starts and ends at the same times too.
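Concretely, the error message in the question ("video composition must have composition instructions") points at two things the posted code never does: the instruction is never given a time range, and it is never attached to the video composition. A sketch of the missing wiring, under the assumption that the export should cover the whole composition:

// Give the instruction a time range spanning the whole composition and attach it.
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);
MainInstruction.layerInstructions = [NSArray arrayWithObject:FirstlayerInstruction];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30); // 30 fps
MainCompositionInst.renderSize = CGSizeMake(640, 480);

// Build the session from the composition (not the original asset) so the layer
// instruction's track reference matches, and use a re-encoding preset:
// AVAssetExportPresetPassthrough cannot apply a video composition.
self.exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
self.exportSession.videoComposition = MainCompositionInst;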

YouTube API upload: "Cannot close stream until all bytes are written"

I am using the YouTube API to direct-upload videos like this:
YouTubeRequestSettings setting = new YouTubeRequestSettings("devpa", key, "user", "pass");
YouTubeRequest req = new YouTubeRequest(setting);
Video ytv = new Video();
ytv.Title = "test video1";
ytv.Tags.Add(new MediaCategory("Autos", YouTubeNameTable.CategorySchema));
ytv.Keywords = "test, dev";
ytv.Description = "this is a test video";
ytv.YouTubeEntry.Private = true;
ytv.YouTubeEntry.MediaSource = new MediaFileSource(Server.MapPath("PATH"), "video/mp4");
Video createdVideo = req.Upload(ytv);
But every time I get this error:
Cannot close stream until all bytes are written
Even though I am uploading small videos with different extensions (flv, mp4, etc.), what is the problem?
Thanks
You need to set a timeout, e.g. ytv.TimeOut = 100000000;
Actually, set it on the settings instance:
YouTubeRequestSettings.Timeout = [A LARGER NUMBER]
That's the setting you'll want to use.

Core Data: fetching items with a predicate is slow

For my iPhone application I set up a data model for Core Data. It contains one entity Words and its attributes are language : String, length : Integer16 and word : String.
I prefilled my model's SQLite database with a word list (200k items) by writing a separate iPhone application that uses the identical data model and copying the filled database into the main application.
Now, using an NSFetchRequest, I can query for managed objects as I like, but the results come back slowly. I use the following method:
- (NSString *)getRandomWordLengthMin:(int)minLength max:(int)maxLength
{
NSString *word = #"";
MyAppDelegate *appDelegate = [[UIApplication sharedApplication] delegate];
NSManagedObjectContext *context = [appDelegate managedObjectContext];
NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
NSEntityDescription *entity = [NSEntityDescription entityForName:#"Words"
inManagedObjectContext:context];
[fetchRequest setEntity:entity];
NSString *predicateString = #"length >= %d AND length <= %d";
NSPredicate *predicate = [NSPredicate predicateWithFormat:predicateString,
minLength, maxLength];
[fetchRequest setPredicate:predicate];
NSError *error = nil;
int entityCount = [context countForFetchRequest:fetchRequest error:&error];
[fetchRequest setFetchLimit:1];
if(entityCount != 0)
{
[fetchRequest setFetchOffset:arc4random()%entityCount];
}
NSArray *fetchedObjects = [context executeFetchRequest:fetchRequest error:&error];
if([fetchedObjects count] != 0)
{
Words * test = [fetchedObjects objectAtIndex:0];
word = [NSString stringWithFormat:@"%@", [test word]];
}
return word;
}
Using an SQLite editor I already set an index manually on column zLength, but this didn't bring any speedup. Where is the bottleneck?
EDIT:
I figured out that getting int entityCount = ... is slow. But even getting all objects and then selecting one random word is slow:
Words * test = [fetchedObjects objectAtIndex:arc4random()%[fetchedObjects count]];
You are effectively running two fetches here, one to get the fetch count and then one to fetch the actual object. That will slow things down.
Your predicate is "backwards." Compound predicates evaluate the first expression e.g. length >= %d and then evaluate the second e.g. length <= %d only against the results of the first. Therefore you should put the test that eliminates the most objects first. In this case, length <= %d probably eliminates more objects so it should come first in the predicate.
Since you don't actually need the entire Words managed object but just the word string, you can set the fetch result type to NSDictionaryResultType and set the properties to fetch to just the word attribute (see the sketch below). That will speed things up considerably.
Part of your problem here is that Core Data is designed to manage a structured object graph, and you are using a random/unstructured access pattern, so you are cutting against the grain of Core Data's optimizations.
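Here is a rough sketch of that dictionary-result fetch, reusing the entity, context, predicate, and entityCount from the question's method (untested, but the NSFetchRequest calls are standard):

NSFetchRequest *wordRequest = [[NSFetchRequest alloc] init];
[wordRequest setEntity:entity];
[wordRequest setPredicate:predicate];
[wordRequest setResultType:NSDictionaryResultType]; // return dictionaries, not managed objects
[wordRequest setPropertiesToFetch:[NSArray arrayWithObject:[[entity propertiesByName] objectForKey:@"word"]]]; // only pull the word column
[wordRequest setFetchLimit:1];
if (entityCount > 0) {
    [wordRequest setFetchOffset:arc4random() % entityCount];
}
NSError *fetchError = nil;
NSArray *rows = [context executeFetchRequest:wordRequest error:&fetchError];
NSString *word = ([rows count] > 0) ? [[rows objectAtIndex:0] objectForKey:@"word"] : @"";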
Do not use an SQLite editor to edit the SQLite backing store of a Core Data store. The internals of the database are private and subject to change.
Instead, go to the model editor in Xcode and simply put a checkmark on the "Indexed" option for the entity attribute you want indexed.
Not sure, but maybe this predicate is easier to optimize:
NSString *predicateString = @"length BETWEEN {%d, %d}";

Core Data: NSPredicate formatted with a time range (NSTimeInterval)

I'm trying to find out how to go through my Core Data store and find objects whose createdAt (an NSDate on my object) falls within an NSTimeInterval. How do I set this up?
I've looked on the documentation at:
http://developer.apple.com/documentation/Cocoa/Conceptual/Predicates/predicates.html
But I'm not finding anything there.
Do I need to create two time stamps and use SQL's BETWEEN?
Any help would be wonderful.
First of all, it doesn't make sense to check whether an NSDate is within an NSTimeInterval, because an NSTimeInterval just specifies a length of time, not where it falls in time. Instead, you want two separate NSDates specifying the beginning and end of your interval.
Here's what it would look like (beginningTime and endTime are NSDates).
NSFetchRequest *request = [[NSFetchRequest alloc] init];
request.entity = [NSEntityDescription entityForName:@"YourEntityName" inManagedObjectContext:yourContext];
NSPredicate *beginningPredicate = [NSPredicate predicateWithFormat:@"createdAt >= %@", beginningTime];
NSPredicate *endPredicate = [NSPredicate predicateWithFormat:@"createdAt <= %@", endTime];
request.predicate = [NSCompoundPredicate andPredicateWithSubpredicates:[NSArray arrayWithObjects:beginningPredicate, endPredicate, nil]];
NSArray *results = [yourContext executeFetchRequest:request error:NULL];
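Since the question starts from an NSTimeInterval rather than two dates, the endpoints can be derived from a reference date first. A small sketch (someStartDate and the one-hour window are placeholders):

NSDate *beginningTime = someStartDate;      // placeholder: whatever start date you already have
NSTimeInterval windowLength = 60.0 * 60.0;  // placeholder: e.g. a one-hour window
NSDate *endTime = [beginningTime dateByAddingTimeInterval:windowLength];

request.predicate = [NSPredicate predicateWithFormat:@"createdAt >= %@ AND createdAt <= %@", beginningTime, endTime];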
