How can I change (modify) video frame rate and bit rate? - AVAssetExportSession

I need to re-encode a video file from the photo library for a web site service.
I tried the code below, but it fails with an error like 'video composition must have composition instructions'.
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
self.exportSession = [[AVAssetExportSession alloc]
initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, anAsset.duration) ofTrack:[[anAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction,nil];
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.frameDuration = CMTimeMake(1, 30); // frame rate (30 fps)
MainCompositionInst.renderSize = CGSizeMake(640, 480); // output render size
[self.exportSession setVideoComposition:MainCompositionInst];
NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
self.exportSession.outputURL = furl;
self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
CMTime duration = CMTimeMakeWithSeconds(self.stopTime-self.startTime, anAsset.duration.timescale);
CMTimeRange range = CMTimeRangeMake(start, duration);
self.exportSession.timeRange = range;
self.trimBtn.hidden = YES;
self.myActivityIndicator.hidden = NO;
[self.myActivityIndicator startAnimating];
[self.exportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([self.exportSession status]) {
case AVAssetExportSessionStatusFailed:
NSLog(#"Export failed: %#", [[self.exportSession error] localizedDescription]);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export canceled");
break;
default:
NSLog(#"NONE");
dispatch_async(dispatch_get_main_queue(), ^{
[self.myActivityIndicator stopAnimating];
self.myActivityIndicator.hidden = YES;
self.trimBtn.hidden = NO;
[self playMovie:self.tmpVideoPath];
});
break;
}
}];
}
}
Without changing the frame rate and bit rate, it works perfectly.
Please give me any advice.
Thanks.

Frame rate I'm still looking for, but bit rate has been solved by this drop-in replacement for AVAssetExportSession: https://github.com/rs/SDAVAssetExportSession
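For reference, here is a minimal sketch of how SDAVAssetExportSession is typically configured to control the bit rate (property names taken from the project's README; treat the exact keys and values as assumptions and check the current version). anAsset and furl are the variables from the question; the frame rate is still not addressed by this:
SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:anAsset];
encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = furl;
encoder.videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @640,
    AVVideoHeightKey: @480,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @1000000, // target average video bit rate, in bits per second
    },
};
encoder.audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000, // audio bit rate, in bits per second
};
[encoder exportAsynchronouslyWithCompletionHandler:^{
    if (encoder.status == AVAssetExportSessionStatusCompleted) {
        // the re-encoded file is at encoder.outputURL
    }
}];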

Not sure, but here are some things to look at. Unfortunately the error messages don't tell you much in these cases in AVFoundation.
In general, I would keep things simple and slowly add functionality. To start, make sure all layers start at zero and end at the final duration. Do the same for the main composition. Invalid times may give you an error like this. For your instruction, make sure that it starts and ends at the same times too.
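Concretely, the error in the question ("video composition must have composition instructions") usually means the video composition's instructions array was never populated and the instruction has no time range. A sketch of the missing wiring, reusing the names from the question (untested):
// Give the instruction a time range covering the whole asset ...
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, anAsset.duration);
MainInstruction.layerInstructions = [NSArray arrayWithObject:FirstlayerInstruction];

// ... and attach it to the video composition before handing it to the export session.
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30); // 30 fps
MainCompositionInst.renderSize = CGSizeMake(640, 480);

// Export the composition that the layer instruction's track belongs to, and note that
// AVAssetExportPresetPassthrough does not apply a video composition at all.
self.exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                      presetName:AVAssetExportPresetMediumQuality];
self.exportSession.videoComposition = MainCompositionInst;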

Related

How to get simple face detection working?

I am trying to get a simple example of face detection working with ML Kit on iOS. Here are excerpts of the Objective-C code:
FIRVisionFaceDetectorOptions *faceDetectorOptions;
FIRVision *vision;
FIRVisionFaceDetector *faceDetector;
faceDetectorOptions = [[FIRVisionFaceDetectorOptions alloc] init];
faceDetectorOptions.performanceMode = FIRVisionFaceDetectorPerformanceModeAccurate;
faceDetectorOptions.landmarkMode = FIRVisionFaceDetectorLandmarkModeAll;
faceDetectorOptions.contourMode = FIRVisionFaceDetectorContourModeNone;
faceDetectorOptions.classificationMode = FIRVisionFaceDetectorClassificationModeAll;
faceDetectorOptions.minFaceSize = 0.1; // TODO: finalize this option value
vision = [FIRVision vision];
faceDetector = [vision faceDetectorWithOptions:faceDetectorOptions];
UIImage *staticImg = [UIImage imageNamed:@"sample.jpg"];
FIRVisionImage *visionImage = [[FIRVisionImage alloc] initWithImage:staticImg];
NSError *error = nil;
NSArray<FIRVisionFace *> *faces = [faceDetector resultsInImage:visionImage error:&error];
NSLog(@"Synchronous result. error = %@, face count = %lu", error, faces.count);
The sample.jpg file is the following image downloaded and added as a resource to my Xcode project:
http://chwb.org/wp-content/uploads/2014/01/Theo_Janssen-Face1.jpg
The resultsInImage returns no error, but no faces either. It logs:
Synchronous result. error = (null), face count = 0
Am I doing something wrong?
I figured it out. The problem was that I needed to set the image metadata with the orientation, like this:
FIRVisionImageMetadata *imageMetadata = [FIRVisionImageMetadata new];
imageMetadata.orientation = [FcFaceDetector visionImageOrientationFromImageOrientation:uiImage.imageOrientation];
visionImage.metadata = imageMetadata;
+ (FIRVisionDetectorImageOrientation) visionImageOrientationFromImageOrientation:(UIImageOrientation)imageOrientation {
switch (imageOrientation) {
case UIImageOrientationUp:
return FIRVisionDetectorImageOrientationTopLeft;
case UIImageOrientationDown:
return FIRVisionDetectorImageOrientationBottomRight;
case UIImageOrientationLeft:
return FIRVisionDetectorImageOrientationLeftBottom;
case UIImageOrientationRight:
return FIRVisionDetectorImageOrientationRightTop;
case UIImageOrientationUpMirrored:
return FIRVisionDetectorImageOrientationTopRight;
case UIImageOrientationDownMirrored:
return FIRVisionDetectorImageOrientationBottomLeft;
case UIImageOrientationLeftMirrored:
return FIRVisionDetectorImageOrientationLeftTop;
case UIImageOrientationRightMirrored:
return FIRVisionDetectorImageOrientationRightBottom;
}
}
The docs seem to be unclear about this, because they seem to suggest not setting it:
https://firebase.google.com/docs/ml-kit/ios/detect-faces#2-run-the-face-detector
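Putting it together, the detection call with the orientation metadata set looks roughly like this (a sketch that reuses the names from the question and the helper above):
UIImage *staticImg = [UIImage imageNamed:@"sample.jpg"];
FIRVisionImage *visionImage = [[FIRVisionImage alloc] initWithImage:staticImg];

FIRVisionImageMetadata *imageMetadata = [FIRVisionImageMetadata new];
imageMetadata.orientation = [FcFaceDetector visionImageOrientationFromImageOrientation:staticImg.imageOrientation];
visionImage.metadata = imageMetadata;

NSError *error = nil;
NSArray<FIRVisionFace *> *faces = [faceDetector resultsInImage:visionImage error:&error];
// faces should now be non-empty for an image that actually contains a face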

watchkit WKAlertAction openSystemURL

I am trying to show the user different phone numbers on the Apple Watch; when they tap one, a phone call alert should appear. I do it like this, but the alert is just dismissed without the call action:
NSMutableArray *tempArray = [[NSMutableArray alloc] initWithCapacity:0];
WKExtension *myExt = [WKExtension sharedExtension];
for (NSString *phone in arr) {
NSString *tel = [NSString stringWithFormat:@"tel:%@", phone];
WKAlertAction *act = [WKAlertAction actionWithTitle:tel style:WKAlertActionStyleDefault handler:^(void){
[myExt openSystemURL:[NSURL URLWithString:phone1]];
}];
[tempArray addObject:act];
}
NSString *titleMessage = @"Call";
NSString *textMessage = @"Please select the number you want to call.";
NSString *cancel = @"Cancel";
WKAlertAction *act = [WKAlertAction actionWithTitle:cancel style:WKAlertActionStyleDestructive handler:^(void){
}];
[tempArray addObject:act];
[self presentAlertControllerWithTitle:titleMessage message:textMessage preferredStyle:WKAlertControllerStyleAlert actions:tempArray];
The buttons are shown as expected and the handler is also called with the correct phone number, but it does not openSystemURL. Does somebody know why, and how to fix it? Thanks!
I think you forgot to add the "tel" scheme. Use the code below:
[WKAlertAction actionWithTitle:@"tel" style:WKAlertActionStyleDefault handler:^(void){
[[WKExtension sharedExtension] openSystemURL:[NSURL URLWithString:[NSString stringWithFormat:@"tel:%@", @"YOUR NUMBER"]]];
}];
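Applied to the loop in the question, the handler should capture the already-built tel string rather than phone1, something like:
for (NSString *phone in arr) {
    NSString *tel = [NSString stringWithFormat:@"tel:%@", phone];
    WKAlertAction *act = [WKAlertAction actionWithTitle:tel style:WKAlertActionStyleDefault handler:^(void){
        [[WKExtension sharedExtension] openSystemURL:[NSURL URLWithString:tel]]; // URL now carries the tel: scheme
    }];
    [tempArray addObject:act];
}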
About Apple URL Schemes

Firebase FQuery: how do you detect when you are at the end of a list of nodes?

How do I detect when I have finished processing all found nodes when doing a query? In the following example, I do some processing on each encountered node. When I reach the "end" of the list I would like to be able to detect this so I know it's finished.
FQuery* messageListQuery = [m_firebaseRef queryLimitedToNumberOfChildren:100];
[messageListQuery observeEventType:FEventTypeChildAdded andPreviousSiblingNameWithBlock:^(FDataSnapshot *snapshot, NSString *prevNodeName) {
// 1. Do interesting stuff with the snapshot data
// 2. I want to detect when I'm at the end of the list so I know when I'm done processing the list.
}];
Here is the example use case. I would like to load the latest 100 messages in the background. Once the messages have been loaded, I would like to update the UI. However, I'm not sure how I know all the messages have been loaded, given there might be fewer than 100 messages in the list.
I figured out how to read all the messages up front by using the observeSingleEventOfType and then iterating over the children.
[m_firebaseRef observeSingleEventOfType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
NSLog( #"Name %# with %d children.", snapshot.name, snapshot.childrenCount );
for( FDataSnapshot *child in snapshot.children )
{
NSDictionary *msgData = child.value;
NSString *message = msgData[kFirebaseLiveChatFieldMessage];
NSString *gamerTag = msgData[kFirebaseLiveChatFieldGamerTag];
NSString *gameCenterId = msgData[kFirebaseLiveChatFieldGameCenterId];
NSLog( #"Preload = %# (%#): %#", gamerTag, gameCenterId, message );
}
}];
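Since the goal is to update the UI once everything has been loaded, the natural place to do that is right after the loop, inside the same block; at that point every child in the snapshot has been processed. A sketch (reloadChatUI is a hypothetical method that refreshes your message list):
[m_firebaseRef observeSingleEventOfType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
    for (FDataSnapshot *child in snapshot.children) {
        // ... process each message as above ...
    }
    // All children have been processed; safe to refresh the UI now.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self reloadChatUI]; // hypothetical UI-refresh method
    });
}];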

OCMock is only effective once, strange, why? Or what's wrong on my side?

I want to mock TnSettings. It works if I write the code the following way; the problem is that we need to write the mock setup for each case. If we only mock once and then execute more than one case, the second case throws an exception. I am using the latest OCMock, v2.01.
My question is: why does OCMock have such a restriction? Or is it my fault for not using it correctly?
Any idea or discussion will be appreciated. Thanks in advance.
- (void) testFormattedDistanceValueWithMeters {
mockSettings = [OCMockObject mockForClass:[TnSettings class]];
mockClientModel = [TnClientModel createMockClientModel];
[[[mockClientModel expect] andReturn:mockSettings] settings];
[[[mockSettings expect] andReturn:[NSNumber numberWithInt:0]] preferencesGeneralUnits];
NSNumber *meters = [NSNumber numberWithDouble:0.9];
distance = [NSString formattedDistanceValueWithMeters:meters];
STAssertEqualObjects(distance, #"0.9", #"testformattedEndTimeForTimeInSeconds failed");
//------------- Another case -----------------
mockSettings = [OCMockObject mockForClass:[TnSettings class]];
mockClientModel = [TnClientModel createMockClientModel];
[[[mockClientModel expect] andReturn:mockSettings] settings];
[[[mockSettings expect] andReturn:[NSNumber numberWithInt:0]] preferencesGeneralUnits];
meters = [NSNumber numberWithDouble:100.9];
distance = [NSString formattedDistanceValueWithMeters:meters];
STAssertEqualObjects(distance, #"101", #"testformattedEndTimeForTimeInSeconds failed");
}
Not sure I understand your question or your code fully. I suspect that you stumbled over the difference between expect and stub, though.
Is this what you had in mind?
- (void) testFormattedDistanceValueWithMeters {
mockSettings = [OCMockObject mockForClass:[TnSettings class]];
mockClientModel = [TnClientModel createMockClientModel];
[[[mockClientModel stub] andReturn:mockSettings] settings];
[[[mockSettings stub] andReturn:[NSNumber numberWithInt:0]] preferencesGeneralUnits];
NSNumber *meters = [NSNumber numberWithDouble:0.9];
distance = [NSString formattedDistanceValueWithMeters:meters];
STAssertEqualObjects(distance, #"0.9", #"testformattedEndTimeForTimeInSeconds failed");
meters = [NSNumber numberWithDouble:100.9];
distance = [NSString formattedDistanceValueWithMeters:meters];
STAssertEqualObjects(distance, #"101", #"testformattedEndTimeForTimeInSeconds failed");
}
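For completeness: expect covers exactly one invocation and makes a later verify fail if that invocation never happened, whereas stub answers any number of calls and is never verified. Also, a mock created with mockForClass: is strict, so once an expectation has been consumed, further unexpected calls on that mock raise an exception. If you do want each case to assert the call, keep expect and verify after each case, roughly like this (sketch):
[[[mockSettings expect] andReturn:[NSNumber numberWithInt:0]] preferencesGeneralUnits];
distance = [NSString formattedDistanceValueWithMeters:[NSNumber numberWithDouble:0.9]];
[mockSettings verify]; // fails the test if preferencesGeneralUnits was not called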

CoreData - NSPredicate formatted with time range (NSTimeInterval)

I'm trying to find out how to go through my Core Data store and find objects that have a createdAt (an NSDate attribute on my object) that falls within an NSTimeInterval. How do I set this up?
I've looked on the documentation at:
http://developer.apple.com/documentation/Cocoa/Conceptual/Predicates/predicates.html
But I'm not finding anything there.
Do I need to create two time stamps and use SQL's BETWEEN?
Any help would be wonderful.
First of all, it doesn't make sense to check if an NSDate is within an NSTimeInterval, because NSTimeInterval just specifies a length of time, not its location. Instead, you want to use two separate NSDates specifying the beginning and end of your intervals.
Here's what it would look like (beginningTime and endTime are NSDates).
NSFetchRequest *request = [[NSFetchRequest alloc] init];
request.entity = [NSEntityDescription entityForName:@"YourEntityName" inManagedObjectContext:yourContext];
NSPredicate *beginningPredicate = [NSPredicate predicateWithFormat:@"createdAt >= %@", beginningTime];
NSPredicate *endPredicate = [NSPredicate predicateWithFormat:@"createdAt <= %@", endTime];
request.predicate = [NSCompoundPredicate andPredicateWithSubpredicates:[NSArray arrayWithObjects:beginningPredicate, endPredicate, nil]];
NSArray *results = [yourContext executeFetchRequest:request error:NULL];
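If what you actually have is a single start date plus an NSTimeInterval, you can derive the end date with dateByAddingTimeInterval: and, if you prefer, collapse the two comparisons into NSPredicate's BETWEEN operator, which takes a two-element array. A sketch (interval is the assumed name of your NSTimeInterval variable):
NSDate *endTime = [beginningTime dateByAddingTimeInterval:interval];
request.predicate = [NSPredicate predicateWithFormat:@"createdAt BETWEEN %@",
                     [NSArray arrayWithObjects:beginningTime, endTime, nil]];
Either form should behave the same against a Core Data store (both bounds inclusive); there is no need to drop down to raw SQL.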
