iPhone 5 images not loading - Xcode 4.5

I have developed a Universal app in Xcode. When I run it, the images display properly in the iPhone 5 (4-inch) simulator, but on my device the images are not loading.
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone && [[UIScreen mainScreen] bounds].size.height * [UIScreen mainScreen].scale >= 1136)
{
    image1.image = [UIImage imageNamed:@"Lifestyle12_iPhone5@2x.png"];
    image2.image = [UIImage imageNamed:@"Lifestyle11_iPhone5@2x.png"];
    image3.image = [UIImage imageNamed:@"Lifestyle10_iPhone5@2x.png"];
    image4.image = [UIImage imageNamed:@"Lifestyle2_iPhone5@2x.png"];
    image5.image = [UIImage imageNamed:@"LifeStyle1_iPhone5@2x.png"];
    image6.image = [UIImage imageNamed:@"Lifestyle5_iPhone5@2x.png"];
    image7.image = [UIImage imageNamed:@"APP_incipio_F38_iPhone5@2x.png"];
    image8.image = [UIImage imageNamed:@"APP_incipio_BACKPACK_iPhone5@2x.png"];
}
Kindly tell me what the problem is.
Thanks in advance.

Did you carefully check the file names? The device has a case-sensitive file system, while the simulator (running on the Mac's file system) typically does not. Note, for example, that your code mixes the casings "LifeStyle1" and "Lifestyle11".

Related

How to get simple face detection working?

I am trying to get a simple example of face detection working with ML Kit on iOS. Here are excerpts of the Objective-C code:
FIRVisionFaceDetectorOptions *faceDetectorOptions;
FIRVision *vision;
FIRVisionFaceDetector *faceDetector;

faceDetectorOptions = [[FIRVisionFaceDetectorOptions alloc] init];
faceDetectorOptions.performanceMode = FIRVisionFaceDetectorPerformanceModeAccurate;
faceDetectorOptions.landmarkMode = FIRVisionFaceDetectorLandmarkModeAll;
faceDetectorOptions.contourMode = FIRVisionFaceDetectorContourModeNone;
faceDetectorOptions.classificationMode = FIRVisionFaceDetectorClassificationModeAll;
faceDetectorOptions.minFaceSize = 0.1; // TODO: finalize this option value

vision = [FIRVision vision];
faceDetector = [vision faceDetectorWithOptions:faceDetectorOptions];

UIImage *staticImg = [UIImage imageNamed:@"sample.jpg"];
FIRVisionImage *visionImage = [[FIRVisionImage alloc] initWithImage:staticImg];
NSError *error = nil;
NSArray<FIRVisionFace *> *faces = [faceDetector resultsInImage:visionImage error:&error];
NSLog(@"Synchronous result. error = %@, face count = %lu", error, (unsigned long)faces.count);
The sample.jpg file is the following image downloaded and added as a resource to my Xcode project:
http://chwb.org/wp-content/uploads/2014/01/Theo_Janssen-Face1.jpg
The resultsInImage: call returns no error, but no faces either. It logs:
Synchronous result. error = (null), face count = 0
Am I doing something wrong?
I figured it out. The problem was that I needed to set the image metadata's orientation, like this:
FIRVisionImageMetadata *imageMetadata = [FIRVisionImageMetadata new];
imageMetadata.orientation = [FcFaceDetector visionImageOrientationFromImageOrientation:uiImage.imageOrientation];
visionImage.metadata = imageMetadata;
+ (FIRVisionDetectorImageOrientation)visionImageOrientationFromImageOrientation:(UIImageOrientation)imageOrientation {
    switch (imageOrientation) {
        case UIImageOrientationUp:
            return FIRVisionDetectorImageOrientationTopLeft;
        case UIImageOrientationDown:
            return FIRVisionDetectorImageOrientationBottomRight;
        case UIImageOrientationLeft:
            return FIRVisionDetectorImageOrientationLeftBottom;
        case UIImageOrientationRight:
            return FIRVisionDetectorImageOrientationRightTop;
        case UIImageOrientationUpMirrored:
            return FIRVisionDetectorImageOrientationTopRight;
        case UIImageOrientationDownMirrored:
            return FIRVisionDetectorImageOrientationBottomLeft;
        case UIImageOrientationLeftMirrored:
            return FIRVisionDetectorImageOrientationLeftTop;
        case UIImageOrientationRightMirrored:
            return FIRVisionDetectorImageOrientationRightBottom;
    }
}
The docs seem unclear on this point, because they seem to suggest not setting it:
https://firebase.google.com/docs/ml-kit/ios/detect-faces#2-run-the-face-detector

NSAttributedString drawRect doesn't draw images on-screen on Mojave

I have a working app that draws NSAttributedStrings into a custom view. The NSAttributedStrings can include embedded images. This works on versions of macOS prior to Mojave. The app can display the strings on screen, print them, and save them to image files.
This is apparently broken under Mojave. Weirdly, printing and saving to image files still works; but on-screen, the strings display only the text and not the embedded images. Proper space is left for the images, but that space is blank.
I've tested by building a small app that shows a window with an NSTextField (a label) and a custom view. It makes a single NSAttributedString with an embedded image. It applies that string to the attributedStringValue of the label, and also calls drawInRect: on the same string in the drawRect: method of the custom view. In the label, the string is displayed correctly, image and all. But in the custom view, only the text appears, and the space where the image should be is blank.
Anybody got a clue why this is happening on Mojave but not on earlier versions of macOS?
Here is the code that makes the string (and caches it, for re-use):
static NSMutableAttributedString* sgAttrString = nil;

/*
 * Creates an attributed string the first time it's called,
 * then returns that same string each time it's called.
 */
+ (NSAttributedString*)getAttributedString
{
    if (sgAttrString == nil)
    {
        NSFont* font = [NSFont fontWithName:@"Helvetica" size:24.0];
        NSDictionary *attrs = @{
            NSFontAttributeName: font
        };
        sgAttrString = [[NSMutableAttributedString alloc] initWithString:@"Daisy: " attributes:attrs];

        NSImage* daisy = [NSImage imageNamed:@"daisy.png"];
        [daisy setSize:NSMakeSize(24, 24)];
        NSTextAttachment *attachment = [[NSTextAttachment alloc] init];
        // I'm aware that attachment.image is available only on macOS 10.11 and later.
        // It's not an issue in my real project.
        attachment.image = daisy;

        NSMutableAttributedString* imageStr = [[NSMutableAttributedString alloc] init];
        [imageStr setAttributedString:[NSAttributedString attributedStringWithAttachment:attachment]];
        [sgAttrString appendAttributedString:imageStr];
        [sgAttrString appendAttributedString:[[NSAttributedString alloc] initWithString:@" !!" attributes:attrs]];
    }
    return sgAttrString;
}
Here is the code that applies the string to the NSTextField:
NSAttributedString* str = [Utilities getAttributedString];
self.label.attributedStringValue = str;
And here is the code that draws the string in a custom NSView:
NSAttributedString* str = [Utilities getAttributedString];
[str drawInRect:NSMakeRect(50,50, 300, 40)];
Again, this behavior seems to occur only in Mojave! Thanks in advance for any help.

How do I turn off the Accessibility Inspector in the iOS 9 simulator?

The Accessibility Inspector is turned on by my KIF tests (apparently it's necessary for KIF to work). The problem is that its window occludes controls that some subsequent UI tests need to tap, so those tests fail.
How can I turn the Accessibility Inspector off when my KIF tests are done with it so my UI Tests can run?
(Turning it off "manually" from the simulator's Settings app is not a solution—I'm looking for something I can call from code, set in the target or...?)
It is not on by default. You must turn it on manually.
I saw the following on Stew Gleadow's blog.
The only change needed is in one line — where his version passes kCFBooleanTrue, pass kCFBooleanFalse instead:
CFPreferencesSetValue(CFSTR("ApplicationAccessibilityEnabled"), kCFBooleanFalse, accessibilityDomain, kCFPreferencesAnyUser, kCFPreferencesAnyHost);
#import <dlfcn.h>

+ (void)_enableAccessibilityInSimulator {
    // Despite the method name, with kCFBooleanFalse below this now
    // *disables* the Accessibility Inspector in the simulator.
    NSAutoreleasePool *autoreleasePool = [[NSAutoreleasePool alloc] init];
    NSString *appSupportLocation = @"/System/Library/PrivateFrameworks/AppSupport.framework/AppSupport";
    NSDictionary *environment = [[NSProcessInfo processInfo] environment];
    NSString *simulatorRoot = [environment objectForKey:@"IPHONE_SIMULATOR_ROOT"];
    if (simulatorRoot) {
        appSupportLocation = [simulatorRoot stringByAppendingString:appSupportLocation];
    }
    void *appSupportLibrary = dlopen([appSupportLocation fileSystemRepresentation], RTLD_LAZY);
    CFStringRef (*copySharedResourcesPreferencesDomainForDomain)(CFStringRef domain) =
        dlsym(appSupportLibrary, "CPCopySharedResourcesPreferencesDomainForDomain");
    if (copySharedResourcesPreferencesDomainForDomain) {
        CFStringRef accessibilityDomain = copySharedResourcesPreferencesDomainForDomain(CFSTR("com.apple.Accessibility"));
        if (accessibilityDomain) {
            CFPreferencesSetValue(CFSTR("ApplicationAccessibilityEnabled"), kCFBooleanFalse, accessibilityDomain, kCFPreferencesAnyUser, kCFPreferencesAnyHost);
            CFRelease(accessibilityDomain);
        }
    }
    [autoreleasePool drain];
}

AVPlayer volume is not adjusting while AirPlay

I am working on a custom audio player.
I am using a UISlider to adjust the volume. It works fine when I play on the iPad, but when I use AirPlay the volume does not adjust.
Here is my code to adjust the volume:
UISlider *slide = sender;
NSArray *audioTracks = [myPlayer.currentItem.asset tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in audioTracks) {
    AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [audioInputParams setVolume:slide.value atTime:myPlayer.currentTime];
    [audioInputParams setTrackID:[track trackID]];
    [allAudioParams addObject:audioInputParams];
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];
[myPlayer.currentItem setAudioMix:audioZeroMix];

How can I change (modify) video frame rate and bit rate?

I need to re-encode a video file from the photo library for a web site service.
I tried the code below, but it fails with an error like 'video composition must have composition instructions'.
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
    self.exportSession = [[AVAssetExportSession alloc]
                          initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, anAsset.duration) ofTrack:[[anAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, nil];
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30); // frame rate: 1/30 s per frame, i.e. 30 fps
    MainCompositionInst.renderSize = CGSizeMake(640, 480); // output dimensions (not bit rate)
    [self.exportSession setVideoComposition:MainCompositionInst];
    NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
    self.exportSession.outputURL = furl;
    self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
    CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    self.exportSession.timeRange = range;
    self.trimBtn.hidden = YES;
    self.myActivityIndicator.hidden = NO;
    [self.myActivityIndicator startAnimating];
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([self.exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                NSLog(@"NONE");
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.myActivityIndicator stopAnimating];
                    self.myActivityIndicator.hidden = YES;
                    self.trimBtn.hidden = NO;
                    [self playMovie:self.tmpVideoPath];
                });
                break;
        }
    }];
}
Without the frame rate and bit rate changes, it works perfectly.
Please give me any advice.
Thanks.
Frame rate I'm still looking for, but bit rate has been solved by this drop-in replacement for AVAssetExportSession: https://github.com/rs/SDAVAssetExportSession
Not sure, but here are some things to look at. Unfortunately, the error messages don't tell you much in these cases in AVFoundation.
In general, I would keep things simple and slowly add functionality. To start, make sure all layers start at zero and end at the final duration. Do the same for the main composition. Invalid times may give you an error like this. For your instruction, make sure that it starts and ends at the same times too.