AVAssetReader kills playback (in AVAudioPlayer)

I am using AVAssetReader to read the audio data of an iPod library asset and render a waveform image, using the code I described in my answer to this question.
This sometimes happens while audio is being played by an instance of AVAudioPlayer.
Regardless of whether the audio being played is the same asset that is being read, the moment I call
[reader startReading];
the audio being played "fades out", as if the AVAudioPlayer had somehow been told to stop playback. This is odd, as I am not actually playing the audio, just reading it.
I searched SO and found this possible solution, but it does not appear to solve the problem.
Note: I am able to have several instances of AVAudioPlayer playing, and starting these does not seem to interfere with each other. However,
[reader startReading];
will kill even multiple simultaneous instances of AVAudioPlayer, causing them all to fade out in sync.
Any ideas?

Answering my own question: further searching on SO led me to this alternative solution:
- (void)setupAudio {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    UInt32 doSetProperty = 1;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
    [[AVAudioSession sharedInstance] setActive:YES error:nil];
}
This was gleaned from here.
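For what it's worth, on iOS 6 and later the same mix-with-others behaviour can be expressed without the deprecated C call; a minimal sketch using only AVAudioSession:
NSError *error = nil;
// Sketch: AVAudioSession-only equivalent (iOS 6+); replaces the
// AudioSessionSetProperty call above with a category option.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];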
**EDIT (UPDATED)**
I have since made this into a class that also pre-initialises the audio queue (useful on both the simulator and the device, as it eliminates the startup lag when the first audio file is played).
you can find the point1sec.mp3 here: http://www.xamuel.com/blank-mp3s/
#import <AVFoundation/AVFoundation.h>
#import "AudioToolbox/AudioServices.h"

@interface sw_AVAudioPlayerSetup : NSObject <AVAudioPlayerDelegate> {
}
+ (void)setupAudio;
+ (void)setupSharedSession;
@end

@implementation sw_AVAudioPlayerSetup

+ (void)setupSharedSession {
    static BOOL audioSessionSetup = NO;
    if (audioSessionSetup) {
        return;
    }
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    UInt32 doSetProperty = 1;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
    [[AVAudioSession sharedInstance] setActive:YES error:nil];
    audioSessionSetup = YES;
}

+ (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    // delegate callback to release the player
    [player release];
}

+ (void)setupAudio {
    [self setupSharedSession];
    NSString *filepath = [[NSBundle mainBundle] pathForResource:@"point1sec"
                                                         ofType:@"mp3"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:filepath]) {
        AVAudioPlayer *player = [[AVAudioPlayer alloc]
                                 initWithContentsOfURL:[NSURL fileURLWithPath:filepath]
                                                 error:nil];
        player.delegate = (id<AVAudioPlayerDelegate>)self;
        [player play];
    }
}

@end
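Typical usage is a single call early in the app lifecycle, for example:
// e.g. in application:didFinishLaunchingWithOptions:
[sw_AVAudioPlayerSetup setupAudio];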

Related

How to upload an image file in a background session (iOS)?

I am unable to upload an image from my device's Photos in a background session. When I call [session uploadTaskWithRequest:req fromFile:nsurl] the system immediately complains by sending
Failed to issue sandbox extension for file file:///var/mobile/Media/DCIM/103APPLE/IMG_3984.JPG, errno = 1
to the console. (A similar Stack Overflow issue is here)
However, if I create my NSURLSessionConfiguration with [NSURLSessionConfiguration defaultSessionConfiguration] (instead of [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:id], which I need), and if I construct an NSData object from the NSURL and upload that instead of uploading straight from a file (uploading from a file is required by a background session), then the upload succeeds. By the way, I'm uploading files into our Rackspace Cloud account and am able to do this successfully with a simple Postman PUT.
The problem occurs in my uploadObject method, which looks like:
- (void)uploadObject:(NSURL *)urlToBeUploadedIn
{
    NSDictionary *headers = @{ @"x-auth-token": state.tokenID,
                               @"content-type": @"image/jpeg",
                               @"cache-control": @"no-cache" };
    // create destination url for Rackspace cloud upload
    NSString *sURL = [NSString stringWithFormat:@"%@/testing_folder/%@.jpg", state.publicURL, [state generateImageName]];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:sURL]
                                                           cachePolicy:NSURLRequestUseProtocolCachePolicy
                                                       timeoutInterval:10.0];
    [request setHTTPMethod:@"PUT"];
    [request setAllHTTPHeaderFields:headers];
    self.sessionIdentifier = [NSString stringWithFormat:@"my-background-session"];
    // NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration defaultSessionConfiguration]; // when I use this instead of the line below, the sandbox error goes away
    NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:self.sessionIdentifier];
    self.session = [NSURLSession sessionWithConfiguration:configuration delegate:self delegateQueue:nil];
    NSURLSessionUploadTask *uploadTask = [self.session uploadTaskWithRequest:request fromFile:urlToBeUploadedIn];
    [uploadTask resume];
}
My call that invokes uploadObject: looks like:
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:state.arrImagesToBeUploaded options:nil];
PHAsset *phAsset = [fetchResult objectAtIndex:0]; // yes, the 0th item in the array is guaranteed to exist, above.
[phAsset requestContentEditingInputWithOptions:nil
                             completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    NSURL *imageURL = contentEditingInput.fullSizeImageURL;
    [self uploadObject:imageURL];
}];
By the way, I first validate the NSURL I send to uploadObject: with a call to fileExistsAtPath:, so I know my reference to the file is good. Finally, my delegate methods
-URLSession:dataTask:didReceiveData:
-URLSession:task:didCompleteWithError:
do get invoked, so I am getting something back from the server (although the response won't parse), but the server never correctly receives the image.
A solution is to first copy the image to be uploaded into the app's sandbox. I used:
NSError *err;
BOOL bVal = [myNSDataOb writeToURL:myDestinationURL options:0 error:&err];
and copied it into my app's Documents directory, but one might also use:
NSError *err;
BOOL bVal = [[NSFileManager defaultManager] copyItemAtURL:myImageURL toURL:myDestinationURL error:&err];
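Putting it together, a minimal sketch of the copy step, assuming imageURL is the fullSizeImageURL from the completion handler above (the Documents destination and the upload.jpg name are illustrative choices, not part of the original):
// Sketch: copy the Photos asset into Documents before handing it to the
// background session. "upload.jpg" is an arbitrary example name.
NSURL *docsURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                         inDomains:NSUserDomainMask] firstObject];
NSURL *myDestinationURL = [docsURL URLByAppendingPathComponent:@"upload.jpg"];
[[NSFileManager defaultManager] removeItemAtURL:myDestinationURL error:NULL]; // clear any stale copy
NSError *err = nil;
if ([[NSFileManager defaultManager] copyItemAtURL:imageURL toURL:myDestinationURL error:&err]) {
    [self uploadObject:myDestinationURL]; // background session can now read the file
} else {
    NSLog(@"Copy into sandbox failed: %@", err);
}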

Error Domain=CBErrorDomain Code=7 "The specified device has disconnected from us"

We need to get data from Bluetooth devices into iOS devices. We are using the Core Bluetooth framework. didDisconnectPeripheral is called after didConnectPeripheral every time. The error we get is:
Error Domain=CBErrorDomain Code=7 "The specified device has disconnected from us." UserInfo={NSLocalizedDescription=The specified device has disconnected from us.}
The code we tried is:
// in viewDidLoad
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:NO], CBCentralManagerOptionShowPowerAlertKey, nil];
self.central = [[CBCentralManager alloc] initWithDelegate:self queue:nil options:options];
// self.central = [[CBCentralManager alloc] initWithDelegate:self queue:dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0)];
self.discoveredPeripherals = [NSMutableArray new];
// Bluetooth on/off
- (void)centralManagerDidUpdateState:(CBCentralManager *)central {
    NSString *stateString = nil;
    switch (central.state) {
        case CBCentralManagerStateResetting:
            stateString = @"The connection with the system service was momentarily lost, update imminent.";
            break;
        case CBCentralManagerStateUnsupported:
            stateString = @"The platform doesn't support Bluetooth Low Energy.";
            break;
        case CBCentralManagerStateUnauthorized:
            stateString = @"The app is not authorized to use Bluetooth Low Energy.";
            break;
        case CBCentralManagerStatePoweredOff:
            stateString = @"Bluetooth is currently powered off.";
            break;
        case CBCentralManagerStatePoweredOn:
            stateString = @"Bluetooth is currently powered on and available to use.";
            break;
    }
}
// Discover
NSLog(@"Discovered peripheral %@ (%@)", peripheral.name, peripheral.identifier.UUIDString);
if (![self.discoveredPeripherals containsObject:peripheral]) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.discoveredPeripherals addObject:peripheral];
        [self.tableview insertRowsAtIndexPaths:@[[NSIndexPath indexPathForRow:self.discoveredPeripherals.count - 1 inSection:0]] withRowAnimation:UITableViewRowAnimationLeft];
    });
}
// didConnectPeripheral
[self.activePeripheral discoverServices:@[[CBUUID UUIDWithString:@"00001c00-d102-11e1-9b23-00025b00a5a5"]]];
// didDisconnectPeripheral
NSLog(@"error is %@", error.description);
We don't understand why didDisconnectPeripheral is called every time after didConnectPeripheral. Please tell me what is wrong in my code.
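One thing worth checking, as a hedged suggestion: the snippets above don't show the connect call, and Core Bluetooth reports exactly this error (code 7) when nothing keeps a strong reference to the CBPeripheral between discovery and connection. A minimal sketch of a discovery callback that retains the peripheral and connects, reusing the central and activePeripheral properties from the question:
// Sketch: keep a strong reference to the peripheral before connecting;
// if the CBPeripheral is deallocated, the system disconnects it with
// CBErrorDomain code 7. self.activePeripheral is assumed to be strong.
- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI
{
    self.activePeripheral = peripheral;   // strong reference
    self.activePeripheral.delegate = self;
    [self.central connectPeripheral:self.activePeripheral options:nil];
}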

WatchKit: openParentApplication with WatchKit Extension

The first time, it doesn't work and returns "Null" (before the app has been opened on the iPhone), and it sometimes fails at other times too. I want a loop or timer to repeat this request until I get a result.
Here is my code:
- (void)application:(UIApplication *)application handleWatchKitExtensionRequest:(NSDictionary *)userInfo reply:(void (^)(NSDictionary *))reply
{
    // Temporary fix, I hope.
    // --------------------
    __block UIBackgroundTaskIdentifier bogusWorkaroundTask;
    bogusWorkaroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        [[UIApplication sharedApplication] endBackgroundTask:bogusWorkaroundTask];
    }];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [[UIApplication sharedApplication] endBackgroundTask:bogusWorkaroundTask];
    });
    // --------------------
    __block UIBackgroundTaskIdentifier realBackgroundTask;
    realBackgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        reply(nil);
        [[UIApplication sharedApplication] endBackgroundTask:realBackgroundTask];
    }];
    // Kick off a network request, heavy processing work, etc.
    // Return any data you need to, obviously.
    // reply(nil);
    reply(@{@"Confirmation" : @"Text was received."});
    [[UIApplication sharedApplication] endBackgroundTask:realBackgroundTask];
    // NSLog(@"User Info: %@", userInfo);
}
Watch App Code
- (void)willActivate {
    // This method is called when the watch view controller is about to be visible to the user
    [super willActivate];
    NSDictionary *dictionary = [[NSDictionary alloc] initWithObjectsAndKeys:@"MyCamande", @"OK", nil];
    [InterfaceController openParentApplication:dictionary reply:^(NSDictionary *replyInfo, NSError *error) {
        NSLog(@"Reply received by Watch app: %@", replyInfo);
    }];
}
How can I retry the call so that I finally get a result?
Well, I would not recommend using anything related to network operations on the watch itself, first of all because Apple does not recommend it, for obvious reasons. The only network operation performed directly on the watch is loading images.
I have been struggling with network operations and the watch for about a week, and came to the conclusion that the most stable way to do it right now is not obvious.
The main issue is that WKInterfaceController.openParentApplication(...) does not work as expected. One cannot just ask to open the iPhone app and hand back the response as is. There are tons of solutions stating that creating a background thread in - (void)application:(UIApplication *)application handleWatchKitExtensionRequest:(NSDictionary *)userInfo reply:(void (^)(NSDictionary *))reply would work just fine, but it actually does not. The problem is that this method has to call reply(...); right away. Even making synchronous requests won't help; you will keep receiving "error -2 iPhone application did not reply.." like 5 times out of 10.
So, my solution is the following.
You implement:
func requestUserToken() {
    WKInterfaceController.openParentApplication(["request" : "token"], reply: responseParser)
}
and parse the response for the error that occurs if there is no response from the iPhone.
On the iOS side:
- (void)application:(UIApplication *)application handleWatchKitExtensionRequest:(NSDictionary *)userInfo reply:(void (^)(NSDictionary *))reply
{
    __block UIBackgroundTaskIdentifier watchKitHandler;
    watchKitHandler = [[UIApplication sharedApplication] beginBackgroundTaskWithName:@"backgroundTask"
                                                                   expirationHandler:^{
        watchKitHandler = UIBackgroundTaskInvalid;
    }];
    NSString *request = userInfo[@"request"];
    if ([request isEqualToString:@"token"])
    {
        reply(@{@"token" : @"OK"});
        [PSWatchNetworkOperations.shared loginUser];
    }
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)NSEC_PER_SEC * 1), dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[UIApplication sharedApplication] endBackgroundTask:watchKitHandler];
    });
}
This code just creates a background thread that forces the iPhone to send a network request. Let's imagine you have a special class in your iPhone app that sends these requests and passes the answer back to the watch. For now, this is only accomplishable using App Groups, so you have to create an app group for your application and WatchKit extension. Afterwards, I would recommend using MMWormhole to establish communication between your app and extension. The manual is pretty self-explanatory.
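For context, a minimal sketch of the wormhole setup (the app-group identifier here is a placeholder you would replace with your own group):
// Sketch: both the iOS app and the watch extension create the wormhole
// over the shared app group. "group.com.example.myapp" is a placeholder.
self.wormHole = [[MMWormhole alloc] initWithApplicationGroupIdentifier:@"group.com.example.myapp"
                                                     optionalDirectory:@"wormhole"];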
Now, what's the point of all this? You have to implement sending the request to the server and sending the response back through the wormhole. I use ReactiveCocoa, so an example from my code looks like this:
- (void)fetchShoppingLists
{
    RACSignal *signal = [PSHTTPClient.sharedAPIClient rac_GET:@"list/my" parameters:@{@"limit": @20, @"offset": @0} resultClass:PSShoppingListsModel.class];
    [signal subscribeNext:^(PSShoppingListsModel *shoppingLists) {
        [self.wormHole passMessageObject:shoppingLists identifier:@"shoppingLists"];
    }];
    [signal subscribeError:^(NSError *error) {
        [self.wormHole passMessageObject:error identifier:@"error"];
    }];
}
As you can see, I send back either the response object or the error. Note that everything you send through the wormhole must be NSCoding-compliant.
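As an illustration, a model object crossing the wormhole needs NSCoding roughly like this; the title property is a hypothetical stand-in, since PSShoppingListsModel's real fields aren't shown here:
// Sketch: minimal NSCoding conformance so the object survives archiving
// through the wormhole. "title" is a made-up example property.
- (void)encodeWithCoder:(NSCoder *)coder {
    [coder encodeObject:self.title forKey:@"title"];
}

- (instancetype)initWithCoder:(NSCoder *)coder {
    if ((self = [super init])) {
        _title = [coder decodeObjectForKey:@"title"];
    }
    return self;
}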
Now, on the watch, you'll probably parse the response like this:
override func awakeWithContext(context: AnyObject?) {
    super.awakeWithContext(context)
    PSWatchOperations.sharedInstance.requestUserToken()
    PSWatchOperations.sharedInstance.wormhole.listenForMessageWithIdentifier("token", listener: { (messageObject) -> Void in
        // parse message object here
    })
}
So, to draw a conclusion: you send a request to the parent application to wake it from the background and start an async operation, and you send reply() back immediately. When the operation's answer arrives, you send a notification that you've got the response. Meanwhile, you listen for that response in your watch extension.
Sorry, that was a lot of text, but I just hope it saves someone some nerves, because I've spent a lot of mine on this.
Maybe you can try to explain the exact problem a little more clearly. But one thing you may want to do regardless is make the openParentApplication call in awakeWithContext: instead of willActivate.

Error "unrecognized selector sent to class" while adding Google Maps SDK to iOS 6

This is a single view application and I followed the instructions given at
https://developers.google.com/maps/documentation/ios/start
for adding the Google Maps SDK to iOS 6.
The error is:
unrecognized selector sent to class 0xe2b0
2013-02-07 15:21:29.788 mapApp[2061:12e03] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '+[GMSCameraPosition cameraWithLatitude:longitude:zoom:]: unrecognized selector sent to class 0xe2b0'
AppDelegate.m
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.
    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;
    // initializing the Google Maps API key
    [GMSServices provideAPIKey:@"google's api key goes here"];
    [self.window makeKeyAndVisible];
    return YES;
}
ViewController.m
#import "ViewController.h"
#import <GoogleMaps/GoogleMaps.h>
#interface ViewController ()
#end
#implementation ViewController
{
GMSMapView *mapView;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
GMSCameraPosition *cam = [GMSCameraPosition cameraWithLatitude:13.0245231
longitude:77.64072579999993
zoom:6];
mapView = [GMSMapView mapWithFrame:CGRectZero camera:cam];
mapView.myLocationEnabled = YES;
GMSMarkerOptions *options = [[GMSMarkerOptions alloc]init ];
options.position = CLLocationCoordinate2DMake(13.025738,77.637809);
options.title = #"ensign";
options.snippet = #"kalyan nagar";
[mapView addMarkerWithOptions:options];
}
main.m
#import <UIKit/UIKit.h>
#import <GoogleMaps/GoogleMaps.h>
int main(int argc, char *argv[])
{
#autoreleasepool {
return UIApplicationMain(argc, argv,nil, NSStringFromClass([AppDelegate class]));
}
}
While tracking the error, it shows up at the return statement in main.m, reached from -viewDidLoad after executing the first line:
GMSCameraPosition *cam = [GMSCameraPosition cameraWithLatitude:13.0245231
                                                     longitude:77.64072579999993
                                                          zoom:6];
It skips the rest of the lines.
Did you add -ObjC to the Other Linker Flags, in step 7 of the instructions?
--
Extra information edit: note that -ObjC is case sensitive.
I had the same problem. Make sure you add the -ObjC flag to the 'Build Settings' of your 'Target' and NOT 'Project'.
P.S. Adding it in both places doesn't break it either.
Google's documentation says:
Choose your project, rather than a specific target, and open the Build Settings tab.
In the Other Linker Flags section, add -ObjC. If these settings are not visible, change the filter in the Build Settings bar from Basic to All.
Sometimes this is wrong...
I had to add the linker flag to the target as well to get it to work. This should help someone.

AVAudioPlayer doesn't work in AVAudioSessionCategoryAmbient mode in iOS 6.0.1?

In my app, I use an MPMusicPlayerController to play .mp3s as background music,
and an AVAudioPlayer to play sound effects, such as button presses and so on.
The code looks like this:
// Initialization code here.
AudioSessionInitialize(NULL, NULL, NULL, NULL);
AVAudioSession *session = [AVAudioSession sharedInstance];
// The background music and the sound effects can play simultaneously
[session setCategory:AVAudioSessionCategoryAmbient error:nil];
[session setActive:YES error:nil];
m_sharedPlayer = [[MPMusicPlayerController applicationMusicPlayer] retain];
[m_sharedPlayer setShuffleMode:MPMusicShuffleModeSongs];
[m_sharedPlayer setRepeatMode:MPMusicRepeatModeAll];
[m_sharedPlayer setVolume:0.2];
// choose the first song
[m_sharedPlayer setQueueWithQuery:[MPMediaQuery songsQuery]];
[m_sharedPlayer play];
...
// when a sound effect needs to play; soundfilename is an NSString
NSData *data = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:soundfilename ofType:nil]];
AVAudioPlayer *audioplayer = [[AVAudioPlayer alloc] initWithData:data error:nil];
audioplayer.volume = 1.0;
audioplayer.delegate = self;
[audioplayer prepareToPlay];
[audioplayer play];
...
The audio player is released after it has finished playing.
This code works in iOS 5.0, but in iOS 6.0 everything changed: the AVAudioPlayer doesn't play any sound at all.
If I change this line:
[session setCategory: AVAudioSessionCategoryAmbient error:nil];
To:
[session setCategory: AVAudioSessionCategoryPlayback error: nil];
the AVAudioPlayer will play sound, but it breaks the MPMusicPlayerController's playback session...
How can I play the AVAudioPlayer without breaking the background music? Thanks a lot for your help.
OK, I finally found the solution. In iOS 6.0, Apple provides a new method, setCategory:withOptions:error:. It works.
So the code looks like this:
AVAudioSession *session = [AVAudioSession sharedInstance];
float version = [[[UIDevice currentDevice] systemVersion] floatValue];
if (version < 6.0) {
    [session setCategory:AVAudioSessionCategoryAmbient error:nil];
} else {
    [session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:nil];
}
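A side note on the snippet above: parsing systemVersion as a float is fragile; a sketch that feature-detects the new selector instead:
// Sketch: detect the iOS 6 API directly instead of parsing the version string.
if ([session respondsToSelector:@selector(setCategory:withOptions:error:)]) {
    [session setCategory:AVAudioSessionCategoryPlayback
             withOptions:AVAudioSessionCategoryOptionMixWithOthers
                   error:nil];
} else {
    [session setCategory:AVAudioSessionCategoryAmbient error:nil];
}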
Thanks.
