First you will need to configure your application's audio session to allow Bluetooth connections that support audio. You can do this in, for example, your application delegate's - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions method. Make sure you link the AVFoundation framework and import it in the headers that will use it.
#import <AVFoundation/AVFoundation.h> // place in .h
[self prepareAudioSession]; // called from application:didFinishLaunchingWithOptions:
- (BOOL)prepareAudioSession {
    // deactivate the session before changing its category
    BOOL success = [[AVAudioSession sharedInstance] setActive:NO error:nil];
    if (!success) {
        NSLog(@"deactivationError");
    }
    // set category AVAudioSessionCategoryPlayAndRecord with option AVAudioSessionCategoryOptionAllowBluetooth
    success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                                                     error:nil];
    if (!success) {
        NSLog(@"setCategoryError");
    }
    // activate the audio session
    success = [[AVAudioSession sharedInstance] setActive:YES error:nil];
    if (!success) {
        NSLog(@"activationError");
    }
    return success;
}
Every application has an audio session singleton that you can configure. The session's category and mode (in this example I did not set the mode, so it falls back to the default mode) declare your application's intentions as to how audio routing should be handled. Routing follows an important rule: last in wins. This means that if the user plugs in a headset, or in this case connects a Bluetooth device that supports the Hands-Free Profile (HFP), the system automatically routes the audio to that headset or Bluetooth device. The user's physical actions determine the audio route. However, if you wish to give the user a list of available routes, Apple recommends using the MPVolumeView class.
For example, you could add an MPVolumeView in a UIViewController subclass's viewDidLoad method.
#import <MediaPlayer/MediaPlayer.h> // place in .h

// preferred way of letting the user pick an audio route: MPVolumeView
self.view.backgroundColor = [UIColor clearColor];
CGRect frameForMPVV = CGRectMake(50.0, 50.0, 100.0, 100.0);
MPVolumeView *routeView = [[MPVolumeView alloc] initWithFrame:frameForMPVV];
[routeView setShowsVolumeSlider:NO];
[routeView setShowsRouteButton:YES];
[self.view addSubview:routeView];
As of iOS 7 you can get all available inputs like this:
// portDesc.portType could be, for example, BluetoothHFP, MicrophoneBuiltIn or MicrophoneWired
NSArray *availInputs = [[AVAudioSession sharedInstance] availableInputs];
NSUInteger count = [availInputs count];
for (NSUInteger k = 0; k < count; k++) {
    AVAudioSessionPortDescription *portDesc = [availInputs objectAtIndex:k];
    NSLog(@"input%lu port type %@", (unsigned long)(k + 1), portDesc.portType);
    NSLog(@"input%lu port name %@", (unsigned long)(k + 1), portDesc.portName);
}
The portType you are interested in is "BluetoothHFP" (the AVAudioSessionPortBluetoothHFP constant). The portName property is typically the manufacturer/model string, which is what you would show to the user. (I've checked this with a non-LE Bluetooth Motorola dinosaur and it works.)
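If you want to route recording to the Bluetooth microphone programmatically rather than through MPVolumeView, a minimal sketch using the iOS 7 setPreferredInput:error: API could look like the following (preferBluetoothInputIfAvailable is my own helper name, not an Apple API):

// Sketch: prefer the Bluetooth HFP input if one is available (iOS 7+).
- (BOOL)preferBluetoothInputIfAvailable {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    for (AVAudioSessionPortDescription *portDesc in [session availableInputs]) {
        if ([portDesc.portType isEqualToString:AVAudioSessionPortBluetoothHFP]) {
            NSError *error = nil;
            BOOL success = [session setPreferredInput:portDesc error:&error];
            if (!success) {
                NSLog(@"setPreferredInput error: %@", error);
            }
            return success;
        }
    }
    return NO; // no Bluetooth HFP input found
}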
Because of the last-in-wins rule you will need to observe these two notifications (iOS 7 included): one to handle interruptions (such as phone calls or an alarm), and the other to be notified of route changes. The route change notification is the one relevant to this question.
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(myInterruptionSelector:)
name:AVAudioSessionInterruptionNotification
object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(myRouteChangeSelector:)
name:AVAudioSessionRouteChangeNotification
object:nil];
For iOS 6.x and later you can read the currentRoute property of AVAudioSession inside the myRouteChangeSelector: method to get the new route, as this selector gets called when a headset or Bluetooth device is connected or disconnected.
- (void)myRouteChangeSelector:(NSNotification *)notification {
    AVAudioSessionRouteDescription *currentRoute = [[AVAudioSession sharedInstance] currentRoute];
    NSArray *inputsForRoute = currentRoute.inputs;
    NSArray *outputsForRoute = currentRoute.outputs;
    AVAudioSessionPortDescription *outPortDesc = [outputsForRoute objectAtIndex:0];
    NSLog(@"current outPort type %@", outPortDesc.portType);
    AVAudioSessionPortDescription *inPortDesc = [inputsForRoute objectAtIndex:0];
    NSLog(@"current inPort type %@", inPortDesc.portType);
}
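For completeness, here is a minimal sketch of the interruption selector registered above. The pause/resume logic is an assumption on my part and will depend on what your app is doing with audio:

- (void)myInterruptionSelector:(NSNotification *)notification {
    NSNumber *typeNumber = notification.userInfo[AVAudioSessionInterruptionTypeKey];
    AVAudioSessionInterruptionType type = [typeNumber unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // a phone call or alarm took the audio session - pause playback/recording here
        NSLog(@"interruption began");
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        NSNumber *optionNumber = notification.userInfo[AVAudioSessionInterruptionOptionKey];
        if ([optionNumber unsignedIntegerValue] & AVAudioSessionInterruptionOptionShouldResume) {
            // reactivate the session and resume if appropriate for your app
            [[AVAudioSession sharedInstance] setActive:YES error:nil];
        }
        NSLog(@"interruption ended");
    }
}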
For any iOS version earlier than 6.0 you'll need the now-deprecated Audio Session Services C API, which uses property listeners instead of notifications.
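A rough sketch of that older C approach, assuming you have already called AudioSessionInitialize elsewhere (myRouteChangeListener is a name I made up for illustration):

#import <AudioToolbox/AudioToolbox.h>

// called by the system whenever the observed property changes
static void myRouteChangeListener(void *inClientData,
                                  AudioSessionPropertyID inID,
                                  UInt32 inDataSize,
                                  const void *inData) {
    if (inID == kAudioSessionProperty_AudioRouteChange) {
        NSLog(@"audio route changed");
    }
}

// register the listener, e.g. after AudioSessionInitialize
AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                myRouteChangeListener,
                                NULL);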
I'll finish on this note: you DON'T always get what you want from the system. There are interruption notifications to observe and a lot of error checking to do. I think this is a really good question and I hope this sheds some light on what you are trying to achieve.