Audio Handling in iPhone Games: Streaming File Playback

Streaming playback means using Audio File Stream Services (AudioStream) and an Audio Queue to play a file. The advantage is that playback starts quickly, since the whole file does not have to be read up front, which makes it well suited to large files and especially to background music. The drawback is that only one file can be played at a time, and switching to another file takes a moment. But because reading a file in one go on the iPhone is only practical for roughly ten seconds of material, larger assets pretty much have to be played this way. Below I share a little of my experience, in two parts: 1. single-file playback, and 2. online-file playback.

 

1. Single-file playback

BOOL isPlaying;
/*-------------------USED FOR LOCAL FILE--------------------*/
AudioFileID audioFile;
AudioStreamBasicDescription dataFormat;
AudioStreamPacketDescription *packetDescs;

UInt64 packetIndex;
UInt32 numPacketsToRead;

BOOL repeat;
BOOL trackClosed;

/*--------------------USED FOR PUBLIC------------------------*/
BOOL trackEnded;

AudioQueueRef queue;
AudioQueueBufferRef buffers[NUM_QUEUE_BUFFERS];

These are the members that need to be declared for single-file playback; they can be defined inside the class.

2. Online-file playback

NSURL *url;

AudioFileStreamID audioFileStream; // the audio file stream parser
AudioStreamPacketDescription packetDescsQueue[kAQMaxPacketDescs]; // packet descriptions for enqueuing audio

CFReadStreamRef stream;

unsigned int fillBufferIndex; // the index of the audioQueueBuffer that is being filled
size_t bytesFilled; // how many bytes have been filled
size_t packetsFilled; // how many packets have been filled

bool inuse[kNumAQBufs]; // flags to indicate that a buffer is still in use
bool started; // flag to indicate that the queue has been started
bool failed; // flag to indicate an error occurred
bool discontinuous; // flag to trigger bug-avoidance

pthread_mutex_t mutex; // a mutex to protect the inuse flags
pthread_cond_t cond; // a condition variable for handling the inuse flags
pthread_mutex_t mutex2; // a mutex to protect the AudioQueue buffer
BOOL trackEnded;

AudioQueueRef queue;
AudioQueueBufferRef buffers[kNumAQBufs]; // streaming fills up to kNumAQBufs buffers, so size the array accordingly

Online files are played over HTTP/1.1: a CFReadStream delivers the HTTP response, AudioFileStream parses the bytes into packets, and the packets are copied into AudioQueue buffers. The variables above are what online playback needs.

#define NUM_QUEUE_BUFFERS 3
#define kNumAQBufs 6 // number of audio queue buffers we allocate
#define kAQBufSize 32 * 1024 // number of bytes in each audio queue buffer
#define kAQMaxPacketDescs 512 // number of packet descriptions in our array

 

These constants configure the buffering: NUM_QUEUE_BUFFERS is used for local-file playback, while kNumAQBufs is used for online playback. With kAQBufSize = 32 KB and kNumAQBufs = 6, at most about 192 KB of streamed audio is queued at any time; local playback uses NUM_QUEUE_BUFFERS (3) buffers of kxxxBufferSizeBytes (64 KB, defined in the .m file) each.


3. Local-file initialization

- (id)initWithPath:(NSString*)path
{
UInt32 size, maxPacketSize;
char *cookie;
int i;

if (kxxxTrackActive)
{
NSLog(@"Other music is playing.");
return nil;
}

if (path == nil) return nil;
if(!(self = [super init])) return nil;

// try to open up the file using the specified path
if (noErr != AudioFileOpenURL((CFURLRef)[NSURL fileURLWithPath:path], 0x01, 0, &audioFile))
{
NSLog(@"File can not be opened!");
return nil;
}

// get the data format of the file
size = sizeof(dataFormat);
AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat, &size, &dataFormat);

// create a new playback queue using the specified data format and buffer callback
AudioQueueNewOutput(&dataFormat, BufferCallback, self, nil, nil, 0, &queue);

// calculate number of packets to read and allocate space for packet descriptions if needed
if (dataFormat.mBytesPerPacket == 0 || dataFormat.mFramesPerPacket == 0)
{
// Ask Core Audio to give us a conservative estimate of the largest packet
size = sizeof(maxPacketSize);
AudioFileGetProperty(audioFile, kAudioFilePropertyPacketSizeUpperBound, &size, &maxPacketSize);
if (maxPacketSize > kxxxBufferSizeBytes)
{
/*Limitation for the maximum buffer size*/
maxPacketSize = kxxxBufferSizeBytes;
NSLog(@"Size out of bounds!");
}
// calculate how many packs to read
numPacketsToRead = kxxxBufferSizeBytes / maxPacketSize;

// will need a packet description for each packet to allocate space accordingly
packetDescs = malloc(sizeof(AudioStreamPacketDescription) * numPacketsToRead);
}
else
{
// constant bitrate
numPacketsToRead = kxxxBufferSizeBytes / dataFormat.mBytesPerPacket;

// don't need packet descriptions for CBR data
packetDescs = nil;
}

// see if file uses a magic cookie (a magic cookie is meta data which some formats use)
AudioFileGetPropertyInfo(audioFile, kAudioFilePropertyMagicCookieData, &size, nil);
if (size > 0)
{
// copy the cookie data from the file into the audio queue
cookie = malloc(sizeof(char) * size);
AudioFileGetProperty(audioFile, kAudioFilePropertyMagicCookieData, &size, cookie);
AudioQueueSetProperty(queue, kAudioQueueProperty_MagicCookie, cookie, size);
free(cookie);
}

// we want to know when the playing state changes so we can properly dispose of the audio queue when it's done
AudioQueueAddPropertyListener(queue, kAudioQueueProperty_IsRunning, propertyListenerCallback, self);

// allocate and prime buffers with some data
packetIndex = 0;
for (i = 0; i < NUM_QUEUE_BUFFERS; i++)
{
AudioQueueAllocateBuffer(queue, kxxxBufferSizeBytes, &buffers[i]);
if ([self readPacketsIntoBuffer:buffers[i]] == 0)
{
// this might happen if the file was so short that it needed fewer buffers than we planned on using
break;
}
}
repeat = NO;
trackClosed = NO;
trackEnded = NO;
kxxxTrackActive = YES;
return self;
}
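
For reference, here is a minimal usage sketch of the local-file path. It is my own illustration (the resource name and volume value are arbitrary) and only uses methods declared in the header further below.

// Hypothetical caller code; assumes the player class below (called xxxxx in this post).
NSString *bgmPath = [[NSBundle mainBundle] pathForResource:@"bgm" ofType:@"mp3"]; // example resource
xxxxx *bgm = [[xxxxx alloc] initWithPath:bgmPath];
if (bgm != nil)
{
    [bgm setRepeat:YES];   // loop it as background music
    [bgm setGain:0.8f];    // optional volume adjustment
    [bgm play];
}

// ... later, when the music is no longer needed ...
[bgm close];
[bgm release];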

 

4. Online-file initialization

- (id)initWithURL:(NSURL*)newUrl
{
self = [super init];
if (self != nil)
{
url = [newUrl retain];
}
return self;
}
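
A matching sketch for the streaming path (again my own illustration; the URL is a placeholder, and the methods are the ones declared in the header further below). playURL detaches a background thread that runs the CFReadStream run loop, so the caller only has to start and stop it.

// Hypothetical caller code for HTTP streaming.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/bgm.mp3"]; // placeholder URL
xxxxx *streamPlayer = [[xxxxx alloc] initWithURL:streamURL];
[streamPlayer setPlayingWhenAutoLock]; // keep playing when the device auto-locks
[streamPlayer playURL];                // detaches a thread that runs startPlay

// ... later ...
[streamPlayer stopURL];
[streamPlayer release];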


All right, enough talk; here is the complete code. I will explain it piece by piece when I have more time.

The .h file

#ifdef TARGET_OS_IPHONE 
#import <UIKit/UIKit.h>
#else
#import <Cocoa/Cocoa.h>
#endif // TARGET_OS_IPHONE


#import <AudioToolbox/AudioQueue.h>
#import <AudioToolbox/AudioFile.h>
#include <pthread.h>
#include <AudioToolbox/AudioToolbox.h>


#define NUM_QUEUE_BUFFERS 3
#define kNumAQBufs 6 // number of audio queue buffers we allocate
#define kAQBufSize 32 * 1024 // number of bytes in each audio queue buffer
#define kAQMaxPacketDescs 512 // number of packet descriptions in our array


@interface xxxxx : NSObject {
/*-----------------USED FOR HTTP STREAM------------------*/
NSURL *url;
BOOL isPlaying;

@public
AudioFileStreamID audioFileStream; // the audio file stream parser
AudioStreamPacketDescription packetDescsQueue[kAQMaxPacketDescs]; // packet descriptions for enqueuing audio

CFReadStreamRef stream;

unsigned int fillBufferIndex; // the index of the audioQueueBuffer that is being filled
size_t bytesFilled; // how many bytes have been filled
size_t packetsFilled; // how many packets have been filled

bool inuse[kNumAQBufs]; // flags to indicate that a buffer is still in use
bool started; // flag to indicate that the queue has been started
bool failed; // flag to indicate an error occurred
bool discontinuous; // flag to trigger bug-avoidance

pthread_mutex_t mutex; // a mutex to protect the inuse flags
pthread_cond_t cond; // a condition variable for handling the inuse flags
pthread_mutex_t mutex2; // a mutex to protect the AudioQueue buffer

/*-------------------USED FOR LOCAL FILE--------------------*/
AudioFileID audioFile;
AudioStreamBasicDescription dataFormat;
AudioStreamPacketDescription *packetDescs;

UInt64 packetIndex;
UInt32 numPacketsToRead;

BOOL repeat;
BOOL trackClosed;

/*--------------------USED FOR PUBLIC------------------------*/
BOOL trackEnded;

AudioQueueRef queue;
AudioQueueBufferRef buffers[kNumAQBufs]; // sized for the larger count; streaming fills up to kNumAQBufs buffers
}


@property BOOL isPlaying;
@property BOOL trackClosed;


- (id) initWithURL:(NSURL*) newURL;
- (id) initWithPath:(NSString*) path;


- (void) setGain:(Float32)gain;
- (void) setRepeat:(BOOL)yn;
- (void) setPlayingWhenAutoLock;


- (void) play;
- (void) playURL;


- (void) pause;
- (void) stopURL;


- (void) close;


@end


extern NSString *xxxTrackFinishedPlayingNotification;

 

 

The .m file

#import "xxxxx.h" 
#import <CFNetwork/CFNetwork.h>


static UInt32 kxxxBufferSizeBytes = 0x10000; // 64k
static BOOL kxxxTrackActive = NO;
NSString *xxxTrackFinishedPlayingNotification = @"xxxTrackFinishedPlayingNotification";


#pragma mark -
#pragma mark CFReadStream Callback Function Prototypes


void ReadStreamCallBack(CFReadStreamRef stream, CFStreamEventType eventType, void* dataIn);


#pragma mark -
#pragma mark Audio Callback Function Prototypes


void MyAudioQueueOutputCallback(void* inClientData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer);
void MyAudioQueueIsRunningCallback(void *inUserData, AudioQueueRef inAQ, AudioQueuePropertyID inID);
void MyPropertyListenerProc(void *inClientData, AudioFileStreamID inAudioFileStream, AudioFileStreamPropertyID inPropertyID, UInt32 *ioFlags);
void MyPacketsProc(void *inClientData, UInt32 inNumberBytes, UInt32 inNumberPackets, const void *inInputData, AudioStreamPacketDescription *inPacketDescriptions);
OSStatus MyEnqueueBuffer(xxxxx* myData);


#ifdef TARGET_OS_IPHONE 
void MyAudioSessionInterruptionListener(void *inClientData, UInt32 inInterruptionState);
#endif


#pragma mark -
#pragma mark Audio Callback Function Implementations


//
// MyPropertyListenerProc
//
// Receives notification when the AudioFileStream has audio packets to be
// played. In response, this function creates the AudioQueue, getting it
// ready to begin playback (playback won't begin until audio packets are
// sent to the queue in MyEnqueueBuffer).
//
// This function is adapted from Apple's example in AudioFileStreamExample with
// kAudioQueueProperty_IsRunning listening added.
//
void MyPropertyListenerProc(void *inClientData, AudioFileStreamID inAudioFileStream, AudioFileStreamPropertyID inPropertyID, UInt32 *ioFlags)
{ 
// this is called by audio file stream when it finds property values
xxxxx* myData = (xxxxx*)inClientData;
OSStatus err = noErr;

switch (inPropertyID) {
case kAudioFileStreamProperty_ReadyToProducePackets :
{
myData->discontinuous = true;

// the file stream parser is now ready to produce audio packets.
// get the stream format.
AudioStreamBasicDescription asbd;
UInt32 asbdSize = sizeof(asbd);
err = AudioFileStreamGetProperty(inAudioFileStream, kAudioFileStreamProperty_DataFormat, &asbdSize, &asbd);
if (err) { NSLog(@"get kAudioFileStreamProperty_DataFormat"); myData->failed = true; break; }

// create the audio queue
err = AudioQueueNewOutput(&asbd, MyAudioQueueOutputCallback, myData, NULL, NULL, 0, &myData->queue);
if (err) { NSLog(@"AudioQueueNewOutput"); myData->failed = true; break; }

// listen to the "isRunning" property
err = AudioQueueAddPropertyListener(myData->queue, kAudioQueueProperty_IsRunning, MyAudioQueueIsRunningCallback, myData);
if (err) { NSLog(@"AudioQueueAddPropertyListener"); myData->failed = true; break; }

// allocate audio queue buffers
for (unsigned int i = 0; i < kNumAQBufs; ++i) {
err = AudioQueueAllocateBuffer(myData->queue, kAQBufSize, &myData->buffers[i]);
if (err) { NSLog(@"AudioQueueAllocateBuffer"); myData->failed = true; break; }
}

// get the cookie size
UInt32 cookieSize;
Boolean writable;
err = AudioFileStreamGetPropertyInfo(inAudioFileStream, kAudioFileStreamProperty_MagicCookieData, &cookieSize, &writable);
if (err) { NSLog(@"info kAudioFileStreamProperty_MagicCookieData"); break; }

// get the cookie data
void* cookieData = calloc(1, cookieSize);
err = AudioFileStreamGetProperty(inAudioFileStream, kAudioFileStreamProperty_MagicCookieData, &cookieSize, cookieData);
if (err) { NSLog(@"get kAudioFileStreamProperty_MagicCookieData"); free(cookieData); break; }

// set the cookie on the queue.
err = AudioQueueSetProperty(myData->queue, kAudioQueueProperty_MagicCookie, cookieData, cookieSize);
free(cookieData);
if (err) { NSLog(@"set kAudioQueueProperty_MagicCookie"); break; }
break;
}
}
}


//
// MyPacketsProc
//
// When the AudioStream has packets to be played, this function gets an
// idle audio buffer and copies the audio packets into it. The calls to
// MyEnqueueBuffer won't return until there are buffers available (or the
// playback has been stopped).
//
// This function is adapted from Apple's example in AudioFileStreamExample with
// CBR functionality added.
//
void MyPacketsProc(void *inClientData, UInt32 inNumberBytes, UInt32 inNumberPackets, const void *inInputData, AudioStreamPacketDescription *inPacketDescriptions)
{
// this is called by audio file stream when it finds packets of audio
xxxxx* myData = (xxxxx*)inClientData;

// we have successfully read the first packets from the audio stream, so
// clear the "discontinuous" flag
myData->discontinuous = false;

// the following code assumes we're streaming VBR data. for CBR data, the second branch is used.
if (inPacketDescriptions)
{
for (int i = 0; i < inNumberPackets; ++i) {
SInt64 packetOffset = inPacketDescriptions[i].mStartOffset;
SInt64 packetSize   = inPacketDescriptions[i].mDataByteSize;

// If the audio was terminated before this point, then
// exit.
if (myData->trackEnded)
{
return;
}

// if the space remaining in the buffer is not enough for this packet, then enqueue the buffer.
size_t bufSpaceRemaining = kAQBufSize - myData->bytesFilled;
if (bufSpaceRemaining < packetSize) {
MyEnqueueBuffer(myData);
}

pthread_mutex_lock(&myData->mutex2);

// If the audio was terminated while waiting for a buffer, then
// exit.
if (myData->trackEnded)
{
pthread_mutex_unlock(&myData->mutex2);
return;
}

// copy data to the audio queue buffer
AudioQueueBufferRef fillBuf = myData->buffers[myData->fillBufferIndex];
memcpy((char*)fillBuf->mAudioData + myData->bytesFilled, (const char*)inInputData + packetOffset, packetSize);

pthread_mutex_unlock(&myData->mutex2);

// fill out packet description
myData->packetDescsQueue[myData->packetsFilled] = inPacketDescriptions[i];
myData->packetDescsQueue[myData->packetsFilled].mStartOffset = myData->bytesFilled;
// keep track of bytes filled and packets filled
myData->bytesFilled += packetSize;
myData->packetsFilled += 1;

// if that was the last free packet description, then enqueue the buffer.
size_t packetsDescsRemaining = kAQMaxPacketDescs - myData->packetsFilled;
if (packetsDescsRemaining == 0) {
MyEnqueueBuffer(myData);
}
} 
}
else
{
size_t offset = 0;
while (inNumberBytes)
{
// if the space remaining in the buffer is not enough for this packet, then enqueue the buffer.
size_t bufSpaceRemaining = kAQBufSize - myData->bytesFilled;
if (bufSpaceRemaining < inNumberBytes) {
MyEnqueueBuffer(myData);
}

pthread_mutex_lock(&myData->mutex2);

// If the audio was terminated while waiting for a buffer, then
// exit.
if (myData->trackEnded)
{
pthread_mutex_unlock(&myData->mutex2);
return;
}

// copy data to the audio queue buffer
AudioQueueBufferRef fillBuf = myData->buffers[myData->fillBufferIndex];
bufSpaceRemaining = kAQBufSize - myData->bytesFilled;
size_t copySize;
if (bufSpaceRemaining < inNumberBytes)
{
copySize = bufSpaceRemaining;
}
else
{
copySize = inNumberBytes;
}
memcpy((char*)fillBuf->mAudioData + myData->bytesFilled, (const char*)inInputData + offset, copySize);

pthread_mutex_unlock(&myData->mutex2);

// keep track of bytes filled and packets filled
myData->bytesFilled += copySize;
myData->packetsFilled = 0;
inNumberBytes -= copySize;
offset += copySize;
}
}
}


//
// MyEnqueueBuffer
//
// Called from MyPacketsProc and connectionDidFinishLoading to pass filled audio
// buffers (filled by MyPacketsProc) to the AudioQueue for playback. This
// function does not return until a buffer is idle for further filling or
// the AudioQueue is stopped.
//
// This function is adapted from Apple's example in AudioFileStreamExample with
// CBR functionality added.
//
OSStatus MyEnqueueBuffer(xxxxx* myData)
{
OSStatus err = noErr;
myData->inuse[myData->fillBufferIndex] = true; // set in use flag

// enqueue buffer
AudioQueueBufferRef fillBuf = myData->buffers[myData->fillBufferIndex];
fillBuf->mAudioDataByteSize = myData->bytesFilled;

if (myData->packetsFilled)
{
err = AudioQueueEnqueueBuffer(myData->queue, fillBuf, myData->packetsFilled, myData->packetDescsQueue);
}
else
{
err = AudioQueueEnqueueBuffer(myData->queue, fillBuf, 0, NULL);
}

if (err) { NSLog(@"AudioQueueEnqueueBuffer"); myData->failed = true; return err; } 

if (!myData->started) { // start the queue if it has not been started already
err = AudioQueueStart(myData->queue, NULL);
if (err) { NSLog(@"AudioQueueStart"); myData->failed = true; return err; } 
myData->started = true;
}

// go to next buffer
if (++myData->fillBufferIndex >= kNumAQBufs) myData->fillBufferIndex = 0;
myData->bytesFilled = 0; // reset bytes filled
myData->packetsFilled = 0; // reset packets filled

// wait until next buffer is not in use
pthread_mutex_lock(&myData->mutex); 
while (myData->inuse[myData->fillBufferIndex] && !myData->trackEnded)
{
pthread_cond_wait(&myData->cond, &myData->mutex);
}
pthread_mutex_unlock(&myData->mutex);

return err;
}


//
// MyFindQueueBuffer
//
// Returns the index of the specified buffer in the audioQueueBuffer array.
//
// This function is unchanged from Apple's example in AudioFileStreamExample.
//
int MyFindQueueBuffer(xxxxx* myData, AudioQueueBufferRef inBuffer)
{
for (unsigned int i = 0; i < kNumAQBufs; ++i) {
if (inBuffer == myData->buffers[i]) 
return i;
}
return -1;
}


//
// MyAudioQueueOutputCallback
//
// Called from the AudioQueue when playback of specific buffers completes. This
// function signals from the AudioQueue thread to the AudioStream thread that
// the buffer is idle and available for copying data.
//
// This function is unchanged from Apple's example in AudioFileStreamExample.
//
void MyAudioQueueOutputCallback(void* inClientData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer)
{
// this is called by the audio queue when it has finished decoding our data. 
// The buffer is now free to be reused.
xxxxx* myData = (xxxxx*)inClientData;
unsigned int bufIndex = MyFindQueueBuffer(myData, inBuffer);

// signal waiting thread that the buffer is free.
pthread_mutex_lock(&myData->mutex);
myData->inuse[bufIndex] = false;
pthread_cond_signal(&myData->cond);
pthread_mutex_unlock(&myData->mutex);
}


//
// MyAudioQueueIsRunningCallback
//
// Called from the AudioQueue when playback is started or stopped. This
// information is used to toggle the observable "isPlaying" property and
// set the "finished" flag.
//
void MyAudioQueueIsRunningCallback(void *inUserData, AudioQueueRef inAQ, AudioQueuePropertyID inID)
{
xxxxx *myData = (xxxxx *)inUserData;
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

if (myData.isPlaying)
{
myData->trackEnded = true;
myData.isPlaying = false;

#ifdef TARGET_OS_IPHONE 
AudioSessionSetActive(false);
#endif
}
else
{
myData.isPlaying = true;
if (myData->trackEnded)
{
myData.isPlaying = false;
}

//
// Note about this bug avoidance quirk:
//
// On cleanup of the AudioQueue thread, on rare occasions, there would
// be a crash in CFSetContainsValue as a CFRunLoopObserver was getting
// removed from the CFRunLoop.
//
// After lots of testing, it appeared that the audio thread was
// attempting to remove CFRunLoop observers from the CFRunLoop after the
// thread had already deallocated the run loop.
//
// By creating an NSRunLoop for the AudioQueue thread, it changes the
// thread destruction order and seems to avoid this crash bug -- or
// at least I haven't had it since (nasty hard to reproduce error!)
//
[NSRunLoop currentRunLoop];
}

[pool release];
}


#ifdef TARGET_OS_IPHONE 
//
// MyAudioSessionInterruptionListener
//
// Invoked if the audio session is interrupted (like when the phone rings)
//
void MyAudioSessionInterruptionListener(void *inClientData, UInt32 inInterruptionState)
{
}
#endif


#pragma mark -
#pragma mark CFReadStream Callback Function Implementations


//
// ReadStreamCallBack
//
// This is the callback for the CFReadStream from the network connection. This
// is where all network data is passed to the AudioFileStream.
//
// Invoked when an error occurs, the stream ends or we have data to read.
//
void ReadStreamCallBack(CFReadStreamRef stream, CFStreamEventType eventType, void* dataIn)
{
xxxxx *myData = (xxxxx *)dataIn;

if (eventType == kCFStreamEventErrorOccurred)
{
myData->failed = YES;
}
else if (eventType == kCFStreamEventEndEncountered)
{
if (myData->failed || myData->trackEnded)
{
return;
}

//
// If there is a partially filled buffer, pass it to the AudioQueue for
// processing
//
if (myData->bytesFilled)
{
MyEnqueueBuffer(myData);
}

//
// If the AudioQueue started, then flush it (to make certain everything
// sent thus far will be processed) and subsequently stop the queue.
//
if (myData->started)
{
OSStatus err = AudioQueueFlush(myData->queue);
if (err) { NSLog(@"AudioQueueFlush"); return; }

err = AudioQueueStop(myData->queue, false);
if (err) { NSLog(@"AudioQueueStop"); return; }

CFReadStreamClose(stream);
CFRelease(stream);
myData->stream = nil;
}
else
{
//
// If we have reached the end of the file without starting, then we
// have failed to find any audio in the file. Abort.
//
myData->failed = YES;
}
}
else if (eventType == kCFStreamEventHasBytesAvailable)
{
if (myData->failed || myData->trackEnded)
{
return;
}

//
// Read the bytes from the stream
//
UInt8 bytes[kAQBufSize];
CFIndex length = CFReadStreamRead(stream, bytes, kAQBufSize);

if (length == -1)
{
myData->failed = YES;
return;
}

//
// Parse the bytes read by sending them through the AudioFileStream
//
if (length > 0)
{
if (myData->discontinuous)
{
OSStatus err = AudioFileStreamParseBytes(myData->audioFileStream, length, bytes, kAudioFileStreamParseFlag_Discontinuity);
if (err) { NSLog(@"AudioFileStreamParseBytes"); myData->failed = true;}
}
else
{
OSStatus err = AudioFileStreamParseBytes(myData->audioFileStream, length, bytes, 0);
if (err) { NSLog(@"AudioFileStreamParseBytes"); myData->failed = true; }
}
}
}
}


@interface xxxxx (private)


static void propertyListenerCallback(void *inUserData, AudioQueueRef queueObject, AudioQueuePropertyID propertyID);
- (void) playBackIsRunningStateChanged;


static void BufferCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef buffer);
- (void) callbackForBuffer:(AudioQueueBufferRef)buffer;
- (UInt32) readPacketsIntoBuffer:(AudioQueueBufferRef)buffer;


@end


@implementation xxxxx


@synthesize isPlaying, trackClosed;


#pragma mark -
#pragma mark xxxxx


- (void)dealloc
{
[self close];
if (packetDescs != nil)
free(packetDescs);
[url release];
[super dealloc];
}


- (void)close
{
// it is preferable to call close first, if there is a problem waiting for an autorelease
if (trackClosed)
return;
trackClosed = YES;
AudioQueueStop(queue, YES); // <-- YES means stop immediately
AudioQueueDispose(queue, YES);
AudioFileClose(audioFile);
kxxxTrackActive = NO;
}


- (id)initWithURL:(NSURL*)newUrl
{
self = [super init];
if (self != nil)
{
url = [newUrl retain];
}
return self;
}


- (id)initWithPath:(NSString*)path
{
UInt32 size, maxPacketSize;
char *cookie;
int i;

if (kxxxTrackActive)
{
NSLog(@"Other music is playing.");
return nil;
}

if (path == nil) return nil;
if(!(self = [super init])) return nil;

// try to open up the file using the specified path
if (noErr != AudioFileOpenURL((CFURLRef)[NSURL fileURLWithPath:path], 0x01, 0, &audioFile))
{
NSLog(@"File can not be opened!");
return nil;
}

// get the data format of the file
size = sizeof(dataFormat);
AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat, &size, &dataFormat);

// create a new playback queue using the specified data format and buffer callback
AudioQueueNewOutput(&dataFormat, BufferCallback, self, nil, nil, 0, &queue);

// calculate number of packets to read and allocate space for packet descriptions if needed
if (dataFormat.mBytesPerPacket == 0 || dataFormat.mFramesPerPacket == 0)
{
// Ask Core Audio to give us a conservative estimate of the largest packet
size = sizeof(maxPacketSize);
AudioFileGetProperty(audioFile, kAudioFilePropertyPacketSizeUpperBound, &size, &maxPacketSize);
if (maxPacketSize > kxxxBufferSizeBytes)
{
/*Limitation for the maximum buffer size*/
maxPacketSize = kxxxBufferSizeBytes;
NSLog(@"Size out of bounds!");
}
// calculate how many packs to read
numPacketsToRead = kxxxBufferSizeBytes / maxPacketSize;

// will need a packet description for each packet to allocate space accordingly
packetDescs = malloc(sizeof(AudioStreamPacketDescription) * numPacketsToRead);
}
else
{
// constant bitrate
numPacketsToRead = kxxxBufferSizeBytes / dataFormat.mBytesPerPacket;

// don't need packet descriptions for CBR data
packetDescs = nil;
}

// see if file uses a magic cookie (a magic cookie is meta data which some formats use)
AudioFileGetPropertyInfo(audioFile, kAudioFilePropertyMagicCookieData, &size, nil);
if (size > 0)
{
// copy the cookie data from the file into the audio queue
cookie = malloc(sizeof(char) * size);
AudioFileGetProperty(audioFile, kAudioFilePropertyMagicCookieData, &size, cookie);
AudioQueueSetProperty(queue, kAudioQueueProperty_MagicCookie, cookie, size);
free(cookie);
}

// we want to know when the playing state changes so we can properly dispose of the audio queue when it's done
AudioQueueAddPropertyListener(queue, kAudioQueueProperty_IsRunning, propertyListenerCallback, self);

// allocate and prime buffers with some data
packetIndex = 0;
for (i = 0; i < NUM_QUEUE_BUFFERS; i++)
{
AudioQueueAllocateBuffer(queue, kxxxBufferSizeBytes, &buffers[i]);
if ([self readPacketsIntoBuffer:buffers[i]] == 0)
{
// this might happen if the file was so short that it needed fewer buffers than we planned on using
break;
}
}
repeat = NO;
trackClosed = NO;
trackEnded = NO;
kxxxTrackActive = YES;
return self;
}


- (void) setGain:(Float32) gain
{
if (trackClosed)
return;
AudioQueueSetParameter(queue, kAudioQueueParam_Volume, gain);
}


- (void) setRepeat:(BOOL) yn
{
repeat = yn;
}


- (void) play
{
if (trackClosed)
return;

OSStatus result = AudioQueuePrime(queue, 1, nil); 
if (result)
{
NSLog(@"play: error priming AudioQueue");
return;
}
AudioQueueStart(queue, nil);
}


- (void) playURL{
[NSThread detachNewThreadSelector:@selector(startPlay) toTarget:self withObject:nil];
}


- (void) stopURL
{
if (stream)
{
CFReadStreamClose(stream);
        CFRelease(stream);
stream = nil;

if (trackEnded)
{
return;
}

if (started)
{
//
// Set finished to true *before* we call stop. This is to handle our
// third thread...
// - This method is called from main (UI) thread
// - The AudioQueue thread (which owns the AudioQueue buffers and
// will delete them as soon as we call AudioQueueStop)
// - URL connection thread is copying data from AudioStream to
// AudioQueue buffer
// We set this flag to tell the URL connection thread to stop
// copying.
//
pthread_mutex_lock(&mutex2);
trackEnded = true;

OSStatus err = AudioQueueStop(queue, true);
if (err) { NSLog(@"AudioQueueStop"); }
pthread_mutex_unlock(&mutex2);

pthread_mutex_lock(&mutex);
pthread_cond_signal(&cond);
pthread_mutex_unlock(&mutex);
}
else
{
trackEnded = true;
self.isPlaying = YES;
self.isPlaying = NO;
}
}
}


- (void) setPlayingWhenAutoLock {
#ifdef TARGET_OS_IPHONE 
// Set the audio session category so that we continue to play if the iPhone/iPod auto-locks.
AudioSessionInitialize (NULL,                          // 'NULL' to use the default (main) run loop
NULL,                          // 'NULL' to use the default run loop mode
MyAudioSessionInterruptionListener,  // a reference to your interruption callback
self                       // data to pass to your interruption listener callback
);
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty (kAudioSessionProperty_AudioCategory, sizeof (sessionCategory), &sessionCategory);
AudioSessionSetActive(true);
#endif
}


- (void) startPlay{
[self retain];

//NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

//
// Attempt to guess the file type from the URL. Reading the MIME type
// from the CFReadStream would be a better approach since lots of
// URLs don't have the right extension.
//
// If you have a fixed file-type, you may want to hardcode this.
//
AudioFileTypeID fileTypeHint = kAudioFileMP3Type;
NSString *fileExtension = [[url path] pathExtension];
if ([fileExtension isEqual:@"mp3"])
{
fileTypeHint = kAudioFileMP3Type;
}
else if ([fileExtension isEqual:@"wav"])
{
fileTypeHint = kAudioFileWAVEType;
}
else if ([fileExtension isEqual:@"aifc"])
{
fileTypeHint = kAudioFileAIFCType;
}
else if ([fileExtension isEqual:@"aiff"])
{
fileTypeHint = kAudioFileAIFFType;
}
else if ([fileExtension isEqual:@"m4a"])
{
fileTypeHint = kAudioFileM4AType;
}
else if ([fileExtension isEqual:@"mp4"])
{
fileTypeHint = kAudioFileMPEG4Type;
}
else if ([fileExtension isEqual:@"caf"])
{
fileTypeHint = kAudioFileCAFType;
}
else if ([fileExtension isEqual:@"aac"])
{
fileTypeHint = kAudioFileAAC_ADTSType;
}

// initialize a mutex and condition so that we can block on buffers in use.
pthread_mutex_init(&mutex, NULL);
pthread_cond_init(&cond, NULL);
pthread_mutex_init(&mutex2, NULL);

// create an audio file stream parser
OSStatus err = AudioFileStreamOpen(self, MyPropertyListenerProc, MyPacketsProc, fileTypeHint, &audioFileStream);
if (err) { NSLog(@"AudioFileStreamOpen"); goto cleanup; }

//
// Create the GET request
//
CFHTTPMessageRef message= CFHTTPMessageCreateRequest(NULL, (CFStringRef)@"GET", (CFURLRef)url, kCFHTTPVersion1_1);
stream = CFReadStreamCreateForHTTPRequest(NULL, message);
    CFRelease(message);
if (!CFReadStreamOpen(stream))
{
        CFRelease(stream);
goto cleanup;
    }

//
// Set our callback function to receive the data
//
CFStreamClientContext context = {0, self, NULL, NULL, NULL};
CFReadStreamSetClient(stream, kCFStreamEventHasBytesAvailable | kCFStreamEventErrorOccurred | kCFStreamEventEndEncountered, ReadStreamCallBack, &context);
CFReadStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes);

//
// Process the run loop until playback is finished or failed.
//
do
{
CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0.25, false);

if (failed)
{
[self stopURL];

#ifdef TARGET_OS_IPHONE 
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:NSLocalizedStringFromTable(@"Audio Error", @"Errors", nil) message:NSLocalizedStringFromTable(@"Attempt to play streaming audio failed.", @"Errors", nil) delegate:self cancelButtonTitle:@"OK" otherButtonTitles: nil];
[alert performSelector:@selector(show) onThread:[NSThread mainThread] withObject:nil waitUntilDone:YES];
[alert release];
#else
NSAlert *alert = [NSAlert alertWithMessageText:NSLocalizedString(@"Audio Error", @"") defaultButton:NSLocalizedString(@"OK", @"") alternateButton:nil otherButton:nil informativeTextWithFormat:@"Attempt to play streaming audio failed."];
[alert performSelector:@selector(runModal) onThread:[NSThread mainThread] withObject:nil waitUntilDone:NO];
#endif

break;
}
} while (isPlaying || !trackEnded);

cleanup:

//
// Cleanup the read stream if it is still open
//
if (stream)
{
CFReadStreamClose(stream);
        CFRelease(stream);
stream = nil;
}

//
// Close the audio file stream
//
err = AudioFileStreamClose(audioFileStream);
if (err) { NSLog(@"AudioFileStreamClose"); goto cleanup; }

//
// Dispose of the Audio Queue
//
if (started)
{
err = AudioQueueDispose(queue, true);
if (err) { NSLog(@"AudioQueueDispose"); goto cleanup; }
}

//[pool release];
[self release];
}


- (void)pause
{
if (trackClosed)
return;
AudioQueuePause(queue);
}


#pragma mark -
#pragma mark Callback


static void propertyListenerCallback(void *inUserData, AudioQueueRef queueObject, AudioQueuePropertyID propertyID)
{
// redirect back to the class to handle it there instead, so we have direct access to the instance variables
if (propertyID == kAudioQueueProperty_IsRunning)
[(xxxxx*)inUserData playBackIsRunningStateChanged];
}


- (void)playBackIsRunningStateChanged
{
if (trackEnded)
{
// go ahead and close the track now
trackClosed = YES;
AudioQueueDispose(queue, YES);
AudioFileClose(audioFile);
kxxxTrackActive = NO;

// we're not in the main thread during this callback, so enqueue a message on the main thread to post notification
// that we're done, or else the notification will have to be handled in this thread, making things more difficult
[self performSelectorOnMainThread:@selector(postTrackFinishedPlayingNotification:) withObject:nil waitUntilDone:NO];
}
}


static void BufferCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef buffer)
{
// redirect back to the class to handle it there instead, so we have direct access to the instance variables
[(xxxxx*)inUserData callbackForBuffer:buffer];
}


- (void) callbackForBuffer:(AudioQueueBufferRef) buffer
{
// I guess it's possible for the callback to continue to be called since this is in another thread, so to be safe,
// don't do anything else if the track is closed, and also don't bother reading anymore packets if the track ended
if (trackClosed || trackEnded)
return;

if ([self readPacketsIntoBuffer:buffer] == 0)
{
if (repeat)
{
// End Of File reached, so rewind and refill the buffer using the beginning of the file instead
packetIndex = 0;
[self readPacketsIntoBuffer:buffer];
}
else
{
// set it to stop, but let it play to the end, where the property listener will pick up that it actually finished
AudioQueueStop(queue, NO);
trackEnded = YES;
}
}
}


- (void) postTrackFinishedPlayingNotification:(id) object
{
// if we're here then we're in the main thread as specified by the callback, so now we can post notification that
// the track is done without the notification observer(s) having to worry about thread safety and autorelease pools
[[NSNotificationCenter defaultCenter] postNotificationName:xxxTrackFinishedPlayingNotification object:self];
}


- (UInt32)readPacketsIntoBuffer:(AudioQueueBufferRef)buffer
{
UInt32 numBytes, numPackets;

// read packets into buffer from file
numPackets = numPacketsToRead;
AudioFileReadPackets(audioFile, NO, &numBytes, packetDescs, packetIndex, &numPackets, buffer->mAudioData);
if (numPackets > 0)
{
// - End Of File has not been reached yet since we read some packets, so enqueue the buffer we just read into
// the audio queue, to be played next
// - (packetDescs ? numPackets : 0) means that if there are packet descriptions (which are used only for Variable
// BitRate data (VBR)) we'll have to send one for each packet, otherwise zero
buffer->mAudioDataByteSize = numBytes;
AudioQueueEnqueueBuffer(queue, buffer, (packetDescs ? numPackets : 0), packetDescs);

// move ahead to be ready for next time we need to read from the file
packetIndex += numPackets;
}
return numPackets;
}


@end

 

The code contains separate versions for the pointer form and the array form, so use whichever you prefer, and the two playback modes, by URL and by local file path, are combined in the same class. The class is xxxxx.h.
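
The listing does not show the receiving end of xxxTrackFinishedPlayingNotification. Here is a small sketch of an observer; the controller class name and selector are hypothetical, and only the notification name comes from the header above.

// Hypothetical observer for the finished-playing notification exported by xxxxx.h.
@interface GameAudioController : NSObject
@end

@implementation GameAudioController

- (id)init
{
    if ((self = [super init]))
    {
        // the player posts this on the main thread when a local track finishes
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(trackDidFinish:)
                                                     name:xxxTrackFinishedPlayingNotification
                                                   object:nil];
    }
    return self;
}

- (void)trackDidFinish:(NSNotification *)note
{
    // release the finished player and start the next background track here
}

- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [super dealloc];
}

@end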


Original article: http://blog.csdn.net/favormm/article/details/5282334