Chromium Code Reviews

Unified Diff: media/audio/win/audio_low_latency_output_win.h

Issue 12049070: Avoids irregular OnMoreData callbacks on Windows using Core Audio (Closed) | Base URL: svn://svn.chromium.org/chrome/trunk/src
Patch Set: cleaned up | Created 7 years, 10 months ago
 // Copyright (c) 2012 The Chromium Authors. All rights reserved.
 // Use of this source code is governed by a BSD-style license that can be
 // found in the LICENSE file.

 // Implementation of AudioOutputStream for Windows using Windows Core Audio
 // WASAPI for low latency rendering.
 //
 // Overview of operation and performance:
 //
 // - An object of WASAPIAudioOutputStream is created by the AudioManager
 //   factory.
 // - Next some thread will call Open(), at that point the underlying
 //   Core Audio APIs are utilized to create two WASAPI interfaces called
 //   IAudioClient and IAudioRenderClient.
 // - Then some thread will call Start(source).
 //   A thread called "wasapi_render_thread" is started and this thread listens
 //   on an event signal which is set periodically by the audio engine to signal
 //   render events. As a result, OnMoreData() will be called and the registered
 //   client is then expected to provide data samples to be played out.
 // - At some point, a thread will call Stop(), which stops and joins the
 //   render thread and at the same time stops audio streaming.
 // - The same thread that called stop will call Close() where we cleanup
 //   and notify the audio manager, which likely will destroy this object.
-// - Initial tests on Windows 7 shows that this implementation results in a
-//   latency of approximately 35 ms if the selected packet size is less than
-//   or equal to 20 ms. Using a packet size of 10 ms does not result in a
-//   lower latency but only affects the size of the data buffer in each
-//   OnMoreData() callback.
 // - A total typical delay of 35 ms contains three parts:
 //    o Audio endpoint device period (~10 ms).
 //    o Stream latency between the buffer and endpoint device (~5 ms).
 //    o Endpoint buffer (~20 ms to ensure glitch-free rendering).
-// - Note that, if the user selects a packet size of e.g. 100 ms, the total
-//   delay will be approximately 115 ms (10 + 5 + 100).
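Note: the event-driven render loop described in the overview above (the "wasapi_render_thread" waiting on an engine event and a stop event) is only declared in this header. For orientation, here is a minimal sketch of how such a loop is commonly written against the raw WASAPI interfaces; the function and variable names are illustrative only and the actual implementation in the .cc file may differ (error handling omitted).

#include <windows.h>
#include <audioclient.h>

// Illustrative sketch only: waits for either a stop signal or a "buffer ready"
// signal from the audio engine, then tops up the endpoint buffer.
void RenderLoop(IAudioClient* audio_client,
                IAudioRenderClient* render_client,
                HANDLE audio_samples_render_event,
                HANDLE stop_render_event,
                UINT32 endpoint_buffer_size_frames) {
  HANDLE wait_array[] = { stop_render_event, audio_samples_render_event };
  bool playing = true;
  while (playing) {
    DWORD result = WaitForMultipleObjects(2, wait_array, FALSE, INFINITE);
    if (result == WAIT_OBJECT_0) {
      playing = false;  // |stop_render_event| was signaled.
      continue;
    }
    // The audio engine signaled that space is available in the endpoint
    // buffer. In shared mode, the free space equals the buffer size minus
    // the current padding.
    UINT32 padding = 0;
    if (FAILED(audio_client->GetCurrentPadding(&padding)))
      continue;
    UINT32 frames_available = endpoint_buffer_size_frames - padding;
    BYTE* data = NULL;
    if (SUCCEEDED(render_client->GetBuffer(frames_available, &data))) {
      // This is where OnMoreData() would be asked for |frames_available|
      // frames of PCM audio to copy into |data|.
      render_client->ReleaseBuffer(frames_available, 0);
    }
  }
}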
 //
 // Implementation notes:
 //
 // - The minimum supported client is Windows Vista.
 // - This implementation is single-threaded, hence:
 //    o Construction and destruction must take place from the same thread.
 //    o All APIs must be called from the creating thread as well.
-// - It is recommended to first acquire the native sample rate of the default
-//   input device and then use the same rate when creating this object. Use
-//   WASAPIAudioOutputStream::HardwareSampleRate() to retrieve the sample rate.
+// - It is required to first acquire the native audio parameters of the default
+//   output device and then use the same rate when creating this object. Use
+//   e.g. WASAPIAudioOutputStream::HardwareSampleRate() to retrieve the sample
+//   rate. Open() will fail unless "perfect" audio parameters are utilized.
 // - Calling Close() also leads to self destruction.
-// - Stream switching is not supported if the user shifts the audio device
-//   after Open() is called but before Start() has been called.
-// - Stream switching can fail if streaming starts on one device with a
-//   supported format (X) and the new default device - to which we would like
-//   to switch - uses another format (Y), which is not supported given the
-//   configured audio parameters.
-// - The audio device must be opened with the same number of channels as it
-//   supports natively (see HardwareChannelCount()) otherwise Open() will fail.
 // - Support for 8-bit audio has not yet been verified and tested.
 //
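The "native audio parameters" referred to in the implementation notes above come from the audio engine's shared-mode mix format. Below is a minimal sketch of how the native sample rate can be queried at the Core Audio level; the helper name is illustrative only, and the CL itself exposes this value through WASAPIAudioOutputStream::HardwareSampleRate().

#include <audioclient.h>
#include <objbase.h>

// Illustrative sketch only: returns the sample rate of the audio engine's
// shared-mode mix format, or 0 on failure.
int GetNativeSampleRate(IAudioClient* audio_client) {
  WAVEFORMATEX* mix_format = NULL;
  if (FAILED(audio_client->GetMixFormat(&mix_format)))
    return 0;
  const int sample_rate = mix_format->nSamplesPerSec;
  CoTaskMemFree(mix_format);  // GetMixFormat() allocates with CoTaskMemAlloc().
  return sample_rate;
}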
 // Core Audio API details:
 //
 // - The public API methods (Open(), Start(), Stop() and Close()) must be
 //   called on constructing thread. The reason is that we want to ensure that
 //   the COM environment is the same for all API implementations.
 // - Utilized MMDevice interfaces:
 //    o IMMDeviceEnumerator
 //    o IMMDevice

(...skipping 112 matching lines...)
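For context, the MMDevice interfaces listed above are typically combined as follows to reach the IAudioClient interface for the default render device. This is a sketch only: COM must already be initialized on the calling thread, the helper name is illustrative, and the real code holds these interfaces in base::win::ScopedComPtr rather than raw pointers.

#include <audioclient.h>
#include <mmdeviceapi.h>
#include <objbase.h>

// Illustrative sketch only: activates an IAudioClient on the default
// rendering endpoint for the eConsole role.
HRESULT CreateDefaultRenderClient(IAudioClient** audio_client) {
  IMMDeviceEnumerator* enumerator = NULL;
  HRESULT hr = CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                                __uuidof(IMMDeviceEnumerator),
                                reinterpret_cast<void**>(&enumerator));
  if (FAILED(hr))
    return hr;
  IMMDevice* endpoint_device = NULL;
  hr = enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &endpoint_device);
  if (SUCCEEDED(hr)) {
    hr = endpoint_device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL,
                                   reinterpret_cast<void**>(audio_client));
    endpoint_device->Release();
  }
  enumerator->Release();
  return hr;
}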
 //   processing/mixing of shared-mode streams for the default endpoint device.
   int GetEndpointChannelCountForTesting() { return format_.Format.nChannels; }

  private:
   // DelegateSimpleThread::Delegate implementation.
   virtual void Run() OVERRIDE;

   // Issues the OnError() callback to the |sink_|.
   void HandleError(HRESULT err);

-  // The Open() method is divided into these sub methods.
-  HRESULT SetRenderDevice();
-  HRESULT ActivateRenderDevice();
-  bool DesiredFormatIsSupported();
-  HRESULT InitializeAudioEngine();
-
-  // Called when the device will be opened in shared mode and use the
-  // internal audio engine's mix format.
-  HRESULT SharedModeInitialization();
-
   // Called when the device will be opened in exclusive mode and use the
   // application specified format.
-  HRESULT ExclusiveModeInitialization();
+  // TODO(henrika): rewrite and move to CoreAudioUtil when removing flag
+  // for exclusive audio mode.
+  HRESULT ExclusiveModeInitialization(IAudioClient* client,
+                                      HANDLE event_handle,
+                                      size_t* endpoint_buffer_size);
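For readers unfamiliar with exclusive-mode streams: when the event-callback flag is used in exclusive mode, IAudioClient::Initialize() requires the periodicity to equal the buffer duration. The sketch below shows the general shape of such an initialization; it is illustrative only and not the body of the ExclusiveModeInitialization() declared above (which, for example, reports the buffer size through a size_t*).

#include <audioclient.h>

// Illustrative sketch only: exclusive-mode, event-driven initialization.
HRESULT InitializeExclusiveMode(IAudioClient* client,
                                const WAVEFORMATEX* format,
                                HANDLE event_handle,
                                REFERENCE_TIME buffer_duration,
                                UINT32* endpoint_buffer_frames) {
  HRESULT hr = client->Initialize(AUDCLNT_SHAREMODE_EXCLUSIVE,
                                  AUDCLNT_STREAMFLAGS_EVENTCALLBACK,
                                  buffer_duration,
                                  buffer_duration,  // Must equal the duration.
                                  format,
                                  NULL);
  if (FAILED(hr))
    return hr;
  hr = client->SetEventHandle(event_handle);
  if (FAILED(hr))
    return hr;
  return client->GetBufferSize(endpoint_buffer_frames);
}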

-  // Converts unique endpoint ID to user-friendly device name.
-  std::string GetDeviceName(LPCWSTR device_id) const;
+  // Fills up the endpoint rendering buffer with silence.
+  bool FillEndpointBufferWithSilence(UINT32* num_written_frames);
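Filling the endpoint buffer with silence does not require copying zeros by hand; the engine can be told to render silence via the AUDCLNT_BUFFERFLAGS_SILENT flag passed to ReleaseBuffer(). A minimal sketch of the idea follows; it is illustrative only and not the CL's actual FillEndpointBufferWithSilence().

#include <audioclient.h>

// Illustrative sketch only: marks all currently free space in the endpoint
// buffer as silence and reports how many frames were written.
bool FillWithSilence(IAudioClient* audio_client,
                     IAudioRenderClient* render_client,
                     UINT32 endpoint_buffer_size_frames,
                     UINT32* num_written_frames) {
  UINT32 padding = 0;
  if (FAILED(audio_client->GetCurrentPadding(&padding)))
    return false;
  const UINT32 frames_to_fill = endpoint_buffer_size_frames - padding;
  BYTE* data = NULL;
  if (FAILED(render_client->GetBuffer(frames_to_fill, &data)))
    return false;
  // AUDCLNT_BUFFERFLAGS_SILENT tells the audio engine to treat the buffer as
  // silence, so no memset() of |data| is needed.
  if (FAILED(render_client->ReleaseBuffer(frames_to_fill,
                                          AUDCLNT_BUFFERFLAGS_SILENT)))
    return false;
  *num_written_frames = frames_to_fill;
  return true;
}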

   // Contains the thread ID of the creating thread.
   base::PlatformThreadId creating_thread_id_;

   // Our creator, the audio manager needs to be notified when we close.
   AudioManagerWin* manager_;

   // Rendering is driven by this thread (which has no message loop).
   // All OnMoreData() callbacks will be called from this thread.
   scoped_ptr<base::DelegateSimpleThread> render_thread_;

   // Contains the desired audio format which is set up at construction.
   // Extended PCM waveform format structure based on WAVEFORMATEXTENSIBLE.
   // Use this for multiple channel and hi-resolution PCM data.
   WAVEFORMATPCMEX format_;

-  // Copy of the audio format which we know the audio engine supports.
-  // It is recommended to ensure that the sample rate in |format_| is identical
-  // to the sample rate in |audio_engine_mix_format_|.
-  base::win::ScopedCoMem<WAVEFORMATPCMEX> audio_engine_mix_format_;
-
+  // Set to true when stream is successfully opened.
   bool opened_;

-  // Set to true as soon as a new default device is detected, and cleared when
-  // the streaming has switched from using the old device to the new device.
-  // All additional device detections during an active state are ignored to
-  // ensure that the ongoing switch can finalize without disruptions.
-  bool restart_rendering_mode_;
+  // We check if the input audio parameters are identical (bit depth is
+  // excluded) to the preferred (native) audio parameters during construction.
+  // Open() will fail if |audio_parmeters_are_valid_| is false.
+  bool audio_parmeters_are_valid_;

   // Volume level from 0 to 1.
   float volume_;

-  // Size in bytes of each audio frame (4 bytes for 16-bit stereo PCM).
-  size_t frame_size_;
-
   // Size in audio frames of each audio packet where an audio packet
   // is defined as the block of data which the source is expected to deliver
   // in each OnMoreData() callback.
   size_t packet_size_frames_;

   // Size in bytes of each audio packet.
   size_t packet_size_bytes_;

   // Size in milliseconds of each audio packet.
   float packet_size_ms_;

   // Length of the audio endpoint buffer.
   size_t endpoint_buffer_size_frames_;

   // Defines the role that the system has assigned to an audio endpoint device.
   ERole device_role_;

   // The sharing mode for the connection.
   // Valid values are AUDCLNT_SHAREMODE_SHARED and AUDCLNT_SHAREMODE_EXCLUSIVE
   // where AUDCLNT_SHAREMODE_SHARED is the default.
   AUDCLNT_SHAREMODE share_mode_;

-  // The channel count set by the client in |params| which is provided to the
-  // constructor. The client must feed the AudioSourceCallback::OnMoreData()
-  // callback with PCM-data that contains this number of channels.
-  int client_channel_count_;
-
   // Counts the number of audio frames written to the endpoint buffer.
   UINT64 num_written_frames_;

   // Pointer to the client that will deliver audio samples to be played out.
   AudioSourceCallback* source_;

   // An IMMDeviceEnumerator interface which represents a device enumerator.
   base::win::ScopedComPtr<IMMDeviceEnumerator> device_enumerator_;

-  // An IMMDevice interface which represents an audio endpoint device.
-  base::win::ScopedComPtr<IMMDevice> endpoint_device_;
-
   // An IAudioClient interface which enables a client to create and initialize
   // an audio stream between an audio application and the audio engine.
   base::win::ScopedComPtr<IAudioClient> audio_client_;

   // The IAudioRenderClient interface enables a client to write output
   // data to a rendering endpoint buffer.
   base::win::ScopedComPtr<IAudioRenderClient> audio_render_client_;

   // The audio engine will signal this event each time a buffer becomes
   // ready to be filled by the client.
   base::win::ScopedHandle audio_samples_render_event_;

   // This event will be signaled when rendering shall stop.
   base::win::ScopedHandle stop_render_event_;

   // Container for retrieving data from AudioSourceCallback::OnMoreData().
   scoped_ptr<AudioBus> audio_bus_;

   DISALLOW_COPY_AND_ASSIGN(WASAPIAudioOutputStream);
 };

 }  // namespace media

 #endif  // MEDIA_AUDIO_WIN_AUDIO_LOW_LATENCY_OUTPUT_WIN_H_