Chromium Code Reviews

Side by Side Diff: media/audio/win/audio_low_latency_output_win.h

Issue 8440002: Low-latency AudioOutputStream implementation based on WASAPI for Windows. (Closed) Base URL: svn://svn.chromium.org/chrome/trunk/src/
Patch Set: rebased Created 9 years, 1 month ago
Property Changes:
Added: svn:eol-style
+ LF
1 // Copyright (c) 2011 The Chromium Authors. All rights reserved.
2 // Use of this source code is governed by a BSD-style license that can be
3 // found in the LICENSE file.
4 //
5 // Implementation of AudioOutputStream for Windows using Windows Core Audio
6 // WASAPI for low latency rendering.
7 //
8 // Overview of operation and performance:
9 //
10 // - An object of WASAPIAudioOutputStream is created by the AudioManager
11 // factory.
12 // - Next, some thread will call Open(); at that point the underlying
13 // Core Audio APIs are utilized to create two WASAPI interfaces called
14 // IAudioClient and IAudioRenderClient.
15 // - Then some thread will call Start(source).
16 // A thread named "wasapi_render_thread" is started; it waits on an event
17 // which the audio engine sets periodically to request more render data.
18 // Each time the event is signaled, OnMoreData() is called and the
19 // registered client is expected to provide data samples to be played out.
20 // - At some point, a thread will call Stop(), which stops and joins the
21 // render thread and at the same time stops audio streaming.
22 // - The same thread that called Stop() will call Close(), where we clean up
23 // and notify the audio manager, which will likely destroy this object.
24 // - Initial tests on Windows 7 show that this implementation results in a
25 // latency of approximately 35 ms if the selected packet size is less than
26 // or equal to 20 ms. Using a packet size of 10 ms does not result in a
27 // lower latency but only affects the size of the data buffer in each
28 // OnMoreData() callback.
29 // - A typical total delay of 35 ms consists of three parts:
30 // o Audio endpoint device period (~10 ms).
31 // o Stream latency between the buffer and endpoint device (~5 ms).
32 // o Endpoint buffer (~20 ms to ensure glitch-free rendering).
33 // - Note that, if the user selects a packet size of e.g. 100 ms, the total
34 // delay will be approximately 115 ms (10 + 5 + 100).
35 //
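// A minimal usage sketch of the lifecycle described above (illustrative only;
// |manager|, |params| and MySourceCallback are assumed names, not part of
// this interface):
//
//   MySourceCallback source;  // Implements AudioSourceCallback.
//   WASAPIAudioOutputStream* stream =
//       new WASAPIAudioOutputStream(manager, params, eConsole);
//   if (stream->Open()) {
//     stream->Start(&source);  // Spawns "wasapi_render_thread"; OnMoreData()
//                              // is now called periodically on that thread.
//     // ... playback continues until ...
//     stream->Stop();          // Stops streaming and joins the render thread.
//   }
//   stream->Close();           // Notifies the manager; also self-destructs.
//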
36 // Implementation notes:
37 //
38 // - The minimum supported client is Windows Vista.
39 // - This implementation is single-threaded, hence:
40 // o Construction and destruction must take place on the same thread.
41 // o All API methods must also be called on the creating thread.
42 // - It is recommended to first acquire the native sample rate of the default
43 // output device and then use the same rate when creating this object. Use
44 // WASAPIAudioOutputStream::HardwareSampleRate() to retrieve the sample rate.
45 // - Calling Close() also leads to self-destruction.
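//
// For example, to follow the sample-rate recommendation above (a sketch only;
// the construction of the AudioParameters passed to the ctor is elided):
//
//   double sample_rate = WASAPIAudioOutputStream::HardwareSampleRate(eConsole);
//   // Build the AudioParameters given to the ctor using |sample_rate| as the
//   // sample rate.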
46 //
47 // Core Audio API details:
48 //
49 // - CoInitializeEx() is called on the creating thread and on the internal
50 // render thread. Each thread's concurrency model and apartment is set
51 // to multi-threaded (MTA). CHECK() is called to ensure that we crash if
52 // CoInitializeEx(MTA) fails.
53 // - The public API methods (Open(), Start(), Stop() and Close()) must be
54 // called on the constructing thread. The reason is that we want to ensure that
55 // the COM environment is the same for all API implementations.
56 // - Utilized MMDevice interfaces:
57 // o IMMDeviceEnumerator
58 // o IMMDevice
59 // - Utilized WASAPI interfaces:
60 // o IAudioClient
61 // o IAudioRenderClient
62 // - The stream is initialized in shared mode and the processing of the
63 // audio buffer is event driven.
64 // - The Multimedia Class Scheduler service (MMCSS) is utilized to boost
65 // the priority of the render thread.
66 // - Audio-rendering endpoint devices can have three roles:
67 // Console (eConsole), Communications (eCommunications), and Multimedia
68 // (eMultimedia). Search for "Device Roles" on MSDN for more details.
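//
// The event-driven, shared-mode setup described above follows the standard
// WASAPI call sequence sketched below. This is an illustration of the API
// pattern only, not the exact implementation: error handling is omitted,
// raw COM pointers are used instead of ScopedComPtr for brevity, and
// |format| and |audio_samples_render_event| are assumed to exist already:
//
//   IMMDeviceEnumerator* enumerator = NULL;
//   CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_INPROC_SERVER,
//                    __uuidof(IMMDeviceEnumerator),
//                    reinterpret_cast<void**>(&enumerator));
//   IMMDevice* endpoint_device = NULL;
//   enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &endpoint_device);
//   IAudioClient* audio_client = NULL;
//   endpoint_device->Activate(__uuidof(IAudioClient), CLSCTX_INPROC_SERVER,
//                             NULL, reinterpret_cast<void**>(&audio_client));
//   audio_client->Initialize(AUDCLNT_SHAREMODE_SHARED,
//                            AUDCLNT_STREAMFLAGS_EVENTCALLBACK,
//                            0, 0, &format, NULL);
//   audio_client->SetEventHandle(audio_samples_render_event);
//   IAudioRenderClient* audio_render_client = NULL;
//   audio_client->GetService(__uuidof(IAudioRenderClient),
//                            reinterpret_cast<void**>(&audio_render_client));
//
//   // On the render thread: the MMCSS priority boost mentioned above
//   // (avrt.h; the task name is illustrative).
//   DWORD task_index = 0;
//   HANDLE mmcss_handle = AvSetMmThreadCharacteristics(L"Pro Audio",
//                                                      &task_index);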
69 //
70 #ifndef MEDIA_AUDIO_WIN_AUDIO_LOW_LATENCY_OUTPUT_WIN_H_
71 #define MEDIA_AUDIO_WIN_AUDIO_LOW_LATENCY_OUTPUT_WIN_H_
72
73 #include <Audioclient.h>
74 #include <MMDeviceAPI.h>
75
76 #include "base/compiler_specific.h"
77 #include "base/threading/platform_thread.h"
78 #include "base/threading/simple_thread.h"
79 #include "base/win/scoped_co_mem.h"
80 #include "base/win/scoped_com_initializer.h"
81 #include "base/win/scoped_comptr.h"
82 #include "base/win/scoped_handle.h"
83 #include "media/audio/audio_io.h"
84 #include "media/audio/audio_parameters.h"
85 #include "media/base/media_export.h"
86
87 class AudioManagerWin;
88
89 // AudioOutputStream implementation using Windows Core Audio APIs.
90 class MEDIA_EXPORT WASAPIAudioOutputStream
91 : public AudioOutputStream,
92 public base::DelegateSimpleThread::Delegate {
93 public:
94 // The ctor takes all the usual parameters, plus |manager| which is the
95 // audio manager that is creating this object.
96 WASAPIAudioOutputStream(AudioManagerWin* manager,
97 const AudioParameters& params,
98 ERole device_role);
99 // The dtor is typically called by the AudioManager only and it is usually
100 // triggered by calling AudioOutputStream::Close().
101 virtual ~WASAPIAudioOutputStream();
102
103 // Implementation of AudioOutputStream.
104 virtual bool Open() OVERRIDE;
105 virtual void Start(AudioSourceCallback* callback) OVERRIDE;
106 virtual void Stop() OVERRIDE;
107 virtual void Close() OVERRIDE;
108 virtual void SetVolume(double volume) OVERRIDE;
109 virtual void GetVolume(double* volume) OVERRIDE;
110
111 // Retrieves the sample rate that the audio engine uses for its internal
112 // processing/mixing of shared-mode streams.
113 static double HardwareSampleRate(ERole device_role);
114
115 bool started() const { return started_; }
116
117 private:
118 // DelegateSimpleThread::Delegate implementation.
119 virtual void Run() OVERRIDE;
120
121 // Issues the OnError() callback to |source_|.
122 void HandleError(HRESULT err);
123
124 // The Open() method is divided into these sub methods.
125 HRESULT SetRenderDevice(ERole device_role);
126 HRESULT ActivateRenderDevice();
127 HRESULT GetAudioEngineStreamFormat();
128 bool DesiredFormatIsSupported();
129 HRESULT InitializeAudioEngine();
130
131 // Initializes the COM library for use by the calling thread and sets the
132 // thread's concurrency model to multi-threaded.
133 base::win::ScopedCOMInitializer com_init_;
134
135 // Contains the thread ID of the creating thread.
136 base::PlatformThreadId creating_thread_id_;
137
138 // Our creator, the audio manager, needs to be notified when we close.
139 AudioManagerWin* manager_;
140
141 // Rendering is driven by this thread (which has no message loop).
142 // All OnMoreData() callbacks will be called from this thread.
143 base::DelegateSimpleThread* render_thread_;
144
145 // Contains the desired audio format which is set up at construction.
146 WAVEFORMATEX format_;
147
148 // Copy of the audio format which we know the audio engine supports.
149 // It is recommended to ensure that the sample rate in |format_| is identical
150 // to the sample rate in |audio_engine_mix_format_|.
151 base::win::ScopedCoMem<WAVEFORMATEX> audio_engine_mix_format_;
152
153 bool opened_;
154 bool started_;
155
156 // Volume level from 0 to 1.
157 float volume_;
158
159 // Size in bytes of each audio frame (4 bytes for 16-bit stereo PCM).
160 size_t frame_size_;
161
162 // Size in audio frames of each audio packet, where an audio packet
163 // is defined as the block of data that the source is expected to deliver
164 // in each OnMoreData() callback.
165 size_t packet_size_frames_;
166
167 // Size in bytes of each audio packet.
168 size_t packet_size_bytes_;
169
170 // Size in milliseconds of each audio packet.
171 float packet_size_ms_;
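// For example, with 16-bit stereo PCM at 48 kHz and 480 frames per packet
// (illustrative numbers, not defaults): frame_size_ = 4 bytes,
// packet_size_frames_ = 480, packet_size_bytes_ = 1920 and
// packet_size_ms_ = 10.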
172
173 // Length of the audio endpoint buffer in audio frames.
174 size_t endpoint_buffer_size_frames_;
175
176 // Defines the role that the system has assigned to an audio endpoint device.
177 ERole device_role_;
178
179 // Counts the number of audio frames written to the endpoint buffer.
180 UINT64 num_written_frames_;
181
182 // Pointer to the client that will deliver audio samples to be played out.
183 AudioSourceCallback* source_;
184
185 // An IMMDevice interface which represents an audio endpoint device.
186 base::win::ScopedComPtr<IMMDevice> endpoint_device_;
187
188 // An IAudioClient interface which enables a client to create and initialize
189 // an audio stream between an audio application and the audio engine.
190 base::win::ScopedComPtr<IAudioClient> audio_client_;
191
192 // The IAudioRenderClient interface enables a client to write output
193 // data to a rendering endpoint buffer.
194 base::win::ScopedComPtr<IAudioRenderClient> audio_render_client_;
195
196 // The audio engine will signal this event each time a buffer becomes
197 // ready to be filled by the client.
198 base::win::ScopedHandle audio_samples_render_event_;
199
200 // This event will be signaled when rendering shall stop.
201 base::win::ScopedHandle stop_render_event_;
202
203 DISALLOW_COPY_AND_ASSIGN(WASAPIAudioOutputStream);
204 };
205
206 #endif // MEDIA_AUDIO_WIN_AUDIO_LOW_LATENCY_OUTPUT_WIN_H_
