Created: 6 years, 11 months ago by no longer working on chromium
Modified: 4 years, 10 months ago
Reviewers: tommi (sloooow) - chröme, hlundin-chromium, Jói, brucedawson, miu, henrika (OOO until Aug 14), perkj_chrome
CC: chromium-reviews, fischman+watch_chromium.org, jam, mcasas+watch_chromium.org, joi+watch-content_chromium.org, feature-media-reviews_chromium.org, darin-cc_chromium.org, wjia+watch_chromium.org
Base URL: svn://svn.chromium.org/chrome/trunk/src
Visibility: Public.
Description
Feed the render data to MediaStreamAudioProcessor and use AudioBus in the render callback.
This CL mainly feeds the render data to MediaStreamAudioProcessor. Since it is better to have all audio data in Chrome carried as media::AudioBus, it also changes WebRtcAudioRendererSource::RenderData(int16*) to WebRtcAudioRendererSource::RenderData(media::AudioBus*), and makes MediaStreamAudioProcessor use the WebRtcAudioRendererSource interface to get the render data.
BUG=264611
TEST=content_unittests, and AEC works when Chrome is run with --enable-audio-track-processing
Committed: https://src.chromium.org/viewvc/chrome?view=rev&revision=252554
Patch Set 1
Patch Set 2: fixed the unittests, ready for review now.
Total comments: 7
Patch Set 3: addressed Per's comments.
Total comments: 13
Patch Set 4: addressed Yuri's comments.
Patch Set 5: fixed the win bots.
Total comments: 21
Patch Set 6: addressed Tommi's comments.
Patch Set 7: added WebRtcPlayoutDataSource and its sink interfaces.
Patch Set 8: fixed the bots.
Total comments: 12
Patch Set 9: check if AEC is enabled in the unittest.
Patch Set 10: rebased and added a thread check in the destructor.
Total comments: 12
Patch Set 11
Patch Set 12
Total comments: 2
Patch Set 13: comments were addressed.
Patch Set 14: rebased.
Patch Set 15: checked echo_control_mobile()->is_enabled() for Android and iOS.
Total comments: 3
Messages
Total messages: 43 (4 generated)
Tommi, could you please take the lead on reviewing this CL? Per and Joi, it would be great if you could also take a look, but if you don't have time, it is fine to ignore it. Thanks, SX
Yuri, FYI, I took your suggestions on the render side. BTW, would you like to be more active in reviewing audio changes related to media streams? SX
Just some nit comments. I am afraid I don't know the audio path well enough. https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... File content/renderer/media/webrtc_audio_device_impl.h (right): https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:233: //Note that this class inherits from webrtc::AudioDeviceModule but due to nit indentation https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:306: void AddRenderDataObserver(WebRtcAudioRendererSource* observer); I am a bit confused about the audio naming convention. What is a RenderDataObserver, and why is a WebRtcAudioRendererSource an observer of it? https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:373: // List of observers which requires access to the render data. render data? Is this the audio that will be played out to a speaker, normally by, for example, an audio tag? https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:395: // ?
Thanks Per, PLAL. And also a gentle ping to other reviewers. SX https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... File content/renderer/media/webrtc_audio_device_impl.h (right): https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:233: //Note that this class inherits from webrtc::AudioDeviceModule but due to On 2014/01/20 12:29:35, perkj wrote: > nit indentation Done. https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:306: void AddRenderDataObserver(WebRtcAudioRendererSource* observer); On 2014/01/20 12:29:35, perkj wrote: > I am a bit confused about the audio naming convention. What is a > RenderDataObserver and why is a WebRtcAudioRendererSource and observer of it? Do you have any other suggestion on the naming here? APM is the observer to the render data. And I am re-using WebRtcAudioRendererSource::RenderData to pass the render data to the APMs here. Another alternative for me is to add an interface like: WebRtcRenderDataObserver { public: virtual void RenderData(media::AudioBus* audio_bus, int sample_rate, int delay_ms) = 0; } which is very similar to WebRtcAudioRendererSource https://codereview.chromium.org/139303016/diff/30001/content/renderer/media/w... content/renderer/media/webrtc_audio_device_impl.h:373: // List of observers which requires access to the render data. On 2014/01/20 12:29:35, perkj wrote: > render data? Is this the audio that will be played out to a speaker normally by > for ex an audio tag? Yes.
Apologies for the delay in responding. On the whole, this is really nice! I'm not entirely familiar with the overall ACM design, so I'll leave that up to the other reviewers. My comments are mostly code style... https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.h (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:47: WebRtcAudioDeviceImpl* audio_device); Please comment this ctor to note that |audio_device| is optional (and what behavior does that change?). https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:47: WebRtcAudioDeviceImpl* audio_device); Also, should you be using WebRtcAudioRendererSource* or WebRtcAudioDeviceImpl* as the pointer type here? It looks like the interface type would be sufficient. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:138: WebRtcAudioDeviceImpl* audio_device_; nit: Should be: WebRtcAudioDeviceImpl* const audio_device_; Since the member is never changed after construction. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:84: static_cast<WebRtcAudioRendererSource*>(audio_processor)->RenderData( This threw me off a bit. I see that you're trying to call a private method of MediaStreamAudioProcessor here. I'll leave it up to you, but IMHO it's more readable to make MediaStreamAudioProcessorTest a friend class of MediaStreamAudioProcessor, which would allow you to invoke RenderData() without the static_cast<> here. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:128: if (!render_buffer_ || You could simplify a lot here if render_buffer_ were a std::vector<int16>. These 5 lines of code would become one: render_buffer_.resize(audio_bus->frames() * audio_bus->channels()); https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:142: int bytes_per_sample = 2; Replace 2 with sizeof(int16). https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:382: *available = renderer_ ? (renderer_->channels() == 2) : false; For my own curiosity: What if renderer_->channels() is more than 2 (e.g., 5.1 sound)?
Hi Yuri, Thanks very much for the review, and there is no problem on the waiting. I think I have already addressed all your comments, could you please take a look? BTW, your review comments are great. BR, SX https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.h (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:47: WebRtcAudioDeviceImpl* audio_device); On 2014/01/24 22:27:19, miu wrote: > Please comment this ctor to note that |audio_device| is optional (and what > behavior does that change?). Done with adding a comment to explain what |audio_device| is used for. |audio_device| is used to register getting the webrtc render data for AEC. So it won't NULL for production code, but for some unittests, it can be NULL. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:47: WebRtcAudioDeviceImpl* audio_device); On 2014/01/24 22:27:19, miu wrote: > Also, should you be using WebRtcAudioRendererSource* or WebRtcAudioDeviceImpl* > as the pointer type here? It looks like the interface type would be sufficient. audio_device_->AddRenderDataObserver() and audio_device_->RemoveRenderDataObserver() are not interface to WebRtcAudioRendererSource, but methods to WebRtcAudioDeviceImpl. Do you want to me change them into interfaces? https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:138: WebRtcAudioDeviceImpl* audio_device_; On 2014/01/24 22:27:19, miu wrote: > nit: Should be: > > WebRtcAudioDeviceImpl* const audio_device_; > > Since the member is never changed after construction. Done. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:84: static_cast<WebRtcAudioRendererSource*>(audio_processor)->RenderData( On 2014/01/24 22:27:19, miu wrote: > This threw me off a bit. I see that you're trying to call a private method of > MediaStreamAudioProcessor here. I'll leave it up to you, but IMHO it's more > readable to make MediaStreamAudioProcessorTest a friend class of > MediaStreamAudioProcessor, which would allow you to invoke RenderData() without > the static_cast<> here. Actually MediaStreamAudioProcessorTest had been made a friend class of MediaStreamAudioProcessor in my another CL. So I removed the static_cast after rebasing. https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:142: int bytes_per_sample = 2; On 2014/01/24 22:27:19, miu wrote: > Replace 2 with sizeof(int16). Done with changing it to sizeof(render_buffer_[0]); https://codereview.chromium.org/139303016/diff/110001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:382: *available = renderer_ ? (renderer_->channels() == 2) : false; On 2014/01/24 22:27:19, miu wrote: > For my own curiosity: What if renderer_->channels() is more than 2 (e.g., 5.1 > sound)? 
The playout side of WebRTC only supports mono and stereo so far.
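To make the buffer suggestions above concrete, here is a minimal sketch of that kind of render-side conversion, assuming an interleaved int16 scratch buffer that is then de-interleaved into the media::AudioBus. The class name, member names, and the exact media::AudioBus call are illustrative assumptions, not the CL's actual code.

#include <cstdint>
#include <vector>
#include "media/base/audio_bus.h"

class RenderBufferSketch {
 public:
  void RenderData(media::AudioBus* audio_bus) {
    // std::vector::resize() replaces the manual "reallocate if the buffer is
    // too small" bookkeeping; it is a no-op when the buffer is big enough.
    render_buffer_.resize(audio_bus->frames() * audio_bus->channels());
    int16_t* audio_data = &render_buffer_[0];

    // sizeof(render_buffer_[0]) instead of a hard-coded 2 bytes per sample.
    const int bytes_per_sample = sizeof(render_buffer_[0]);

    // ... ask the WebRTC voice engine to fill |audio_data| here ...

    // De-interleave the int16 samples into the AudioBus channels.
    audio_bus->FromInterleaved(audio_data, audio_bus->frames(),
                               bytes_per_sample);
  }

 private:
  std::vector<int16_t> render_buffer_;
};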
lgtm I'm a little concerned about the auto-adjustment of microphone volume. I'm not sure whether this has been fixed elsewhere, but there used to be no bound on how fast or how often the microphone volume setting would be changed (the renderer could flood the browser process with IPCs); and whether this feedback loop couldn't produce harmful unintended behavior. Hopefully, you guys have an answer to these problems in your new design, going forward.
On 2014/01/27 23:46:47, miu wrote: > lgtm > > I'm a little concerned about the auto-adjustment of microphone volume. I'm not > sure whether this has been fixed elsewhere, but there used to be no bound on how > fast or how often the microphone volume setting would be changed (the renderer > could flood the browser process with IPCs); and whether this feedback loop > couldn't produce harmful unintended behavior. Hopefully, you guys have an > answer to these problems in your new design, going forward. I heard from the AudioProcessing team that the AGC would be adjusted maximum once a second. But I can double check with them to see if we have any potential problem there.
I know the rest of you are pretty busy, and I probably don't need such a long list of reviewers for this small CL; it might be sufficient to have one more reviewer look at it. I am going to ping you offline to see who has time. Thanks, SX
https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:83: if (audio_processor->has_audio_processing()) { nit: no {} https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:166: // Pass the render data to the observers. this is terminology again but are observers in this case more like sinks (i.e. destinations for audio data)? https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:377: *available = renderer_ ? (renderer_->channels() == 2) : false; nit: can also do this *available = renderer_ && renderer_->channels() == 2; https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:475: DCHECK(observer); nice to see the dchecks on observer and std::find. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.h (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:188: // Callback to get the rendered interleaved data. the data isn't interleaved anymore, right? https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:306: void AddRenderDataObserver(WebRtcAudioRendererSource* observer); I'm a bit confused about this... WebRtcAudioDeviceImpl is a WebRtcAudioRendererSource and this adds a way for a WebRtcAudioRendererSource to observe what an WebRtcAudioRendererSource is doing? If this is a generic way to chain together WebRtcAudioRendererSource instances, should these methods then be a part of the WebRtcAudioRendererSource interface? https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:325: typedef std::list<WebRtcAudioRendererSource* > RenderDataObservers; no space before > Also document lifetime/ownership management. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:374: RenderDataObservers render_data_observers_; What's the difference between "render data" and "data" or even "audio" (or "audio data")? Is the concept of 'render' needed to explain the context of at what point the callback is made? (since 'rendering' is an action that's performed on data, this throws me off a bit) https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_renderer.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_renderer.cc:466: return; this return statement is not needed now https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_renderer.h:130: // the renderer. 
now that these are no longer const, can you document on which thread they are allowed to be touched (and any locking that they might need).
Tommi, PTAL. Thanks, SX https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:83: if (audio_processor->has_audio_processing()) { On 2014/01/31 13:58:32, tommi wrote: > nit: no {} Done. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:166: // Pass the render data to the observers. On 2014/01/31 13:58:32, tommi wrote: > this is terminology again but are observers in this case more like sinks (i.e. > destinations for audio data)? Yes, it is. I have already explained in the header, if you would like to have new interface for the observing the rendered data, it is completely fine to me. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:377: *available = renderer_ ? (renderer_->channels() == 2) : false; On 2014/01/31 13:58:32, tommi wrote: > nit: can also do this > *available = renderer_ && renderer_->channels() == 2; Done. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.h (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:188: // Callback to get the rendered interleaved data. On 2014/01/31 13:58:32, tommi wrote: > the data isn't interleaved anymore, right? Done. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:306: void AddRenderDataObserver(WebRtcAudioRendererSource* observer); On 2014/01/31 13:58:32, tommi wrote: > I'm a bit confused about this... WebRtcAudioDeviceImpl is a > WebRtcAudioRendererSource and this adds a way for a WebRtcAudioRendererSource to > observe what an WebRtcAudioRendererSource is doing? > > If this is a generic way to chain together WebRtcAudioRendererSource instances, > should these methods then be a part of the WebRtcAudioRendererSource interface? This solution is an intermedia step to get the rendered data. Today, there is only one WebRtcAudioRendererSource, so the WebRtcAudioDeviceImpl is the only option for the observers to get the rendered data from. In the current version, I am reusing the WebRtcAudioRendererSource interface to allow the observers to get the rendered data, so Add/RemoveRenderDataObserver can't be interfaces in WebRtcAudioRendererSource. If we want to put these two methods to interfaces, I need to add new interfaces like: WebRtcAudioRenderedDataHost { // Implement by MediaStreamAudioProcessor. class WebRtcAudioRenderedData { virtual void RenderData(media::AudioBus* audio_bus, int sample_rate, int delay_ms) = 0; } // Implement by WebRtcAudioDeviceImpl. virtual void AddRenderDataObserver(WebRtcAudioRendererSource* observer) = 0; virtual void AddRenderDataObserver(WebRtcAudioRendererSource* observer) = 0; } I am actually fine with both, please let me know which solution you prefer. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... 
content/renderer/media/webrtc_audio_device_impl.h:325: typedef std::list<WebRtcAudioRendererSource* > RenderDataObservers; On 2014/01/31 13:58:32, tommi wrote: > no space before > > > Also document lifetime/ownership management. Done with adding comment in line 379, where the RenderDataObservers is being used. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:374: RenderDataObservers render_data_observers_; On 2014/01/31 13:58:32, tommi wrote: > What's the difference between "render data" and "data" or even "audio" (or > "audio data")? Is the concept of 'render' needed to explain the context of at > what point the callback is made? (since 'rendering' is an action that's > performed on data, this throws me off a bit) Probably it should be rendered data. This WebRtcAudioDeviceImpl is being used by both capture side and render side, so the documents have been made explicitly to tell which side the members are used for. Done with changing it to rendered data. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_renderer.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_renderer.cc:466: return; On 2014/01/31 13:58:32, tommi wrote: > this return statement is not needed now Done. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_renderer.h:130: // the renderer. On 2014/01/31 13:58:32, tommi wrote: > now that these are no longer const, can you document on which thread they are > allowed to be touched (and any locking that they might need). Added a comment to explain they are modified only in the Initialize() on the main render thread.
A gentle ping, Tommi: do you have time to take another look? Thanks, SX
Just some terminology suggestions. https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.cc:166: // Pass the render data to the observers. On 2014/02/02 16:50:16, xians1 wrote: > On 2014/01/31 13:58:32, tommi wrote: > > this is terminology again but are observers in this case more like sinks (i.e. > > destinations for audio data)? > > Yes, it is. I have already explained in the header, if you would like to have > new interface for the observing the rendered data, it is completely fine to me. Observers aren't participants, so this is not an observer. Here's more information about the observer pattern: http://en.wikipedia.org/wiki/Observer_pattern https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... File content/renderer/media/webrtc_audio_device_impl.h (right): https://codereview.chromium.org/139303016/diff/430001/content/renderer/media/... content/renderer/media/webrtc_audio_device_impl.h:306: void AddRenderDataObserver(WebRtcAudioRendererSource* observer); On 2014/02/02 16:50:16, xians1 wrote: > On 2014/01/31 13:58:32, tommi wrote: > > I'm a bit confused about this... WebRtcAudioDeviceImpl is a > > WebRtcAudioRendererSource and this adds a way for a WebRtcAudioRendererSource > to > > observe what an WebRtcAudioRendererSource is doing? > > > > If this is a generic way to chain together WebRtcAudioRendererSource > instances, > > should these methods then be a part of the WebRtcAudioRendererSource > interface? > > This solution is an intermedia step to get the rendered data. > Today, there is only one WebRtcAudioRendererSource, so the WebRtcAudioDeviceImpl > is the only option for the observers to get the rendered data from. > > In the current version, I am reusing the WebRtcAudioRendererSource interface to > allow the observers to get the rendered data, so Add/RemoveRenderDataObserver > can't be interfaces in WebRtcAudioRendererSource. > > If we want to put these two methods to interfaces, I need to add new interfaces > like: > WebRtcAudioRenderedDataHost { > // Implement by MediaStreamAudioProcessor. > class WebRtcAudioRenderedData { > virtual void RenderData(media::AudioBus* audio_bus, int sample_rate, int > delay_ms) = 0; > } > > // Implement by WebRtcAudioDeviceImpl. > virtual void AddRenderDataObserver(WebRtcAudioRendererSource* observer) = 0; > virtual void AddRenderDataObserver(WebRtcAudioRendererSource* observer) = 0; > > } > > I am actually fine with both, please let me know which solution you prefer. It sounds like what you want is a sink to a source and not an observer.
Tommi, I added new interfaces of WebRtcPlayoutDataSource and its sink, as we discussed offline. PTAL and I would like to land this ASAP so that I can enable the APM in Chrome. SX
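For readers following the thread, here is a rough sketch of what a playout source/sink interface pair of this shape could look like. The names and method signatures below are guesses based on the discussion (an AudioBus plus sample rate and delay), not a copy of the interfaces that actually landed.

#include "media/base/audio_bus.h"

// Illustrative only: a source of rendered (playout) audio and the sink
// interface its consumers implement (e.g. an audio processor that needs the
// far-end signal for AEC).
class PlayoutDataSourceSketch {
 public:
  class Sink {
   public:
    // Called on the render audio thread with each chunk of playout data.
    virtual void OnPlayoutData(media::AudioBus* audio_bus,
                               int sample_rate,
                               int audio_delay_milliseconds) = 0;

   protected:
    virtual ~Sink() {}
  };

  // Called on the main render thread; the source does not own the sinks.
  virtual void AddPlayoutSink(Sink* sink) = 0;
  virtual void RemovePlayoutSink(Sink* sink) = 0;

 protected:
  virtual ~PlayoutDataSourceSketch() {}
};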
Another gentle ping.
sorry for the slow response https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:154: DCHECK(main_thread_checker_.CalledOnValidThread()); why removing? (it seems like a good thing to make sure the object is destructed on the same thread it was constructed on) https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:192: DCHECK(audio_processing_->echo_cancellation()->is_enabled()); is this correct? will we only ever get this callback if aec is enabled? What if only agc is turned on etc? previously there was a check for it that handled it as a supported case but having a DCHECK means this is a state that should be impossible for us to reach. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.h (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:147: base::ThreadChecker main_thread_checker_; why are we removing this? https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:83: if (audio_processor->has_audio_processing()) {} Since you check for has_audio_processing() here, then that means that aec might *not* be enabled even though other audio processing is performed. It might also be possible that hw aec is being used so the APM's AEC is actually disabled. So, I'm not sure about that DCHECK in the other file.
Thanks, PTAL. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:154: DCHECK(main_thread_checker_.CalledOnValidThread()); On 2014/02/06 16:47:01, tommi wrote: > why removing? > (it seems like a good thing to make sure the object is destructed on the same > thread it was constructed on) MediaStreamAudioProcessor is reference counted, and the audio thread can grab a reference and release it there. Normally this is not a problem since we usually stop the audio thread before we destuct the MediaStreamAudioProcessor. But I am afraid we don't really guarantee it. I removed this thread check since it does not matter in which thread this destruction happens. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:192: DCHECK(audio_processing_->echo_cancellation()->is_enabled()); On 2014/02/06 16:47:01, tommi wrote: > is this correct? will we only ever get this callback if aec is enabled? What > if only agc is turned on etc? > > previously there was a check for it that handled it as a supported case but > having a DCHECK means this is a state that should be impossible for us to reach. the playout data is only needed when AEC is enabled, and it does nothing to AGC or any other component. In Line 264, we add this processor as sink for the playout data only when the AEC is enabled. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.h (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.h:147: base::ThreadChecker main_thread_checker_; On 2014/02/06 16:47:01, tommi wrote: > why are we removing this? Please see the comment in the cc file. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:83: if (audio_processor->has_audio_processing()) On 2014/02/06 16:47:01, tommi wrote: > {} > > Since you check for has_audio_processing() here, then that means that aec might > *not* be enabled even though other audio processing is performed. > It might also be possible that hw aec is being used so the APM's AEC is actually > disabled. So, I'm not sure about that DCHECK in the other file. The current unittests only contains the test cases where we either turn on all the components or turn off all the components. That is why checking has_audio_processing() is sufficient today because we don't have test cases where AEC is diabled while other components are enabled. Anyway, it might be better to check if AEC is enabled or not here since it generates question. Done with changing it.
just one more check about the ctor/dtor checks https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:154: DCHECK(main_thread_checker_.CalledOnValidThread()); On 2014/02/06 17:20:14, xians1 wrote: > On 2014/02/06 16:47:01, tommi wrote: > > why removing? > > (it seems like a good thing to make sure the object is destructed on the same > > thread it was constructed on) > > MediaStreamAudioProcessor is reference counted, and the audio thread can grab a > reference and release it there. > > Normally this is not a problem since we usually stop the audio thread before we > destuct the MediaStreamAudioProcessor. But I am afraid we don't really guarantee > it. > I removed this thread check since it does not matter in which thread this > destruction happens. Can we move construction to the audio thread then as well? https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:192: DCHECK(audio_processing_->echo_cancellation()->is_enabled()); On 2014/02/06 17:20:14, xians1 wrote: > On 2014/02/06 16:47:01, tommi wrote: > > is this correct? will we only ever get this callback if aec is enabled? What > > if only agc is turned on etc? > > > > previously there was a check for it that handled it as a supported case but > > having a DCHECK means this is a state that should be impossible for us to > reach. > > the playout data is only needed when AEC is enabled, and it does nothing to AGC > or any other component. > In Line 264, we add this processor as sink for the playout data only when the > AEC is enabled. OK, cool. https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor_unittest.cc:83: if (audio_processor->has_audio_processing()) On 2014/02/06 17:20:14, xians1 wrote: > On 2014/02/06 16:47:01, tommi wrote: > > {} > > > > Since you check for has_audio_processing() here, then that means that aec > might > > *not* be enabled even though other audio processing is performed. > > It might also be possible that hw aec is being used so the APM's AEC is > actually > > disabled. So, I'm not sure about that DCHECK in the other file. > > The current unittests only contains the test cases where we either turn on all > the components or turn off all the components. > > That is why checking has_audio_processing() is sufficient today because we don't > have test cases where AEC is diabled while other components are enabled. > > Anyway, it might be better to check if AEC is enabled or not here since it > generates question. > > Done with changing it. thanks
Tommi, I added back the thread check, PATL. FYI, I am going to land this after the cut. SX https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... File content/renderer/media/media_stream_audio_processor.cc (left): https://codereview.chromium.org/139303016/diff/880001/content/renderer/media/... content/renderer/media/media_stream_audio_processor.cc:154: DCHECK(main_thread_checker_.CalledOnValidThread()); On 2014/02/06 17:58:44, tommi wrote: > On 2014/02/06 17:20:14, xians1 wrote: > > On 2014/02/06 16:47:01, tommi wrote: > > > why removing? > > > (it seems like a good thing to make sure the object is destructed on the > same > > > thread it was constructed on) > > > > MediaStreamAudioProcessor is reference counted, and the audio thread can grab > a > > reference and release it there. > > > > Normally this is not a problem since we usually stop the audio thread before > we > > destuct the MediaStreamAudioProcessor. But I am afraid we don't really > guarantee > > it. > > I removed this thread check since it does not matter in which thread this > > destruction happens. > > Can we move construction to the audio thread then as well? WebRtcAudioCapturer today creates/deletes its resources in the main render thread since AudioInputDevice needs to be created/deleted on the main render thread. And I would like to do this to all the resources instead of splitting them into parts. I removed the thread check here since I thought it was not a problem to have the destruction happens in another thread. But since this introduces concerns, I added it back.
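As an aside on the thread-checker discussion, here is a small sketch of the pattern being debated, assuming a ref-counted processor class. The names are illustrative; the comment spells out why the destructor check only holds if the last reference is always dropped on the main render thread, which is the guarantee being discussed above.

#include "base/logging.h"
#include "base/memory/ref_counted.h"
#include "base/threading/thread_checker.h"

class AudioProcessorSketch
    : public base::RefCountedThreadSafe<AudioProcessorSketch> {
 public:
  AudioProcessorSketch() {
    DCHECK(main_thread_checker_.CalledOnValidThread());
  }

 private:
  friend class base::RefCountedThreadSafe<AudioProcessorSketch>;

  // With ref counting, whichever thread drops the last reference runs the
  // destructor, so this DCHECK only holds if the last Release() always
  // happens on the main render thread.
  ~AudioProcessorSketch() {
    DCHECK(main_thread_checker_.CalledOnValidThread());
  }

  base::ThreadChecker main_thread_checker_;
};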
https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor.h:44: // |audio_device| is used to register this class as observer to the WebRtc audio_device doesn't seem to be a parameter... do you mean |playout_data_source|? and if so then s/observer/sink? https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor_unittest.cc:85: const bool is_aec_enabled = ap && ap->echo_control_mobile()->is_enabled(); out of curiosity, what's the difference between echo_control_mobile() and echo_cancellation()? https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:473: void WebRtcAudioDeviceImpl::AddPlayoutSink( do you know on which thread these methods are called? It's not clear to me what the threading model is and why the locking is needed. Would be good to document and perhaps it will change in the future so that we don't need to lock as much? https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_renderer.h:220: // The sample rate, number of channels and buffer sizes used by the sink of should we just store a |const AudioParameters| here?
https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor.h:44: // |audio_device| is used to register this class as observer to the WebRtc On 2014/02/17 15:03:44, tommi wrote: > audio_device doesn't seem to be a parameter... do you mean > |playout_data_source|? and if so then s/observer/sink? Done. https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor_unittest.cc:85: const bool is_aec_enabled = ap && ap->echo_control_mobile()->is_enabled(); On 2014/02/17 15:03:44, tommi wrote: > out of curiosity, what's the difference between echo_control_mobile() and > echo_cancellation()? echo_control_mobile() is the AEC used by mobile, and echo_cancellation() is the AEC used by desktop. They are different components and have different interfaces. https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:473: void WebRtcAudioDeviceImpl::AddPlayoutSink( On 2014/02/17 15:03:44, tommi wrote: > do you know on which thread these methods are called? > It's not clear to me what the threading model is and why the locking is needed. > Would be good to document and perhaps it will change in the future so that we > don't need to lock as much? they will be always called by the main render thread, I have already added a thread check to the code. But we will always need the lock, since playout_sinks_ will be modified on the main render thread and used in the real-time render audio thread. https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_renderer.h:220: // The sample rate, number of channels and buffer sizes used by the sink of On 2014/02/17 15:03:44, tommi wrote: > should we just store a |const AudioParameters| here? This can't be a const since it will be modified in the Initialize(), but it can be a AudioParameters sink_params_; Do you want me to change it?
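A sketch of the locking pattern described in that answer, reusing the hypothetical Sink interface from the earlier sketch: the sink list is modified on the main render thread and read on the realtime render audio thread, so both sides take the lock. Class and method names are illustrative only.

#include <list>
#include "base/logging.h"
#include "base/synchronization/lock.h"
#include "base/threading/thread_checker.h"
#include "media/base/audio_bus.h"

class PlayoutFanOutSketch {
 public:
  // Called on the main render thread.
  void AddPlayoutSink(PlayoutDataSourceSketch::Sink* sink) {
    DCHECK(main_thread_checker_.CalledOnValidThread());
    base::AutoLock auto_lock(lock_);
    playout_sinks_.push_back(sink);
  }

  // Called on the main render thread.
  void RemovePlayoutSink(PlayoutDataSourceSketch::Sink* sink) {
    DCHECK(main_thread_checker_.CalledOnValidThread());
    base::AutoLock auto_lock(lock_);
    playout_sinks_.remove(sink);
  }

  // Called on the render audio thread for every chunk of playout data.
  void DeliverPlayoutData(media::AudioBus* audio_bus,
                          int sample_rate,
                          int audio_delay_milliseconds) {
    base::AutoLock auto_lock(lock_);
    for (PlayoutDataSourceSketch::Sink* sink : playout_sinks_)
      sink->OnPlayoutData(audio_bus, sample_rate, audio_delay_milliseconds);
  }

 private:
  base::ThreadChecker main_thread_checker_;
  base::Lock lock_;
  std::list<PlayoutDataSourceSketch::Sink*> playout_sinks_;  // Not owned.
};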
lgtm. I'll drop by your desk for the is_enabled() question. I assume there's a simple answer to that. https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor_unittest.cc:85: const bool is_aec_enabled = ap && ap->echo_control_mobile()->is_enabled(); On 2014/02/17 17:15:32, xians1 wrote: > On 2014/02/17 15:03:44, tommi wrote: > > out of curiosity, what's the difference between echo_control_mobile() and > > echo_cancellation()? > > echo_control_mobile() is the AEC used by mobile, and echo_cancellation() is the > AEC used by desktop. They are different components and have different > interfaces. What I'm trying to understand is the difference between the two. Will echo_cancellation() return NULL on Android? If not, will echo_cancellation()->is_enabled() be a different value than echo_control_mobile()->is_enabled() on android? https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_renderer.h:220: // The sample rate, number of channels and buffer sizes used by the sink of On 2014/02/17 17:15:32, xians1 wrote: > On 2014/02/17 15:03:44, tommi wrote: > > should we just store a |const AudioParameters| here? > > This can't be a const since it will be modified in the Initialize(), but it can > be a AudioParameters sink_params_; > Do you want me to change it? yeah since we're already storing the same parameters AudioParameters already has and using them to construct an AudioParameters instance, we might as well do that. https://codereview.chromium.org/139303016/diff/1320001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1320001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:461: WebRtcAudioDeviceImpl::GetDefaultCapturer() const { if this method is also called on the same thread as Add/Remove PlayoutSink and RemoveAudioCapturer, please also add a thread check here (or add a comment that explains what thread we're on here).
https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/media_stream_audio_processor_unittest.cc (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/media_stream_audio_processor_unittest.cc:85: const bool is_aec_enabled = ap && ap->echo_control_mobile()->is_enabled(); On 2014/02/18 12:54:12, tommi wrote: > On 2014/02/17 17:15:32, xians1 wrote: > > On 2014/02/17 15:03:44, tommi wrote: > > > out of curiosity, what's the difference between echo_control_mobile() and > > > echo_cancellation()? > > > > echo_control_mobile() is the AEC used by mobile, and echo_cancellation() is > the > > AEC used by desktop. They are different components and have different > > interfaces. > > What I'm trying to understand is the difference between the two. > Will echo_cancellation() return NULL on Android? > If not, will echo_cancellation()->is_enabled() be a different value than > echo_control_mobile()->is_enabled() on android? Comment has been addressed offline, the answer is the question is "yes". I have already added a DCHECK(!ap || !ap->echo_cancellation()->is_enabled()); https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... File content/renderer/media/webrtc_audio_renderer.h (right): https://codereview.chromium.org/139303016/diff/1190001/content/renderer/media... content/renderer/media/webrtc_audio_renderer.h:220: // The sample rate, number of channels and buffer sizes used by the sink of On 2014/02/18 12:54:12, tommi wrote: > On 2014/02/17 17:15:32, xians1 wrote: > > On 2014/02/17 15:03:44, tommi wrote: > > > should we just store a |const AudioParameters| here? > > > > This can't be a const since it will be modified in the Initialize(), but it > can > > be a AudioParameters sink_params_; > > Do you want me to change it? > > yeah since we're already storing the same parameters AudioParameters already has > and using them to construct an AudioParameters instance, we might as well do > that. Done. https://codereview.chromium.org/139303016/diff/1320001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1320001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:461: WebRtcAudioDeviceImpl::GetDefaultCapturer() const { On 2014/02/18 12:54:12, tommi wrote: > if this method is also called on the same thread as Add/Remove PlayoutSink and > RemoveAudioCapturer, NO. This method can be called by both libjingle thread and main render thread. please also add a thread check here (or add a comment that > explains what thread we're on here). Done with adding a comment in the header file.
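A sketch of the platform-conditional AEC check being settled here, assuming a webrtc::AudioProcessing pointer. The helper name and include paths are assumptions, and the real unittest's logic may differ in detail.

#include "base/logging.h"
#include "build/build_config.h"  // OS_ANDROID / OS_IOS
#include "third_party/webrtc/modules/audio_processing/include/audio_processing.h"

// Returns whether the platform-appropriate echo canceller is enabled.
bool IsAecEnabledSketch(webrtc::AudioProcessing* ap) {
  if (!ap)
    return false;
#if defined(OS_ANDROID) || defined(OS_IOS)
  // Mobile builds use the mobile echo controller (AECM); the desktop AEC is
  // expected to be off, which is what the added DCHECK asserts.
  DCHECK(!ap->echo_cancellation()->is_enabled());
  return ap->echo_control_mobile()->is_enabled();
#else
  return ap->echo_cancellation()->is_enabled();
#endif
}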
The CQ bit was checked by xians@chromium.org
CQ is trying da patch. Follow status at https://chromium-status.appspot.com/cq/xians@chromium.org/139303016/1490001
The CQ bit was unchecked by xians@chromium.org
The CQ bit was checked by xians@chromium.org
CQ is trying da patch. Follow status at https://chromium-status.appspot.com/cq/xians@chromium.org/139303016/1800001
Message was sent while issue was closed.
Change committed as 252554
Message was sent while issue was closed.
brucedawson@chromium.org changed reviewers: + brucedawson@chromium.org
Message was sent while issue was closed.
Apologies for commenting on a CL two years after it was submitted, but /analyze 2015 found something suspicious about this change. Anybody qualified to comment on this? https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:147: int16* audio_data = &render_buffer_[0]; This CL changes audio_data from uint8* to int16* but line 157 still adds "bytes_per_10_msec" to audio_data. Is the code incorrect, or is the name "bytes_per_10_msec" incorrect? This was identified as suspicious during testing of VS 2015's /analyze feature.
Message was sent while issue was closed.
tommi@chromium.org changed reviewers: + henrika@chromium.org
Message was sent while issue was closed.
https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:147: int16* audio_data = &render_buffer_[0]; On 2016/01/30 18:32:07, brucedawson wrote: > This CL changes audio_data from uint8* to int16* but line 157 still adds > "bytes_per_10_msec" to audio_data. Is the code incorrect, or is the name > "bytes_per_10_msec" incorrect? > > This was identified as suspicious during testing of VS 2015's /analyze feature. +Henrik A. Thanks Bruce, I think you've found a bug. Henrik, can you take a look at this and see when and how frequently we have a case of this loop repeating? I think it usually just executes once (i.e. no loop) and that could explain why we haven't noticed this.
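To make the flagged pattern concrete, here is a reduced illustration (not the real function): pointer arithmetic on an int16 pointer advances in elements, not bytes, so adding a byte count steps twice as far per iteration as intended.

#include <cstdint>
#include <vector>

void SuspiciousAdvance(std::vector<int16_t>* render_buffer,
                       int sample_rate, int channels) {
  const int samples_per_10_msec = (sample_rate / 100) * channels;
  const int bytes_per_10_msec = samples_per_10_msec * sizeof(int16_t);

  int16_t* audio_data = &(*render_buffer)[0];
  // The suspicious line: int16_t* arithmetic is in elements, so this skips
  // 2 * samples_per_10_msec samples. The intended advance would be
  //   audio_data += samples_per_10_msec;
  audio_data += bytes_per_10_msec;
  (void)audio_data;  // Silence unused-variable warnings in this sketch.
}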
Message was sent while issue was closed.
Sure, will take a look and get back.
Message was sent while issue was closed.
https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... File content/renderer/media/webrtc_audio_device_impl.cc (right): https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... content/renderer/media/webrtc_audio_device_impl.cc:147: int16* audio_data = &render_buffer_[0]; Added DCHECK_EQ(accumulated_audio_frames, audio_bus->frames()) << "OOOOOPS"; after accumulated_audio_samples += num_audio_samples to see if I could hit a case where the while loop would run more than once. I was not able to trigger this DCHECK (built for Debug). Tried on Mac OS X. Tried two different sample rates and the following clients: - Peerconnection audio loopback. - AppRTC loopback - Test RTC - Hangout Hence, I am unable to trigger a case where a while loop is needed.
Message was sent while issue was closed.
On 2016/02/01 14:46:46, henrika wrote: > https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... > File content/renderer/media/webrtc_audio_device_impl.cc (right): > > https://codereview.chromium.org/139303016/diff/1800001/content/renderer/media... > content/renderer/media/webrtc_audio_device_impl.cc:147: int16* audio_data = > &render_buffer_[0]; > Added > > DCHECK_EQ(accumulated_audio_frames, audio_bus->frames()) << "OOOOOPS"; > > after accumulated_audio_samples += num_audio_samples to see if I could hit a > case where the while loop would run more than once. I was not able to trigger > this DCHECK (built for Debug). > > Tried on Mac OS X. Tried two different sample rates and the following clients: > > - Peerconnection audio loopback. > - AppRTC loopback > - Test RTC > - Hangout > > Hence, I am unable to trigger a case where a while loop is needed. OK, that's good news. I took a look at the code in WebRTC and from the looks of things, looping should never be needed. There's an assert here: https://code.google.com/p/chromium/codesearch#chromium/src/third_party/webrtc... |assert(number_of_frames == audioFrame_.samples_per_channel_);| Where |number_of_frames| is equivalent to |samples_per_10_msec| here and |audioFrame_.samples_per_channel_| is what will be assigned to |num_audio_samples|. This suggests that we should always get exactly what we ask for. Can you run this by hlundin@ to be absolutely sure? If we can get that confirmed, I think we should remove the loop and replace it with a CHECK (or DCHECK).
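A sketch of what the proposed simplification might look like, assuming (per the investigation above) that each render callback covers exactly one 10 ms chunk. PullRenderAudio() is a hypothetical stand-in for the real voice-engine call, and the media::AudioBus call follows the same assumption as the earlier sketch.

#include <cstdint>
#include <vector>
#include "base/logging.h"
#include "media/base/audio_bus.h"

// Hypothetical stand-in for the call into the WebRTC voice engine.
void PullRenderAudio(int16_t* audio_data, int number_of_samples);

void RenderDataSimplifiedSketch(media::AudioBus* audio_bus, int sample_rate) {
  const int samples_per_10_msec = (sample_rate / 100) * audio_bus->channels();
  // WebRTC already asserts that it hands back exactly the 10 ms it was asked
  // for, so the accumulation loop becomes a single pull plus a CHECK.
  CHECK_EQ(audio_bus->frames() * audio_bus->channels(), samples_per_10_msec);

  std::vector<int16_t> render_buffer(samples_per_10_msec);
  PullRenderAudio(&render_buffer[0], samples_per_10_msec);
  audio_bus->FromInterleaved(&render_buffer[0], audio_bus->frames(),
                             sizeof(render_buffer[0]));
}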
Message was sent while issue was closed.
Nice find, that seems to indicate that the while loop is not required. I will check with hlundin@ and upload a change after that.
Message was sent while issue was closed.
Description was changed from ========== Feed the render data to MediaStreamAudioProcessor and used AudioBus in render callback. This CL is mainly to feed the render data to MediaStreamAudioProcessor. But since it is better to have all the data format as media::AudioBus in chrome, so I changed the WebRtcAudioRendererSource::RenderData(int16*) to WebRtcAudioRendererSource::RenderData(media::AudioBus*), and also make MediaStreamAudioProcessor use WebRtcAudioRendererSource interface to get the render data. BUG=264611 TEST=content_unittests and AEC works when chrome ----enable-audio-track-processing Committed: https://src.chromium.org/viewvc/chrome?view=rev&revision=252554 ========== to ========== Feed the render data to MediaStreamAudioProcessor and used AudioBus in render callback. This CL is mainly to feed the render data to MediaStreamAudioProcessor. But since it is better to have all the data format as media::AudioBus in chrome, so I changed the WebRtcAudioRendererSource::RenderData(int16*) to WebRtcAudioRendererSource::RenderData(media::AudioBus*), and also make MediaStreamAudioProcessor use WebRtcAudioRendererSource interface to get the render data. BUG=264611 TEST=content_unittests and AEC works when chrome ----enable-audio-track-processing Committed: https://src.chromium.org/viewvc/chrome?view=rev&revision=252554 ==========
Message was sent while issue was closed.
henrika@chromium.org changed reviewers: + hlundin@chromium.org
Message was sent while issue was closed.
+Henrik Lundin (hlundin@)
Message was sent while issue was closed.
On 2016/02/01 15:40:30, henrika wrote: > Nice find, that seems to indicate that the while loop is not required. That explains why the incorrect addition never caused problems. /analyze reported some other warnings of the same type, I guess I'll look at those also. Thanks for investigating.
Message was sent while issue was closed.
On 2016/02/01 15:40:30, henrika wrote: > Nice find, that seems to indicate that the while loop is not required. > I will check with hlundin@ and upload a change after that. I don't know anything about this code. Sorry.
Message was sent while issue was closed.
On 2016/02/15 14:16:54, hlundin1 wrote: > On 2016/02/01 15:40:30, henrika wrote: > > Nice find, that seems to indicate that the while loop is not required. > > I will check with hlundin@ and upload a change after that. > > I don't know anything about this code. Sorry. The question for you was more for the code in VoE. However, as per comment #36 I found that these assumptions are already being asserted inside of webrtc and that's what we wanted to know. |