// Copyright 2014 The Chromium Authors
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

#include "third_party/blink/renderer/modules/mediastream/media_stream_renderer_factory.h"

#include <utility>

#include "base/memory/scoped_refptr.h"
#include "base/task/sequenced_task_runner.h"
#include "base/task/single_thread_task_runner.h"
#include "third_party/blink/public/platform/modules/mediastream/web_media_stream.h"
#include "third_party/blink/public/platform/modules/webrtc/webrtc_logging.h"
#include "third_party/blink/public/platform/platform.h"
#include "third_party/blink/public/web/web_local_frame.h"
#include "third_party/blink/renderer/core/execution_context/execution_context.h"
#include "third_party/blink/renderer/core/frame/local_dom_window.h"
#include "third_party/blink/renderer/core/frame/local_frame.h"
#include "third_party/blink/renderer/modules/mediastream/media_stream_video_renderer_sink.h"
#include "third_party/blink/renderer/modules/mediastream/media_stream_video_track.h"
#include "third_party/blink/renderer/modules/mediastream/track_audio_renderer.h"
#include "third_party/blink/renderer/modules/peerconnection/peer_connection_dependency_factory.h"
#include "third_party/blink/renderer/modules/webrtc/webrtc_audio_device_impl.h"
#include "third_party/blink/renderer/modules/webrtc/webrtc_audio_renderer.h"
#include "third_party/blink/renderer/platform/mediastream/media_stream_audio_track.h"
#include "third_party/blink/renderer/platform/mediastream/media_stream_descriptor.h"
#include "third_party/blink/renderer/platform/webrtc/peer_connection_remote_audio_source.h"
#include "third_party/blink/renderer/platform/wtf/text/wtf_string.h"
#include "third_party/webrtc/api/media_stream_interface.h"
namespace blink {
namespace {
// Returns a valid session id if a single WebRTC capture device is currently
// open (and then the matching session_id), otherwise 0.
// This is used to pass on a session id to an audio renderer, so that audio will
// be rendered to a matching output device, should one exist.
// Note that if there are more than one open capture devices the function
// will not be able to pick an appropriate device and return 0.
base::UnguessableToken GetSessionIdForWebRtcAudioRenderer(
ExecutionContext& context) {
WebRtcAudioDeviceImpl* audio_device =
PeerConnectionDependencyFactory::From(context).GetWebRtcAudioDevice();
return audio_device
? audio_device->GetAuthorizedDeviceSessionIdForAudioRenderer()
: base::UnguessableToken();
}
void SendLogMessage(const WTF::String& message) {
WebRtcLogMessage("MSRF::" + message.Utf8());
}
} // namespace
MediaStreamRendererFactory::MediaStreamRendererFactory() = default;

MediaStreamRendererFactory::~MediaStreamRendererFactory() = default;
scoped_refptr<MediaStreamVideoRenderer>
MediaStreamRendererFactory::GetVideoRenderer(
    const WebMediaStream& web_stream,
    const MediaStreamVideoRenderer::RepaintCB& repaint_cb,
    scoped_refptr<base::SequencedTaskRunner> video_task_runner,
    scoped_refptr<base::SingleThreadTaskRunner> main_render_task_runner) {
  DCHECK(!web_stream.IsNull());
  DVLOG(1) << "MediaStreamRendererFactory::GetVideoRenderer stream:"
           << web_stream.Id().Utf8();

  MediaStreamDescriptor& descriptor = *web_stream;
  auto video_components = descriptor.VideoComponents();
  if (video_components.empty() ||
      !MediaStreamVideoTrack::GetTrack(
          WebMediaStreamTrack(video_components[0].Get()))) {
    return nullptr;
  }

  return base::MakeRefCounted<MediaStreamVideoRendererSink>(
      video_components[0].Get(), repaint_cb, std::move(video_task_runner),
      std::move(main_render_task_runner));
}
scoped_refptr<MediaStreamAudioRenderer>
MediaStreamRendererFactory::GetAudioRenderer(
    const WebMediaStream& web_stream,
    WebLocalFrame* web_frame,
    const WebString& device_id,
    base::RepeatingCallback<void()> on_render_error_callback) {
  DCHECK(!web_stream.IsNull());
  SendLogMessage(String::Format("%s({web_stream_id=%s}, {device_id=%s})",
                                __func__, web_stream.Id().Utf8().c_str(),
                                device_id.Utf8().c_str()));

  MediaStreamDescriptor& descriptor = *web_stream;
  auto audio_components = descriptor.AudioComponents();
  if (audio_components.empty()) {
    // The stream contains no audio tracks. Log an error message only if the
    // stream contains no video tracks either; without this extra check,
    // video-only streams would generate error messages at this stage, and we
    // want to avoid that.
    auto video_tracks = descriptor.VideoComponents();
    if (video_tracks.empty()) {
      SendLogMessage(String::Format(
          "%s => (ERROR: no audio tracks in media stream)", __func__));
    }
    return nullptr;
  }

  // TODO(crbug.com/400764478): We need to fix the data flow so that it works
  // the same way for all track implementations: local, remote, or otherwise.
  // In this function, we should simply create a renderer object that receives
  // and mixes audio from all the tracks that belong to the media stream.
  // For now, we have separate renderers depending on whether the first audio
  // track in the stream is local or remote.
  MediaStreamAudioTrack* audio_track =
      MediaStreamAudioTrack::From(audio_components[0].Get());
  if (!audio_track) {
    // This can happen if the track was cloned.
    // TODO(tommi, perkj): Fix cloning of tracks to handle extra data too.
    SendLogMessage(String::Format(
        "%s => (ERROR: no native track for WebMediaStreamTrack)", __func__));
    return nullptr;
  }

  auto* frame = To<LocalFrame>(WebLocalFrame::ToCoreFrame(*web_frame));
  DCHECK(frame);

  // If the track has a local source, or is a remote track that does not use
  // the WebRTC audio pipeline, return a new TrackAudioRenderer instance.
  if (!PeerConnectionRemoteAudioTrack::From(audio_track)) {
    // TODO(xians): Add support for the case where the media stream contains
    // multiple audio tracks.
    SendLogMessage(String::Format(
        "%s => (creating TrackAudioRenderer for %s audio track)", __func__,
        audio_track->is_local_track() ? "local" : "remote"));

    return base::MakeRefCounted<TrackAudioRenderer>(
        audio_components[0].Get(), *frame, String(device_id),
        std::move(on_render_error_callback));
  }

  // Get the AudioDevice associated with the frame where this track was
  // created, in case the track has been moved to, e.g., a same-origin iframe.
  // Without this, media can end up piped to a different audio device from the
  // one the control signals are sent to, leading to no audio being played
  // out - see crbug/1239207.
  WebLocalFrame* track_creation_frame =
      audio_components[0].Get()->CreationFrame();
  if (track_creation_frame) {
    frame = To<LocalFrame>(WebLocalFrame::ToCoreFrame(*track_creation_frame));
  }

  // This is a remote WebRTC media stream.
  WebRtcAudioDeviceImpl* audio_device =
      PeerConnectionDependencyFactory::From(*frame->DomWindow())
          .GetWebRtcAudioDevice();
  DCHECK(audio_device);
  SendLogMessage(String::Format(
      "%s => (media stream is a remote WebRTC stream)", __func__));

  // Share the existing renderer if any, otherwise create a new one.
  scoped_refptr<WebRtcAudioRenderer> renderer(audio_device->renderer());
  if (renderer) {
    SendLogMessage(String::Format(
        "%s => (using existing WebRtcAudioRenderer for remote stream)",
        __func__));
  } else {
    SendLogMessage(String::Format(
        "%s => (creating new WebRtcAudioRenderer for remote stream)",
        __func__));

    renderer = base::MakeRefCounted<WebRtcAudioRenderer>(
        PeerConnectionDependencyFactory::From(*frame->DomWindow())
            .GetWebRtcSignalingTaskRunner(),
        web_stream, *web_frame,
        GetSessionIdForWebRtcAudioRenderer(*frame->DomWindow()),
        String(device_id), std::move(on_render_error_callback));

    if (!audio_device->SetAudioRenderer(renderer.get())) {
      SendLogMessage(String::Format(
          "%s => (ERROR: WRADI::SetAudioRenderer failed)", __func__));
      return nullptr;
    }
  }

  auto ret = renderer->CreateSharedAudioRendererProxy(web_stream);
  if (!ret) {
    SendLogMessage(String::Format(
        "%s => (ERROR: CreateSharedAudioRendererProxy failed)", __func__));
  }
  return ret;
}

}  // namespace blink