Autonomy Software C++ 24.5.1
Welcome to the Autonomy Software repository of the Mars Rover Design Team (MRDT) at Missouri University of Science and Technology (Missouri S&T)! This API reference contains the source code and other resources for the development of the autonomy software for our Mars rover. The Autonomy Software project aims to compete in the University Rover Challenge (URC) by demonstrating advanced autonomous capabilities and robust navigation algorithms.
WebRTC Class Reference

This class is used to establish a connection with the RoveSoSimulator and retrieve video streams from it. More...

#include <WebRTC.h>

Collaboration diagram for WebRTC:

Public Member Functions

 WebRTC (const std::string &szSignallingServerURL, const std::string &szStreamerID)
 Construct a new WebRTC::WebRTC object.
 
 ~WebRTC ()
 Destroy the WebRTC::WebRTC object.
 
void CloseConnection ()
 Close the WebRTC connection.
 
void SetOnFrameReceivedCallback (std::function< void(cv::Mat &)> fnOnFrameReceivedCallback, const AVPixelFormat eOutputPixelFormat=AV_PIX_FMT_BGR24)
 Set the callback function for when a new frame is received.
 
bool GetIsConnected () const
 Get the connection status of the WebRTC object.
 

Private Member Functions

bool ConnectToSignallingServer (const std::string &szSignallingServerURL)
 Connects to the Unreal Engine 5 hosted signalling server for WebRTC negotiation.
 
bool InitializeH264Decoder ()
 Initialize the H264 decoder. Creates the AVCodecContext, AVFrame, and AVPacket.
 
bool DecodeH264BytesToCVMat (const std::vector< uint8_t > &vH264EncodedBytes, cv::Mat &cvDecodedFrame, const AVPixelFormat eOutputPixelFormat)
 Decodes H264 encoded bytes to a cv::Mat using FFmpeg.
 
bool RequestKeyFrame ()
 Requests a key frame from the given video track. This is useful when the video track is out of sync or has lost frames.
 
bool SendCommandToStreamer (const std::string &szCommand)
 This method sends a command to the streamer via the data channel. The command is a JSON string that is sent as a binary message. The PixelStreaming plugin handles commands in an unusual way, so the command must be encoded in a specific format; this method handles that encoding.
 

Private Attributes

std::string m_szSignallingServerURL
 
std::string m_szStreamerID
 
std::shared_ptr< rtc::WebSocket > m_pWebSocket
 
std::shared_ptr< rtc::PeerConnection > m_pPeerConnection
 
std::shared_ptr< rtc::DataChannel > m_pDataChannel
 
std::shared_ptr< rtc::Track > m_pVideoTrack1
 
std::shared_ptr< rtc::H264RtpDepacketizer > m_pTrack1H264DepacketizationHandler
 
std::shared_ptr< rtc::RtcpReceivingSession > m_pTrack1RtcpReceivingSession
 
std::chrono::system_clock::time_point m_tmLastKeyFrameRequestTime
 
AVCodecContext * m_pAVCodecContext
 
AVFrame * m_pFrame
 
AVPacket * m_pPacket
 
SwsContext * m_pSWSContext
 
AVPixelFormat m_eOutputPixelFormat
 
std::shared_mutex m_muDecoderMutex
 
cv::Mat m_cvFrame
 
std::function< void(cv::Mat &)> m_fnOnFrameReceivedCallback
 

Detailed Description

This class is used to establish a connection with the RoveSoSimulator and retrieve video streams from it.

Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-11-30

Constructor & Destructor Documentation

◆ WebRTC()

WebRTC::WebRTC ( const std::string &  szSignallingServerURL,
const std::string &  szStreamerID 
)

Construct a new WebRTC::WebRTC object.

Parameters
szSignallingServerURL- The URL of the signalling server.
szStreamerID- The ID of the streamer.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-12-02
29{
30 // Set member variables.
31 m_szSignallingServerURL = szSignallingServerURL;
32 m_szStreamerID = szStreamerID;
33 m_tmLastKeyFrameRequestTime = std::chrono::system_clock::now();
34
35 // Setup the FFMPEG H264 decoder.
36 this->InitializeH264Decoder();
37
38 // Enable logging from the WebRTC LibDataChannel library for debugging.
39 // rtc::InitLogger(rtc::LogLevel::Verbose);
40
41 // Construct the WebRTC peer connection and data channel for receiving data from the simulator.
42 rtc::WebSocket::Configuration rtcWebSocketConfig;
43 rtc::Configuration rtcPeerConnectionConfig;
44 rtcPeerConnectionConfig.forceMediaTransport = true;
45 rtcPeerConnectionConfig.maxMessageSize = 100000000;
46 m_pWebSocket = std::make_shared<rtc::WebSocket>(rtcWebSocketConfig);
47 m_pPeerConnection = std::make_shared<rtc::PeerConnection>(rtcPeerConnectionConfig);
48 m_pDataChannel = m_pPeerConnection->createDataChannel("webrtc-datachannel");
49
50 // Attempt to connect to the signalling server.
51 this->ConnectToSignallingServer(szSignallingServerURL);
52}
bool InitializeH264Decoder()
Initialize the H264 decoder. Creates the AVCodecContext, AVFrame, and AVPacket.
Definition WebRTC.cpp:573
bool ConnectToSignallingServer(const std::string &szSignallingServerURL)
Connects to the Unreal Engine 5 hosted signalling server for WebRTC negotiation.
Definition WebRTC.cpp:177
Here is the call graph for this function:

◆ ~WebRTC()

WebRTC::~WebRTC ( )

Destroy the WebRTC::WebRTC object.

Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-12-02
62{
63 this->CloseConnection();
64
65 // Free the codec context.
66 if (m_pSWSContext)
67 {
68 sws_freeContext(m_pSWSContext);
69 }
70 if (m_pFrame)
71 {
72 av_frame_free(&m_pFrame);
73 }
74 if (m_pPacket)
75 {
76 av_packet_free(&m_pPacket);
77 }
78 if (m_pAVCodecContext)
79 {
80 avcodec_free_context(&m_pAVCodecContext);
81 }
82
83 // Set dangling pointers to nullptr.
84 m_pAVCodecContext = nullptr;
85 m_pFrame = nullptr;
86 m_pPacket = nullptr;
87 m_pSWSContext = nullptr;
88}
void CloseConnection()
Close the WebRTC connection.
Definition WebRTC.cpp:97
Here is the call graph for this function:

Member Function Documentation

◆ CloseConnection()

void WebRTC::CloseConnection ( )

Close the WebRTC connection.

Author
clayjay3 (claytonraycowen@gmail.com)
Date
2025-05-03
98{
99 // Close the WebRTC connections.
100 if (m_pVideoTrack1)
101 {
102 m_pVideoTrack1->resetCallbacks();
103 m_pVideoTrack1->close();
104 }
105 if (m_pDataChannel)
106 {
107 m_pDataChannel->resetCallbacks();
108 m_pDataChannel->close();
109 }
110 if (m_pPeerConnection)
111 {
112 m_pPeerConnection->resetCallbacks();
113 m_pPeerConnection->close();
114 }
115 if (m_pWebSocket)
116 {
117 m_pWebSocket->resetCallbacks();
118 m_pWebSocket->close();
119 }
120
121 // Wait for all connections to close.
122 while ((m_pVideoTrack1 && !m_pVideoTrack1->isClosed()) || (m_pDataChannel && !m_pDataChannel->isClosed()) || (m_pWebSocket && !m_pWebSocket->isClosed()))
123 {
124 std::this_thread::sleep_for(std::chrono::milliseconds(100));
125 }
126}
Here is the caller graph for this function:

◆ SetOnFrameReceivedCallback()

void WebRTC::SetOnFrameReceivedCallback ( std::function< void(cv::Mat &)>  fnOnFrameReceivedCallback,
const AVPixelFormat  eOutputPixelFormat = AV_PIX_FMT_BGR24 
)

Set the callback function for when a new frame is received.

Parameters
fnOnFrameReceivedCallback- The callback function to set.
eOutputPixelFormat- The output pixel format to use.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-12-02
138{
139 // Set the callback function.
140 m_fnOnFrameReceivedCallback = fnOnFrameReceivedCallback;
141 // Set the output pixel format.
142 m_eOutputPixelFormat = eOutputPixelFormat;
143}

◆ GetIsConnected()

bool WebRTC::GetIsConnected ( ) const

Get the connection status of the WebRTC object.

Returns
true - The WebRTC object is connected.
false - The WebRTC object is not connected.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-12-26
155{
156 // Check if the websocket shared pointer is valid.
157 if (m_pWebSocket != nullptr)
158 {
159 // Check if the websocket is open.
160 return m_pWebSocket->isOpen();
161 }
162
163 // Return false if the websocket is not valid.
164 return false;
165}
Here is the caller graph for this function:

◆ ConnectToSignallingServer()

bool WebRTC::ConnectToSignallingServer ( const std::string &  szSignallingServerURL)
private

Connects to the Unreal Engine 5 hosted signalling server for WebRTC negotiation.

Parameters
szSignallingServerURL- The full URL of the signalling server. Should be in the format of "ws://<IP>:<PORT>"
Returns
true - Successfully connected to the signalling server.
false - Failed to connect to the signalling server.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-11-11
178{
179 // Connect to the signalling server via a websocket to handle WebRTC negotiation and signalling.
180 m_pWebSocket->open(szSignallingServerURL);
181
183 // Set some callbacks on important events for the websocket.
185
186 // WebSocket has been opened.
187 m_pWebSocket->onOpen(
188 [this]()
189 {
190 // Submit logger message.
191 LOG_INFO(logging::g_qSharedLogger, "Connected to the signalling server via {}. Checking if stream {} exists...", m_szSignallingServerURL, m_szStreamerID);
192
193 // Request the streamer list from the server. This also kicks off the negotiation process.
194 nlohmann::json jsnStreamList;
195 jsnStreamList["type"] = "listStreamers";
196 m_pWebSocket->send(jsnStreamList.dump());
197 });
198
199 // WebSocket has been closed.
200 m_pWebSocket->onClosed(
201 [this]()
202 {
203 // Submit logger message.
204 LOG_INFO(logging::g_qSharedLogger, "Closed {} stream and disconnected from the signalling server.", m_szStreamerID);
205 });
206
207 // Handling signalling server messages. (offer/answer/ICE candidate)
208 m_pWebSocket->onMessage(
209 [this](std::variant<rtc::binary, rtc::string> rtcMessage)
210 {
211 try
212 {
213 // Create instance variables.
214 nlohmann::json jsnMessage;
215
216 // Check if data is of type rtc::string.
217 if (std::holds_alternative<rtc::string>(rtcMessage))
218 {
219 // Retrieve the string message
220 std::string szMessage = std::get<rtc::string>(rtcMessage);
221
222 // Parse the JSON message from the signaling server.
223 jsnMessage = nlohmann::json::parse(szMessage);
224 LOG_DEBUG(logging::g_qSharedLogger, "Received message from signalling server: {}", szMessage);
225 }
226 else if (std::holds_alternative<rtc::binary>(rtcMessage))
227 {
228 // Retrieve the binary message.
229 rtc::binary rtcBinaryData = std::get<rtc::binary>(rtcMessage);
230 // Print length of binary data.
231 LOG_DEBUG(logging::g_qSharedLogger, "Received binary data of length: {}", rtcBinaryData.size());
232
233 // Convert the binary data to a string.
234 std::string szBinaryDataStr(reinterpret_cast<const char*>(rtcBinaryData.data()), rtcBinaryData.size());
235 // Print the binary data as a string.
236 LOG_DEBUG(logging::g_qSharedLogger, "Received binary data: {}", szBinaryDataStr);
237 // Parse the binary data as JSON.
238 jsnMessage = nlohmann::json::parse(szBinaryDataStr);
239 }
240 else
241 {
242 LOG_ERROR(logging::g_qSharedLogger, "Received unknown message type from signalling server");
243 }
244
245 // Check if the message contains a type.
246 if (jsnMessage.contains("type"))
247 {
248 std::string szType = jsnMessage["type"];
249 // If the message from the server is a config message, do nothing.
250 if (szType == "config")
251 {
252 // Submit logger message.
253 LOG_DEBUG(logging::g_qSharedLogger, "Received config message from signalling server: {}", jsnMessage.dump());
254 }
255 // If the message from the server is an offer, set the remote description offer.
256 else if (szType == "offer")
257 {
258 // Get the SDP offer and set it as the remote description.
259 std::string sdp = jsnMessage["sdp"];
260 m_pPeerConnection->setRemoteDescription(rtc::Description(sdp, "offer"));
261 LOG_DEBUG(logging::g_qSharedLogger, "Processing SDP offer from signalling server: {}", sdp);
262 }
263 // If the message from the server is an answer, set the remote description answer.
264 else if (szType == "answer")
265 {
266 // Get the SDP answer and set it as the remote description.
267 std::string sdp = jsnMessage["sdp"];
268 m_pPeerConnection->setRemoteDescription(rtc::Description(sdp, "answer"));
269 LOG_DEBUG(logging::g_qSharedLogger, "Processing SDP answer from signalling server: {}", sdp);
270 }
271 // If the message from the server is advertising an ICE candidate, add it to the peer connection.
272 else if (szType == "iceCandidate")
273 {
274 // Handle ICE candidate
275 nlohmann::json jsnCandidate = jsnMessage["candidate"];
276 std::string szCandidateStr = jsnCandidate["candidate"];
277
278 rtc::Candidate rtcCandidate = rtc::Candidate(szCandidateStr);
279 m_pPeerConnection->addRemoteCandidate(rtcCandidate);
280 LOG_DEBUG(logging::g_qSharedLogger, "Added ICE candidate to peer connection: {}", szCandidateStr);
281 }
282 else if (szType == "streamerList")
283 {
284 // Print the streamer list.
285 LOG_DEBUG(logging::g_qSharedLogger, "Streamer List: {}", jsnMessage.dump());
286
287 // Check that the streamer ID given by the user is in the streamer list.
288 if (jsnMessage.contains("ids"))
289 {
290 std::vector<std::string> streamerList = jsnMessage["ids"].get<std::vector<std::string>>();
291 if (std::find(streamerList.begin(), streamerList.end(), m_szStreamerID) != streamerList.end())
292 {
293 // Send what stream we want to the server.
294 nlohmann::json jsnStream;
295 jsnStream["type"] = "subscribe";
296 jsnStream["streamerId"] = m_szStreamerID;
297 m_pWebSocket->send(jsnStream.dump());
298 // Submit logger message.
299 LOG_DEBUG(logging::g_qSharedLogger, "Streamer ID {} found in the streamer list. Subscribing to stream...", m_szStreamerID);
300 }
301 else
302 {
303 LOG_ERROR(logging::g_qSharedLogger, "Streamer ID {} not found in the streamer list!", m_szStreamerID);
304 }
305 }
306 else
307 {
308 LOG_ERROR(logging::g_qSharedLogger, "Streamer list does not contain 'ids' field!");
309 }
310 }
311 else
312 {
313 LOG_ERROR(logging::g_qSharedLogger, "Unknown message type received from signalling server: {}", szType);
314 }
315 }
316 }
317 catch (const std::exception& e)
318 {
319 // Submit logger message.
320 LOG_ERROR(logging::g_qSharedLogger, "Error occurred while negotiating with the Signalling Server: {}", e.what());
321 }
322 });
323
324 m_pWebSocket->onError(
325 [this](const std::string& szError)
326 {
327 // Submit logger message.
328 LOG_ERROR(logging::g_qSharedLogger, "Error occurred on WebSocket: {}", szError);
329 });
330
332 // Set some callbacks on important events for the peer connection.
334
335 m_pPeerConnection->onLocalDescription(
336 [this](rtc::Description rtcDescription)
337 {
338 // Check the type of the description.
339 if (rtcDescription.typeString() == "offer")
340 {
341 return;
342 }
343
344 // First, let's send some preconfiguration settings.
345 nlohmann::json jsnConfigMessage;
346 jsnConfigMessage["type"] = "layerPreference";
347 jsnConfigMessage["spatialLayer"] = 0;
348 jsnConfigMessage["temporalLayer"] = 0;
349 jsnConfigMessage["playerId"] = "";
350 m_pWebSocket->send(jsnConfigMessage.dump());
351
352 // Send the local description to the signalling server
353 nlohmann::json jsnMessage;
354 jsnMessage["type"] = rtcDescription.typeString();
355 // This next part is specific to the Unreal Engine 5 signalling server: we must append the min/max bitrate to the message.
356 jsnMessage["minBitrateBps"] = 0;
357 jsnMessage["maxBitrateBps"] = 0;
358 // Here's our actual SDP.
359 std::string szSDP = rtcDescription.generateSdp();
360 // Munge the SDP to add the bitrate.
361 std::string szMungedSDP =
362 std::regex_replace(szSDP, std::regex("(a=fmtp:\\d+ level-asymmetry-allowed=.*)\r\n"), "$1;x-google-start-bitrate=10000;x-google-max-bitrate=100000\r\n");
363 jsnMessage["sdp"] = szMungedSDP;
364 // Send the message.
365 m_pWebSocket->send(jsnMessage.dump());
366
367 // Submit logger message.
368 LOG_DEBUG(logging::g_qSharedLogger, "Sending local description to signalling server: {}", jsnMessage.dump());
369 });
370
371 m_pPeerConnection->onTrack(
372 [this](std::shared_ptr<rtc::Track> rtcTrack)
373 {
374 // Submit logger message.
375 rtc::Description::Media rtcMediaDescription = rtcTrack->description();
376 // Get some information about the track.
377 std::string szMediaType = rtcMediaDescription.type();
378
379 // Check if the track is a video track.
380 if (szMediaType != "video")
381 {
382 return;
383 }
384
385 // Set member variable to the video track.
386 m_pVideoTrack1 = rtcTrack;
387
388 // Create a H264 depacketization handler and rtcp receiving session.
389 m_pTrack1H264DepacketizationHandler = std::make_shared<rtc::H264RtpDepacketizer>(rtc::NalUnit::Separator::LongStartSequence);
390 m_pTrack1RtcpReceivingSession = std::make_shared<rtc::RtcpReceivingSession>();
391 m_pTrack1H264DepacketizationHandler->addToChain(m_pTrack1RtcpReceivingSession);
392 m_pVideoTrack1->setMediaHandler(m_pTrack1H264DepacketizationHandler);
393
394 // Set the onMessage callback for the video track.
395 m_pVideoTrack1->onFrame(
396 [this](rtc::binary rtcBinaryMessage, rtc::FrameInfo rtcFrameInfo)
397 {
398 // Assuming 96 is the H264 payload type.
399 if (rtcFrameInfo.payloadType == 96)
400 {
401 // Change the rtc::Binary (std::vector<std::byte>) to a std::vector<uint8_t>.
402 std::vector<uint8_t> vH264EncodedBytes;
403 vH264EncodedBytes.reserve(rtcBinaryMessage.size());
404 for (std::byte stdByte : rtcBinaryMessage)
405 {
406 vH264EncodedBytes.push_back(static_cast<uint8_t>(stdByte));
407 }
408
409 // Acquire a mutex lock on the shared_mutex before calling sws_scale.
410 std::unique_lock<std::shared_mutex> lkDecoderLock(m_muDecoderMutex);
411 // Pass to FFmpeg decoder
412 this->DecodeH264BytesToCVMat(vH264EncodedBytes, m_cvFrame, m_eOutputPixelFormat);
413
414 // Check if the callback function is set before calling it.
415 if (m_fnOnFrameReceivedCallback)
416 {
417 // Call the user's callback function.
418 m_fnOnFrameReceivedCallback(m_cvFrame);
419 }
420 // Release the lock on the shared_mutex.
421 lkDecoderLock.unlock();
422 }
423 });
424 });
425
426 m_pPeerConnection->onGatheringStateChange(
427 [this](rtc::PeerConnection::GatheringState eGatheringState)
428 {
429 // Switch to translate the state to a string.
430 switch (eGatheringState)
431 {
432 case rtc::PeerConnection::GatheringState::Complete: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE gathering state changed to: Complete"); break;
433
434 case rtc::PeerConnection::GatheringState::InProgress:
435 LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE gathering state changed to: InProgress");
436 break;
437 case rtc::PeerConnection::GatheringState::New: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE gathering state changed to: New"); break;
438 default: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection ICE gathering state changed to: Unknown"); break;
439 }
440 });
441
442 m_pPeerConnection->onIceStateChange(
443 [this](rtc::PeerConnection::IceState eIceState)
444 {
445 // Switch to translate the state to a string.
446 switch (eIceState)
447 {
448 case rtc::PeerConnection::IceState::Checking: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Checking"); break;
449 case rtc::PeerConnection::IceState::Closed: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Closed"); break;
450 case rtc::PeerConnection::IceState::Completed: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Completed"); break;
451 case rtc::PeerConnection::IceState::Connected: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Connected"); break;
452 case rtc::PeerConnection::IceState::Disconnected: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Disconnected"); break;
453 case rtc::PeerConnection::IceState::Failed: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: Failed"); break;
454 case rtc::PeerConnection::IceState::New: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection ICE state changed to: New"); break;
455 default: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection ICE state changed to: Unknown"); break;
456 }
457 });
458 m_pPeerConnection->onSignalingStateChange(
459 [this](rtc::PeerConnection::SignalingState eSignalingState)
460 {
461 // Switch to translate the state to a string.
462 switch (eSignalingState)
463 {
464 case rtc::PeerConnection::SignalingState::HaveLocalOffer:
465 {
466 LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection signaling state changed to: HaveLocalOffer");
467 break;
468 }
469 case rtc::PeerConnection::SignalingState::HaveLocalPranswer:
470 LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection signaling state changed to: HaveLocalPranswer");
471 break;
472 case rtc::PeerConnection::SignalingState::HaveRemoteOffer:
473 LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection signaling state changed to: HaveRemoteOffer");
474 break;
475 case rtc::PeerConnection::SignalingState::HaveRemotePranswer:
476 LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection signaling state changed to: HaveRemotePrAnswer");
477 break;
478 case rtc::PeerConnection::SignalingState::Stable: LOG_DEBUG(logging::g_qSharedLogger, "PeerConnection signaling state changed to: Stable"); break;
479 default: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection signaling state changed to: Unknown"); break;
480 }
481 });
482
483 m_pPeerConnection->onStateChange(
484 [this](rtc::PeerConnection::State eState)
485 {
486 // Switch to translate the state to a string.
487 switch (eState)
488 {
489 case rtc::PeerConnection::State::Closed: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Closed"); break;
490 case rtc::PeerConnection::State::Connected: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Connected"); break;
491 case rtc::PeerConnection::State::Connecting: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Connecting"); break;
492 case rtc::PeerConnection::State::Disconnected: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Disconnected"); break;
493 case rtc::PeerConnection::State::Failed: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Failed"); break;
494 case rtc::PeerConnection::State::New: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: New"); break;
495 default: LOG_DEBUG(logging::g_qSharedLogger, "Peer connection state changed to: Unknown"); break;
496 }
497 });
498
500 // Set some callbacks on important events for the data channel.
502
503 m_pDataChannel->onOpen(
504 [this]()
505 {
506 // Submit logger message.
507 LOG_INFO(logging::g_qSharedLogger, "Data channel opened.");
508
509 // Request quality control of the stream.
510 m_pDataChannel->send(std::string(1, static_cast<char>(1)));
511 });
512
513 m_pDataChannel->onMessage(
514 [this](std::variant<rtc::binary, rtc::string> rtcMessage)
515 {
516 try
517 {
518 // Create instance variables.
519 nlohmann::json jsnMessage;
520
521 // Check if data is of type rtc::string.
522 if (std::holds_alternative<rtc::string>(rtcMessage))
523 {
524 // Retrieve the string message
525 std::string szMessage = std::get<rtc::string>(rtcMessage);
526
527 // Parse the JSON message from the signaling server.
528 jsnMessage = nlohmann::json::parse(szMessage);
529 LOG_NOTICE(logging::g_qSharedLogger, "DATA_CHANNEL Received message from peer: {}", szMessage);
530 }
531 else if (std::holds_alternative<rtc::binary>(rtcMessage))
532 {
533 // Retrieve the binary message.
534 rtc::binary rtcBinaryData = std::get<rtc::binary>(rtcMessage);
535
536 // Convert the binary data to a string, ignoring non-printable characters.
537 std::string szBinaryDataStr;
538 for (auto byte : rtcBinaryData)
539 {
540 if (std::isprint(static_cast<unsigned char>(byte)))
541 {
542 szBinaryDataStr += static_cast<char>(byte);
543 }
544 }
545
546 // Print the binary data as a string.
547 LOG_DEBUG(logging::g_qSharedLogger, "DATA_CHANNEL Received binary data ({} bytes): {}", rtcBinaryData.size(), szBinaryDataStr);
548 }
549 else
550 {
551 LOG_ERROR(logging::g_qSharedLogger, "Received unknown message type from peer");
552 }
553 }
554 catch (const std::exception& e)
555 {
556 // Submit logger message.
557 LOG_ERROR(logging::g_qSharedLogger, "Error occurred while negotiating with the datachannel: {}", e.what());
558 }
559 });
560
561 return true;
562}
bool DecodeH264BytesToCVMat(const std::vector< uint8_t > &vH264EncodedBytes, cv::Mat &cvDecodedFrame, const AVPixelFormat eOutputPixelFormat)
Decodes H264 encoded bytes to a cv::Mat using FFmpeg.
Definition WebRTC.cpp:635
Here is the call graph for this function:
Here is the caller graph for this function:

◆ InitializeH264Decoder()

bool WebRTC::InitializeH264Decoder ( )
private

Initialize the H264 decoder. Creates the AVCodecContext, AVFrame, and AVPacket.

Returns
true - Successfully initialized the H264 decoder.
false - Failed to initialize the H264 decoder.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-12-27
574{
575 // Configure logging level from FFMPEG library.
576 av_log_set_level(AV_LOG_DEBUG);
577
578 // Find the H264 decoder
579 const AVCodec* avCodec = avcodec_find_decoder(AV_CODEC_ID_H264);
580 if (!avCodec)
581 {
582 LOG_ERROR(logging::g_qSharedLogger, "H264 codec not found!");
583 return false;
584 }
585 // Create codec context
586 m_pAVCodecContext = avcodec_alloc_context3(avCodec);
587 if (!m_pAVCodecContext)
588 {
589 LOG_ERROR(logging::g_qSharedLogger, "Failed to allocate codec context!");
590 return false;
591 }
592 // Set codec context options.
593 m_pAVCodecContext->flags |= AV_CODEC_FLAG2_FAST;
594 m_pAVCodecContext->err_recognition = AV_EF_COMPLIANT | AV_EF_CAREFUL;
595 m_pAVCodecContext->rc_buffer_size = 50 * 1024 * 1024; // 50 MB buffer size.
596 av_opt_set_int(m_pAVCodecContext, "refcounted_frames", 1, 0);
597 av_opt_set_int(m_pAVCodecContext, "error_concealment", FF_EC_GUESS_MVS | FF_EC_DEBLOCK, 0);
598 av_opt_set_int(m_pAVCodecContext, "threads", 4, 0);
599
600 // Open the codec
601 if (avcodec_open2(m_pAVCodecContext, avCodec, nullptr) < 0)
602 {
603 LOG_ERROR(logging::g_qSharedLogger, "Failed to open codec!");
604 return false;
605 }
606
607 // Allocate the AVPacket.
608 m_pPacket = av_packet_alloc();
609 // Allocate the AVFrame.
610 m_pFrame = av_frame_alloc();
611 if (!m_pPacket || !m_pFrame)
612 {
613 LOG_ERROR(logging::g_qSharedLogger, "Failed to allocate packet or frame!");
614 return false;
615 }
616
617 // Set the SWSScale context to nullptr.
618 m_pSWSContext = nullptr;
619
620 return true;
621}
Here is the caller graph for this function:

◆ DecodeH264BytesToCVMat()

bool WebRTC::DecodeH264BytesToCVMat ( const std::vector< uint8_t > &  vH264EncodedBytes,
cv::Mat &  cvDecodedFrame,
const AVPixelFormat  eOutputPixelFormat 
)
private

Decodes H264 encoded bytes to a cv::Mat using FFmpeg.

Parameters
vH264EncodedBytes- The H264 encoded bytes.
cvDecodedFrame- The decoded frame.
eOutputPixelFormat- The output pixel format.
Returns
true - Frame was successfully decoded.
false - Frame was not successfully decoded.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-11-16
636{
637 // Initialize packet data.
638 m_pPacket->data = const_cast<uint8_t*>(vH264EncodedBytes.data());
639 m_pPacket->size = static_cast<int>(vH264EncodedBytes.size());
640
641 // Send the packet to the decoder.
642 int nReturnCode = avcodec_send_packet(m_pAVCodecContext, m_pPacket);
643 if (nReturnCode < 0)
644 {
645 // Get the error message.
646 char aErrorBuffer[AV_ERROR_MAX_STRING_SIZE];
647 av_strerror(nReturnCode, aErrorBuffer, AV_ERROR_MAX_STRING_SIZE);
648 // Submit logger message.
649 LOG_NOTICE(logging::g_qSharedLogger,
650 "Failed to send packet to decoder! Error code: {} {}. This is not a serious problem and is likely just because some of the UDP RTP packets didn't "
651 "make it to us.",
652 nReturnCode,
653 aErrorBuffer);
654 // Request a new keyframe from the video track.
655 this->RequestKeyFrame();
656
657 return false;
658 }
659
660 // Receive decoded frames in a loop
661 while (true)
662 {
663 nReturnCode = avcodec_receive_frame(m_pAVCodecContext, m_pFrame);
664 if (nReturnCode == AVERROR(EAGAIN) || nReturnCode == AVERROR_EOF)
665 {
666 // No more frames available in stream.
667 break;
668 }
669 else if (nReturnCode < 0)
670 {
671 // Get the error message.
672 char aErrorBuffer[AV_ERROR_MAX_STRING_SIZE];
673 av_strerror(nReturnCode, aErrorBuffer, AV_ERROR_MAX_STRING_SIZE);
674 // Submit logger message.
675 LOG_WARNING(logging::g_qSharedLogger, "Failed to receive frame from decoder! Error code: {} {}", nReturnCode, aErrorBuffer);
676 // Request a new keyframe from the video track.
677 this->RequestKeyFrame();
678
679 return false;
680 }
681
682 // Check if the user wants to keep the YUV420P data unaltered.
683 if (eOutputPixelFormat == AV_PIX_FMT_YUV420P)
684 {
685 // The frame received from the FFMPEG H264 decoder is already in YUV420P format.
686 // We want to keep the raw YUV420P byte data unaltered, but store that data in an RGB 3-channel Mat.
687 // Absolutely no colorspace conversion or the binary data will be corrupted.
688
689 // Extract the Y, U, and V planes.
690 cv::Mat cvYPlane(m_pFrame->height, m_pFrame->width, CV_8UC1, m_pFrame->data[0]);
691 cv::Mat cvUPlane(m_pFrame->height / 2, m_pFrame->width / 2, CV_8UC1, m_pFrame->data[1]);
692 cv::Mat cvVPlane(m_pFrame->height / 2, m_pFrame->width / 2, CV_8UC1, m_pFrame->data[2]);
693 // Upsample the U and V planes to match the Y plane.
694 cv::Mat cvUPlaneUpsampled, cvVPlaneUpsampled;
695 cv::resize(cvUPlane, cvUPlaneUpsampled, cv::Size(m_pFrame->width, m_pFrame->height), 0, 0, cv::INTER_NEAREST);
696 cv::resize(cvVPlane, cvVPlaneUpsampled, cv::Size(m_pFrame->width, m_pFrame->height), 0, 0, cv::INTER_NEAREST);
697 // Merge the Y, U, and V planes into a single 3 channel Mat.
698 std::vector<cv::Mat> vYUVPlanes = {cvYPlane, cvUPlaneUpsampled, cvVPlaneUpsampled};
699 cv::merge(vYUVPlanes, cvDecodedFrame);
700 }
701 else
702 {
703 // Convert the decoded frame to cv::Mat using sws_scale.
704 if (m_pSWSContext == nullptr)
705 {
706 m_pSWSContext = sws_getContext(m_pFrame->width,
707 m_pFrame->height,
708 static_cast<AVPixelFormat>(m_pFrame->format),
709 m_pFrame->width,
710 m_pFrame->height,
711 eOutputPixelFormat,
712 SWS_FAST_BILINEAR,
713 nullptr,
714 nullptr,
715 nullptr);
716 if (m_pSWSContext == nullptr)
717 {
718 // Submit logger message.
719 LOG_WARNING(logging::g_qSharedLogger, "Failed to initialize SwsContext!");
720 // Request a new keyframe from the video track.
721 this->RequestKeyFrame();
722
723 return false;
724 }
725
726 // Allocate buffer for the frame's data
727 int nRetCode = av_image_alloc(m_pFrame->data, m_pFrame->linesize, m_pFrame->width, m_pFrame->height, static_cast<AVPixelFormat>(m_pFrame->format), 32);
728 if (nRetCode < 0)
729 {
730 // Submit logger message.
731 LOG_WARNING(logging::g_qSharedLogger, "Failed to allocate image buffer!");
732 return false;
733 }
734 }
735
736 // Create new mat for the decoded frame.
737 cvDecodedFrame.create(m_pFrame->height, m_pFrame->width, CV_8UC3);
738 std::array<uint8_t*, 4> aDest = {cvDecodedFrame.data, nullptr, nullptr, nullptr};
739 std::array<int, 4> aDestLinesize = {static_cast<int>(cvDecodedFrame.step[0]), 0, 0, 0};
740
741 // Convert the frame to the output pixel format.
742 sws_scale(m_pSWSContext, m_pFrame->data, m_pFrame->linesize, 0, m_pFrame->height, aDest.data(), aDestLinesize.data());
743 }
744
745 // Calculate the time since the last key frame request.
746 std::chrono::duration<double> tmTimeSinceLastKeyFrameRequest = std::chrono::system_clock::now() - m_tmLastKeyFrameRequestTime;
747 // Check if the time since the last key frame request is greater than the key frame request interval.
748 if (tmTimeSinceLastKeyFrameRequest.count() > 1.0)
749 {
750 // Request a new key frame from the video track.
751 // this->RequestKeyFrame();
752
753 // Set the QP factor to 0. (Max quality)
754 this->SendCommandToStreamer("{\"Encoder.MaxQP\":" + std::to_string(constants::SIM_WEBRTC_QP) + "}");
755 // Set the bitrate limits.
756 this->SendCommandToStreamer(R"({"WebRTC.MinBitrate":99999})");
757 this->SendCommandToStreamer(R"({"WebRTC.MaxBitrate":99999999})");
758 // Set FPS to 30.
759 this->SendCommandToStreamer(R"({"WebRTC.Fps":30})");
760 this->SendCommandToStreamer(R"({"WebRTC.MaxFps":30})");
 761        // Pin the encoder's target bitrate; the automatic target bitrate has been the source of the quality issues.
762 this->SendCommandToStreamer(R"({"Encoder.TargetBitrate":99999999})");
763
764 // Update the time of the last key frame request.
765 m_tmLastKeyFrameRequestTime = std::chrono::system_clock::now();
766 }
767 }
768
769 return true;
770}

◆ RequestKeyFrame()

bool WebRTC::RequestKeyFrame ( )
private

Requests a key frame from the given video track. This is useful when the video track is out of sync or has lost frames.

Returns
true - Key frame was successfully requested.
false - Key frame was not successfully requested.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-11-30
783{
784 // Check if the video track is valid.
785 if (!m_pVideoTrack1)
786 {
787 LOG_ERROR(logging::g_qSharedLogger, "Invalid video track!");
788 return false;
789 }
790
 791    // Request a key frame from the video track and log the result.
 792    const bool bSuccess = m_pVideoTrack1->requestKeyframe();
 793    LOG_DEBUG(logging::g_qSharedLogger, "Requested key frame from video track. Success?: {}", bSuccess);
 794
 795    return bSuccess;
796}

◆ SendCommandToStreamer()

bool WebRTC::SendCommandToStreamer ( const std::string &  szCommand)
private

This method sends a command to the streamer via the data channel. The command is a JSON string sent as a binary message. The PixelStreaming plugin expects commands in a specific binary encoding, and this method handles that encoding.

Parameters
szCommand - The command to send to the streamer.
Returns
true - Command was successfully sent.
false - Command was not successfully sent.
Note
This will only work for valid COMMANDS with ID of type 51. Check the PixelStreamingInfrastructure repo for more information. https://github.com/EpicGamesExt/PixelStreamingInfrastructure/blob/13ce022d3a09d315d4ca85c05b61a8d3fe92741c/Extras/JSStreamer/src/protocol.ts#L196
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2025-01-01
816{
817 // Check if the data channel is valid.
818 if (!m_pDataChannel)
819 {
820 LOG_ERROR(logging::g_qSharedLogger, "Invalid data channel!");
821 return false;
822 }
823
824 // Check if the command is empty.
825 if (szCommand.empty())
826 {
827 LOG_ERROR(logging::g_qSharedLogger, "Empty command!");
828 return false;
829 }
830
 831    // Build the binary COMMAND message expected by the PixelStreaming plugin.
 832    std::string szID(1, static_cast<char>(51)); // This is the ID for COMMAND.
 833    std::string szSize(1, static_cast<char>(szCommand.size()));
 834    std::string szFinal = szID + szSize + szCommand;
 835    // Loop through the string and build a binary message.
 836    rtc::binary rtcBinaryMessage;
 837    // First byte is the ID.
 838    rtcBinaryMessage.push_back(static_cast<std::byte>(szID[0]));
 839    // Next two bytes are the size, little-endian. (szSize[1] is the string's null terminator, so the high byte is 0 for commands under 256 characters.)
 840    rtcBinaryMessage.push_back(static_cast<std::byte>(szSize[0]));
 841    rtcBinaryMessage.push_back(static_cast<std::byte>(szSize[1]));
842 // The rest of the bytes are the command, but this is utf16 so we need to add a null byte before each character.
843 for (char cLetter : szCommand)
844 {
845 rtcBinaryMessage.push_back(static_cast<std::byte>(cLetter));
846 rtcBinaryMessage.push_back(static_cast<std::byte>(0));
847 }
848
849 // Send the binary message.
850 return m_pDataChannel->send(rtcBinaryMessage);
851}

The documentation for this class was generated from the following files: