1, Introduction
WebRTC concept
WebRTC is a project led by Google consisting of a set of standards, protocols, and JavaScript APIs for sharing audio, video, and data directly between browsers (peer-to-peer). WebRTC requires no plug-in installation: real-time communication becomes a standard browser capability reachable through a simple JavaScript API.
Why WebRTC
Major browsers have steadily increased their support for WebRTC. The following figure, from the official WebRTC website, shows the supported browsers and platforms.
2, Playing WebRTC in HTML5
After much searching, there is essentially no existing library that can directly play a webrtc:// URL. Most approaches implement transport and playback with WebSocket + WebRTC, and many use Emscripten to compile C codec libraries into JavaScript.
Fortunately, I found an open-source library that supports SRS WebRTC streams: jswebrtc. The code is as follows:
Play via HTML
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>WebRTCPlayer</title>
  <style type="text/css">
    html, body {
      background-color: #111;
      text-align: center;
    }
  </style>
</head>
<body>
  <div class="jswebrtc" data-url="webrtc://192.168.12.187/live/1"></div>
  <script type="text/javascript" src="jswebrtc.min.js"></script>
</body>
</html>
Play via JS
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>WebRTCPlayer</title>
</head>
<body>
  <video id="video-webrtc" controls></video>
  <script type="text/javascript" src="jswebrtc.min.js"></script>
  <script type="text/javascript">
    var video = document.getElementById('video-webrtc');
    var url = 'webrtc://192.168.12.187/live/1';
    var player = new JSWebrtc.Player(url, {
      video: video,
      autoplay: true,
      onPlay: (obj) => { console.log("start play") }
    });
  </script>
</body>
</html>
The playback effect is as follows:
Note that although SRS provides WebRTC protocol conversion, WebRTC runs over UDP. Possibly because UDP packet loss is severe and SRS does not handle it well, scenes with motion show visible corruption: wherever the picture is animated, the image breaks up into a mosaic. The real-time performance, however, is excellent, roughly on par with the low-latency RTMP protocol.
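Under the hood, a player like jswebrtc does not fetch the webrtc:// URL directly; it exchanges an SDP offer/answer with SRS over the SRS HTTP API and then receives media over an ordinary RTCPeerConnection. As a minimal sketch (assuming SRS defaults: HTTP API on port 1985, play path /rtc/v1/play/), the URL mapping looks like this:

```javascript
// Map an SRS-style webrtc:// stream URL to its HTTP signaling endpoint.
// Assumption: SRS default configuration (HTTP API port 1985, /rtc/v1/play/).
function srsPlayEndpoint(streamUrl) {
  const m = streamUrl.match(/^webrtc:\/\/([^/:]+)(?::(\d+))?(\/.+)$/);
  if (!m) {
    throw new Error('not a webrtc:// URL: ' + streamUrl);
  }
  const host = m[1];
  const apiPort = 1985; // SRS default HTTP API port (assumed)
  return {
    api: `http://${host}:${apiPort}/rtc/v1/play/`, // where the SDP offer is POSTed
    streamurl: streamUrl                           // echoed back in the request body
  };
}
```

The player then POSTs a JSON body containing the endpoint, the stream URL, and its local SDP offer to that address, and applies the returned SDP answer to its RTCPeerConnection.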
3, Extension
WebRTC is a standard protocol, unlike OCX (ActiveX) plug-ins, which work only in IE, or Flash plug-ins, which most browsers supported but have since dropped (notably, Google's Chrome no longer supports Flash after December 2020). The focus of plug-in-free real-time video playback therefore falls on WebRTC, and in the long run its characteristics (no plug-ins, a standard and widely implemented protocol) should keep it in broad use.
Since WebRTC is so important, and video capture and publishing are indispensable in security and internet live-streaming applications, capturing audio and video from a local camera cannot be avoided. Here is how to capture local audio and video with the WebRTC APIs. The HTML5 code is as follows:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="description" content="WebRTC code samples">
  <meta name="viewport" content="width=device-width, user-scalable=yes, initial-scale=1, maximum-scale=1">
  <meta itemprop="description" content="Client-side WebRTC code samples">
  <meta itemprop="name" content="WebRTC code samples">
  <meta name="mobile-web-app-capable" content="yes">
  <meta id="theme-color" name="theme-color" content="#ffffff">
  <base target="_blank">
  <title>MediaStream Recording</title>
</head>
<body>
  <div id="container">
    <h1><span>WebRTC example - Media recorder</span></h1>
    <video id="gum" playsinline autoplay muted></video>
    <video id="recorded" playsinline loop></video>
    <div>
      <button id="start">Turn on the camera</button>
      <button id="record" disabled class="off">Start recording</button>
      <button id="play" disabled>play</button>
      <button id="download" disabled>download</button>
    </div>
    <div>
      <h4>Media streaming constraint options:</h4>
      <p>Echo cancellation: <input type="checkbox" id="echoCancellation"></p>
    </div>
    <div>
      <span id="errorMsg"></span>
    </div>
  </div>
  <script async>
    'use strict';

    const mediaSource = new MediaSource();
    mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
    let mediaRecorder;
    let recordedBlobs;
    let sourceBuffer;

    const errorMsgElement = document.querySelector('span#errorMsg');
    const recordedVideo = document.querySelector('video#recorded');
    const recordButton = document.querySelector('button#record');

    recordButton.addEventListener('click', () => {
      if (recordButton.className === 'off') {
        startRecording();
      } else {
        stopRecording();
        recordButton.textContent = 'Start recording';
        recordButton.className = 'off';
        playButton.disabled = false;
        downloadButton.disabled = false;
      }
    });

    const playButton = document.querySelector('button#play');
    playButton.addEventListener('click', () => {
      const superBuffer = new Blob(recordedBlobs, {type: 'video/webm'});
      recordedVideo.src = null;
      recordedVideo.srcObject = null;
      recordedVideo.src = window.URL.createObjectURL(superBuffer);
      recordedVideo.controls = true;
      recordedVideo.play();
    });

    const downloadButton = document.querySelector('button#download');
    downloadButton.addEventListener('click', () => {
      const blob = new Blob(recordedBlobs, {type: 'video/webm'});
      const url = window.URL.createObjectURL(blob);
      const a = document.createElement('a');
      a.style.display = 'none';
      a.href = url;
      a.download = 'test.webm';
      document.body.appendChild(a);
      a.click();
      setTimeout(() => {
        document.body.removeChild(a);
        window.URL.revokeObjectURL(url);
      }, 100);
    });

    function handleSourceOpen(event) {
      console.log('MediaSource opened');
      sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
      console.log('Source buffer: ', sourceBuffer);
    }

    function handleDataAvailable(event) {
      if (event.data && event.data.size > 0) {
        recordedBlobs.push(event.data);
      }
    }

    function startRecording() {
      recordedBlobs = [];
      let options = {mimeType: 'video/webm;codecs=vp9'};
      if (!MediaRecorder.isTypeSupported(options.mimeType)) {
        console.error(`${options.mimeType} is not Supported`);
        errorMsgElement.innerHTML = `${options.mimeType} is not Supported`;
        options = {mimeType: 'video/webm;codecs=vp8'};
        if (!MediaRecorder.isTypeSupported(options.mimeType)) {
          console.error(`${options.mimeType} is not Supported`);
          errorMsgElement.innerHTML = `${options.mimeType} is not Supported`;
          options = {mimeType: 'video/webm'};
          if (!MediaRecorder.isTypeSupported(options.mimeType)) {
            console.error(`${options.mimeType} is not Supported`);
            errorMsgElement.innerHTML = `${options.mimeType} is not Supported`;
            options = {mimeType: ''};
          }
        }
      }
      try {
        mediaRecorder = new MediaRecorder(window.stream, options);
      } catch (e) {
        console.error('Exception while creating MediaRecorder:', e);
        errorMsgElement.innerHTML = `Exception while creating MediaRecorder: ${JSON.stringify(e)}`;
        return;
      }
      console.log('Created MediaRecorder', mediaRecorder, 'with options', options);
      recordButton.textContent = 'Stop recording';
      recordButton.className = 'on';
      playButton.disabled = true;
      downloadButton.disabled = true;
      mediaRecorder.onstop = (event) => {
        console.log('Recorder stopped: ', event);
      };
      mediaRecorder.ondataavailable = handleDataAvailable;
      mediaRecorder.start(10); // collect 10ms of data
      console.log('MediaRecorder started', mediaRecorder);
    }

    function stopRecording() {
      mediaRecorder.stop();
      console.log('Recorded Blobs: ', recordedBlobs);
    }

    function handleSuccess(stream) {
      recordButton.disabled = false;
      console.log('getUserMedia() got stream:', stream);
      window.stream = stream;
      const gumVideo = document.querySelector('video#gum');
      gumVideo.srcObject = stream;
    }

    async function init(constraints) {
      try {
        const stream = await navigator.mediaDevices.getUserMedia(constraints);
        handleSuccess(stream);
      } catch (e) {
        console.error('navigator.getUserMedia error:', e);
        errorMsgElement.innerHTML = `navigator.getUserMedia error:${e.toString()}`;
      }
    }

    document.querySelector('button#start').addEventListener('click', async () => {
      const hasEchoCancellation = document.querySelector('#echoCancellation').checked;
      const constraints = {
        audio: {echoCancellation: {exact: hasEchoCancellation}},
        video: {width: 1280, height: 720}
      };
      console.log('Using media constraints:', constraints);
      await init(constraints);
    });
  </script>
  <style>
    .hidden { display: none; }
    .highlight { background-color: #eee; font-size: 1.2em; margin: 0 0 30px 0; padding: 0.2em 1.5em; }
    .warning { color: red; font-weight: 400; }
    @media screen and (min-width: 1000px) {
      /* hack! to detect non-touch devices */
      div#links a { line-height: 0.8em; }
    }
    audio { max-width: 100%; }
    body { font-family: 'Roboto', sans-serif; font-weight: 300; margin: 0; padding: 1em; word-break: break-word; }
    button { background-color: #d84a38; border: none; border-radius: 2px; color: white; font-family: 'Roboto', sans-serif; font-size: 0.8em; margin: 0 0 1em 0; padding: 0.5em 0.7em 0.6em 0.7em; }
    button:active { background-color: #2fcf5f; }
    button:hover { background-color: #cf402f; }
    button[disabled] { color: #ccc; }
    button[disabled]:hover { background-color: #d84a38; }
    canvas { background-color: #ccc; max-width: 100%; width: 100%; }
    code { font-family: 'Roboto', sans-serif; font-weight: 400; }
    div#container { margin: 0 auto 0 auto; max-width: 60em; padding: 1em 1.5em 1.3em 1.5em; }
    div#links { padding: 0.5em 0 0 0; }
    h1 { border-bottom: 1px solid #ccc; font-family: 'Roboto', sans-serif; font-weight: 500; margin: 0 0 0.8em 0; padding: 0 0 0.2em 0; }
    h2 { color: #444; font-weight: 500; }
    h3 { border-top: 1px solid #eee; color: #666; font-weight: 500; margin: 10px 0 10px 0; white-space: nowrap; }
    li { margin: 0 0 0.4em 0; }
    html { overflow-y: scroll; }
    img { border: none; max-width: 100%; }
    input[type=radio] { position: relative; top: -1px; }
    p#data { border-top: 1px dotted #666; font-family: Courier New, monospace; line-height: 1.3em; max-height: 1000px; overflow-y: auto; padding: 1em 0 0 0; }
    section p:last-of-type { margin: 0; }
    section { border-bottom: 1px solid #eee; margin: 0 0 30px 0; padding: 0 0 20px 0; }
    section:last-of-type { border-bottom: none; padding: 0 0 1em 0; }
    select { margin: 0 1em 1em 0; position: relative; top: -1px; }
    video { background: #222; margin: 0 0 20px 0; --width: 100%; width: var(--width); height: calc(var(--width) * 0.75); }
    @media screen and (max-width: 450px) {
      h1 { font-size: 20px; }
    }
    button { margin: 0 3px 10px 0; padding-left: 2px; padding-right: 2px; width: 99px; }
    button:last-of-type { margin: 0; }
    p.borderBelow { margin: 0 0 20px 0; padding: 0 0 20px 0; }
    video { vertical-align: top; --width: 25vw; width: var(--width); height: calc(var(--width) * 0.5625); }
    video:last-of-type { margin: 0 0 20px 0; }
    video#gumVideo { margin: 0 20px 20px 0; }
  </style>
</body>
</html>
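The three nested isTypeSupported checks in startRecording above can be flattened into a small helper that walks a preference list. A sketch (the probe function is injected so the logic can be exercised outside a browser; in the page you would pass MediaRecorder.isTypeSupported):

```javascript
// Return the first MIME type the recorder supports, or '' as a last resort
// (an empty mimeType lets MediaRecorder pick its own default).
// `isSupported` is MediaRecorder.isTypeSupported in the browser; it is
// passed in here so the fallback logic is testable without a browser.
function pickMimeType(isSupported, candidates = [
  'video/webm;codecs=vp9',
  'video/webm;codecs=vp8',
  'video/webm',
]) {
  for (const type of candidates) {
    if (isSupported(type)) {
      return type;
    }
  }
  return '';
}
```

In the page this collapses the nested ifs to a single line, e.g. `new MediaRecorder(window.stream, { mimeType: pickMimeType(t => MediaRecorder.isTypeSupported(t)) })`.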
Note: getUserMedia() is only available in secure contexts (HTTPS, or localhost during development); otherwise navigator.mediaDevices is undefined and an error is reported:
TypeError: Cannot read property 'getUserMedia' of undefined
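A defensive guard avoids that opaque TypeError: check for a secure context and for navigator.mediaDevices before calling. The check below is written against an injected window-like object so it can run outside a browser; in the page you would simply pass window:

```javascript
// Returns true when getUserMedia() can be called safely.
// `win` stands in for the global window object (injected for testability).
function canCaptureMedia(win) {
  // Secure context means HTTPS, or localhost during development.
  if (!win.isSecureContext) return false;
  const nav = win.navigator;
  return !!(nav && nav.mediaDevices &&
            typeof nav.mediaDevices.getUserMedia === 'function');
}
```

A page could then show a friendly message, e.g. `if (!canCaptureMedia(window)) errorMsgElement.innerHTML = 'Camera capture requires HTTPS or localhost';` (errorMsgElement as in the sample above).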
We can separate and simplify the above code into an HTML file and a JS file. The capture-and-render code of index.html is as follows:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>study webrtc</title>
</head>
<body>
  <video autoplay></video>
  <script src="main.js"></script>
</body>
</html>
The main.js capture code is as follows:
// Check whether the browser exposes a getUserMedia implementation;
// different browsers historically used different vendor-prefixed names.
function hasUserMedia() {
  return !!(navigator.getUserMedia || navigator.webkitGetUserMedia ||
            navigator.mozGetUserMedia || navigator.msGetUserMedia);
}

if (hasUserMedia()) {
  navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia ||
                           navigator.mozGetUserMedia || navigator.msGetUserMedia;
  navigator.getUserMedia({
    video: true,  // enable video
    audio: false  // keep audio off for now: playing it back locally causes echo
  }, function(stream) {
    // Hand the captured media stream to the <video> element
    var video = document.querySelector("video");
    try {
      video.srcObject = stream;
    } catch (error) {
      // Fallback for older browsers without srcObject
      video.src = window.URL.createObjectURL(stream);
    }
  }, function(err) {
    console.log("capturing", err);
  });
} else {
  alert("Browser does not support getUserMedia");
}
With the code above we can capture the local camera and play back the locally captured video and audio.
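The prefixed navigator.getUserMedia calls above are deprecated; modern browsers expose the promise-based navigator.mediaDevices.getUserMedia instead. A minimal modern equivalent of main.js, as a sketch (the mediaDevices object and video element are injected so the flow can be exercised with stubs; in the page you would call startCapture(navigator.mediaDevices, document.querySelector('video'))):

```javascript
// Modern, promise-based capture: replaces the prefixed callbacks in main.js.
// `mediaDevices` is navigator.mediaDevices and `videoEl` the target <video>
// element in the browser; both are injected here for testability.
async function startCapture(mediaDevices, videoEl) {
  const stream = await mediaDevices.getUserMedia({
    video: true,  // enable video
    audio: false  // keep audio off for now to avoid local echo
  });
  videoEl.srcObject = stream; // srcObject replaces the old createObjectURL path
  return stream;
}
```

Errors (permission denied, no camera) surface as a rejected promise, so a page would call it as `startCapture(...).catch(err => console.error('capture failed:', err));`.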
For the source code, cooperation, or technical exchange, use the contact details below:
QQ communication group: 961179337
Wechat account: lixiang6153
Official account: IT technology fast food
Email: lixx2048@163.com