What's the difference between html5 live video and ordinary video?

I know that for an ordinary video you can just put the video's address in the src of the <video> tag. I have never worked with live video: when a browser or a mobile phone plays a live stream, does the <video> element receive a data stream from the backend? How is that done, what do you write in src, and what kind of live data does the backend transmit?

May.10,2022

I guess by "ordinary video" you mean directly playing a video source file, such as an MP4, FLV, or AVI file.

The non-HTML5 era:

1. Use the <embed> tag, which requires Flash support. For example, the PC version of Tencent (TX) Video used it.

<embed src="movie.swf" height="200" width="200"/>

2. Use the <object> tag.

<object data="movie.swf" height="200" width="200"/>

Neither of the above can bypass Flash: both require a Flash decoder, and Flash players have since been phased out, especially on mobile phones.

The HTML5 era:

3. Use the <video> tag. <video> is a new tag introduced in HTML5.

<video width="320" height="240" controls="controls">
  <source src="movie.mp4" type="video/mp4" />
  <source src="movie.ogg" type="video/ogg" />
  <source src="movie.webm" type="video/webm" />
  Your browser does not support the video tag.
</video>
When using the <video> tag, you must convert the video into several different formats to cover all browsers. The <video> element does not work in older browsers, and it cannot be validated as HTML 4 or XHTML.

To work around browser compatibility issues, there are libraries such as video.js, flv.js, and Chimee.

All of these can directly play videos in formats such as MP4, FLV, and SWF.
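
For example, a minimal video.js setup might look like the sketch below. The CDN links and version number are assumptions, and movie.mp4 is just a placeholder file:

<link href="https://vjs.zencdn.net/8.10.0/video-js.css" rel="stylesheet" />
<script src="https://vjs.zencdn.net/8.10.0/video.min.js"></script>

<!-- data-setup="{}" tells video.js to initialize this player automatically -->
<video id="my-player" class="video-js" width="320" height="240" controls preload="auto" data-setup="{}">
  <source src="movie.mp4" type="video/mp4" />
  <p class="vjs-no-js">Please enable JavaScript to watch this video.</p>
</video>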

Push streaming:

All of the above can play the corresponding video files directly, but the file has to be downloaded before it can be played, so real-time, low-latency playback is not possible. On top of this, the HLS and RTMP protocols were developed, and they are the mainstream push-streaming solutions at present.

The approximate process is as follows:

The video source (MP4, FLV, AVI, etc.) is cut into small segments, and an index of those segments is written into an .m3u8 playlist file. When the user plays or fast-forwards, the player calculates the position and tells the server which segments to slice and push. This way the more personalized playback needs of users can be met with minimal bandwidth and shorter transmission times.

Push-streaming solutions mainly revolve around the push server. If you don't have the resources to build one, you can use the live push services provided by cloud vendors such as Qiniu, Tencent Video, and so on.
All the browser needs to do is decode and play.
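
As a rough sketch of that browser side, hls.js can be used roughly like this; the CDN link and the .m3u8 stream address are placeholders, not real endpoints:

<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<video id="live-video" controls></video>
<script>
  var video = document.getElementById('live-video');
  var src = 'https://example.com/live/stream.m3u8'; // placeholder live stream URL
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari / iOS support HLS natively, so the m3u8 URL can go straight into src
    video.src = src;
  } else if (Hls.isSupported()) {
    // Other browsers: hls.js fetches the playlist and segments and feeds them
    // to the <video> element through Media Source Extensions
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  }
</script>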


It is usually HLS. You can find plenty of material about it online. In terms of libraries, our videoconferencing system uses video.js.

The general process is that a device captures the video source data and stores it on the server, the server slices it into segments, and the player loads the corresponding .m3u8 index file.
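
A minimal sketch of that player side, assuming a recent video.js build (version 7+ bundles HLS playback); the CDN links and stream URL are placeholders:

<link href="https://vjs.zencdn.net/8.10.0/video-js.css" rel="stylesheet" />
<script src="https://vjs.zencdn.net/8.10.0/video.min.js"></script>

<video id="conf-player" class="video-js" width="320" height="240" controls data-setup="{}">
  <!-- video.js 7+ can play HLS playlists in most browsers out of the box -->
  <source src="https://example.com/live/stream.m3u8" type="application/x-mpegURL" />
</video>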


Although an ordinary video just writes a direct link into src, a live video stream needs an .m3u8 playlist.
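
To make the difference concrete (the live URL below is a placeholder; without a JS library the .m3u8 form only works in browsers with native HLS support, such as Safari):

<!-- Ordinary video: src points at a complete media file -->
<video src="movie.mp4" controls></video>

<!-- Live video: src points at an .m3u8 playlist that indexes the stream segments -->
<video src="https://example.com/live/stream.m3u8" controls></video>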


Simply put, the native <video> tag can play media files directly, but it cannot play live streams directly, so you need a JS plugin such as video.js, flv.js, or hls.js to do that.
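
For example, a minimal flv.js sketch for an HTTP-FLV live stream might look like this; the CDN link and stream address are placeholders:

<script src="https://cdn.jsdelivr.net/npm/flv.js@latest"></script>
<video id="flv-video" controls></video>
<script>
  // flv.js relies on Media Source Extensions, so check support first
  if (flvjs.isSupported()) {
    var flvPlayer = flvjs.createPlayer({
      type: 'flv',
      isLive: true,                                // treat the source as a live stream
      url: 'https://example.com/live/stream.flv'   // placeholder HTTP-FLV URL
    });
    flvPlayer.attachMediaElement(document.getElementById('flv-video'));
    flvPlayer.load();
    flvPlayer.play();
  }
</script>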