Exploring HTML 5’s Audio/Video Multimedia Support

Many years ago, when I was first getting started as a programmer, much of the work that I did was focused on the development of multimedia applications (programs that combined video, audio, animation, and text) to build presentations and computer games. I did most of this work using Macromedia Director in the early 1990s. The idea of developing even audio, let alone video, apps on the web was pretty much a pipe dream until the debut of RealNetworks, which provided the first major streaming technology that enabled developers to send buffered media content over the Internet. Later, RealNetworks allowed embedding media content within web pages.

The idea of specific video and audio tags within HTML would have been technically impossible in HTML 3 and even somewhat infeasible in HTML 4. Because HTML 4.0 was essentially a “frozen” version, the specific mechanism for displaying content has been very much format dependent (e.g., Apple QuickTime movies and Flash video), usually relying upon tags with varying parameters for passing the relevant information to the server. As a result, video and audio embedding on web pages has become something of a black art.

It’s perhaps not surprising, then, that the <audio> and <video> tags were among the first features to be added to the HTML 5 specification, and they seem to be the first elements of the HTML 5 specification that browser vendors implemented. These elements are intended to enable the browser to work with both types of media in an easy-to-use manner, and an included support API gives developers finer-grained control.

HTML 5 <audio> and <video> Elements

The <audio> tag is the simpler of the two, with the attributes listed in Table 1; a short usage example follows the table.

Table 1. <audio> Element Attributes

Attribute Name | Value Forms | Description
src | xs:anyURI | This attribute gives the URL of the media source.
autobuffer | autobuffer | This binary attribute, when present, indicates whether the user-agent (the browser) should automatically buffer the content or have the user buffer the content through the associated API.
autoplay | autoplay | This binary attribute, when present, indicates whether the user-agent (the browser) should begin playing the audio automatically when the page finishes loading.
loop | loop | This binary attribute, when present, indicates whether the user-agent (the browser) should automatically loop the content when it reaches the end of the media.
controls | controls | This binary attribute, when present, indicates whether the user-agent (the browser) should display controls allowing for user interaction with the resource in question.
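
For instance, a minimal <audio> element using several of these attributes might look like the following sketch (the file name here is purely illustrative):

  <audio src="theme-music.ogg" autobuffer controls>
    <!-- Content inside the element is shown by browsers that don't recognize <audio> -->
    Your browser does not support the audio element.
  </audio>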

Note that the <audio> element (and the <video> element) also supports the general attributes that all HTML-based elements support (id, style, class, and so forth), even when the interface is otherwise invisible.

The <video> element incorporates all the attributes that the <audio> element does, plus three others (see Table 2). An example combining several of them follows the table.

Table 2. <video> Element Attributes

Attribute Name | Value Forms | Description
src | xs:anyURI | This attribute gives the URL of the media source.
autobuffer | autobuffer | This binary attribute, when present, indicates whether the user-agent (the browser) should automatically buffer the content or have the user buffer the content through the associated API.
autoplay | autoplay | This binary attribute, when present, indicates whether the user-agent (the browser) should begin playing the video automatically when the page finishes loading.
loop | loop | This binary attribute, when present, indicates whether the user-agent (the browser) should automatically loop the content when it reaches the end of the media.
controls | controls | This binary attribute, when present, indicates whether the user-agent (the browser) should display controls allowing for user interaction with the resource in question.
width | dimension: ###(cm|em|en|in|px|pt|%) | This attribute provides the width, in the appropriate units, of the video image. If height is unspecified, the height will be scaled in proportion to this width, based on the video's initial dimensions.
height | dimension: ###(cm|em|en|in|px|pt|%) | This attribute provides the height, in the appropriate units, of the video image. If width is unspecified, the width will be scaled in proportion to this height, based on the video's initial dimensions.
poster | xs:anyURI | This attribute is a link to an image to display if the video either cannot be rendered or has not yet been buffered; the poster will be displayed in its place with the given proportions.
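
As a sketch, a <video> element combining these attributes might look like this (the file names are illustrative, and the dimensions here are plain pixel values):

  <video src="overview.ogv" width="480" height="360"
         poster="overview-poster.png" controls>
    Your browser does not support the video element.
  </video>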

The poster is effectively a placeholder image used while the video is buffering. It’s not always necessary: a number of video codecs automatically extract a particular frame from the video (either the first frame or a frame from somewhere in the middle) and front-load it as the poster before the video has downloaded. Not all codecs are capable of this, however.

Using a poster in this particular case is also useful for creating an alternate “brand” for the video while it’s loading, rather than relying upon the host to supply it. Once the video actually starts to play, the poster image is irrelevant, even when the video is paused: either the last frame displayed before pausing will be shown, or, if the video plays all the way through, the last frame of the video will be shown.

Ordinarily, video and audio formats use whatever codec the given media source is compressed and formatted in, but sometimes the browser agent does not support that particular codec. For this case, as an alternative to using the @src attribute, HTML 5 also defines a secondary <source> element, which defines both the location of a given resource and the codec type that the resource exposes. Table 3 lists the <source> element's attributes.

Table 3. <source> Element Attributes

Attribute Name | Value Forms | Description
src | xs:anyURI | This attribute gives the URL of the media source.
type | The MIME type of the resource, as a string | This attribute contains the MIME type of the media resource, usually in the form video/format.
media | A media codec string | This string contains the codec information applicable to the particular resource.

When multiple <source> elements are present, if the web browser can’t play the first codec (because it’s not supported), it will go down the list until it finds one it can play.

Thus, the following bit of HTML 5 code shows a video element with three different codecs:
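
(The markup below is a representative sketch; the file names and codec strings are illustrative rather than taken from the original listing.)

  <video controls>
    <!-- MPEG-4 file using the baseline profile video codec -->
    <source src="movie.mp4" type="video/mp4" media="avc1.42E01E, mp4a.40.2">
    <!-- MPEG-4 file using the simple profile video codec -->
    <source src="movie-simple.mp4" type="video/mp4" media="mp4v.20.8, mp4a.40.2">
    <!-- Ogg file using the Theora video and Vorbis audio codecs -->
    <source src="movie.ogv" type="video/ogg" media="theora, vorbis">
  </video>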

The first <source> element describes an mp4 resource (the most typical of the MPEG video types, though these often have just a straight “mpg” extension), which uses the baseline profile video codec. The second is a video encoded with the simple profile video codec, while the third is encoded with the Ogg (Theora/Vorbis) codec. Most typically, the @type attribute will contain the associated mime-type; if it doesn’t, the @media attribute can be given with this information. Note that you have an either/or situation with regard to the @src attribute: it will be used preferentially even if <source> elements exist, but you must have one or the other in your media elements.

Theoretically, the <video> and <audio> elements should be able to handle most of the codecs currently in use. In practice, however, the browsers that currently support these elements do so only for the open source Ogg Vorbis and Theora standards. The names may not be familiar to you, though fans of Terry Pratchett’s Discworld series may recognize the Ogg Vorbis name from the Exquisitor Vorbis character in “Small Gods” and (perhaps) the Nanny Ogg character featured in many of his books. Theora (Jones), on the other hand, was the name of the controller of the Max Headroom character in the series of the same name.

The Ogg Vorbis standard is both open source and high fidelity compared with the better-known MPEG formats. As such, Ogg Vorbis is a popular format for storing audio tracks for games and online applications. The HTML 5 specification does not give any preference to Ogg Vorbis/Theora over other formats, but at this point it is the only format Firefox supports. The Chrome and Safari teams have both announced intentions to support the two standards in addition to others.

Audio, Video, and the DOM

Both the <audio> and <video> elements use the same DOM interface, based on the abstract HTMLMediaElement (i.e., there is no formal <media> element). You can then use this interface to control the various video and audio streams within the page. Listing 1 shows the IDL for this interface.

From the IDL, the API for the media elements breaks cleanly into the following tasks (a brief sketch follows the list):

  • Controlling the network retrieval of resources
  • Controlling the buffering
  • Controlling the playback
  • Setting the attributes of the various controls.
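
As a rough sketch, these four areas map onto the interface roughly as follows (the element id and file name below are assumptions for illustration, not taken from Listing 1):

  // Assume a media element such as <video id="myMedia" controls></video>
  var myMedia = document.getElementById("myMedia");

  // Network retrieval: point the element at a resource and (re)load it
  myMedia.src = "clip.ogv";
  myMedia.load();

  // Buffering: inspect how much of the resource has been buffered so far
  if (myMedia.buffered.length > 0) {
    var bufferedEnd = myMedia.buffered.end(myMedia.buffered.length - 1);
  }

  // Playback: start, pause, and reposition the media
  myMedia.play();
  myMedia.pause();
  myMedia.currentTime = 0;

  // Control-related attributes: volume, muting, visible controls
  myMedia.volume = 0.5;
  myMedia.muted = false;
  myMedia.controls = true;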

The src property is for setting the @src attribute on the media, but changing src by itself doesn’t automatically change the current video. Instead, after you change src, you then need to call the load() function to load the element with the new media, and then invoke play() when the video has loaded and finished buffering. In a purely synchronous world, this particular task would be relatively easy. If you’re pulling media off a local file with a locally running web page, this code will in fact work:

myMedia.src = "http://www.mymedia.com/mediasrc.ogg";
myMedia.load();
myMedia.play();

However, media by its very nature is a time-spanning process. You’re working both with the need to load and cache audio and video files and with the network connection itself, which can prove to be a challenge that’s usually not a factor for most non-temporal resources. For these reasons, if you want to take control of the process in any way when dealing with video and audio on the web, you have to work asynchronously, tying into the various events that are passed back to the client from the media-serving website. Table 4 (taken directly from the HTML 5 documentation) provides a full listing of the various events and when they are dispatched.

Table 4. Media User Interface Events
Event Name | Interface | Dispatched When… | Preconditions
loadstart | Event | The user agent begins looking for media data, as part of the resource selection algorithm. | networkState equals NETWORK_LOADING
progress | Event | The user agent is fetching media data. | networkState equals NETWORK_LOADING
suspend | Event | The user agent is intentionally not currently fetching media data, but does not have the entire media resource downloaded. | networkState equals NETWORK_IDLE
abort | Event | The user agent stops fetching the media data before it is completely downloaded, but not due to an error. | error is an object with the code MEDIA_ERR_ABORTED. networkState equals either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
error | Event | An error occurs while fetching the media data. | error is an object with the code MEDIA_ERR_NETWORK or higher. networkState equals either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
emptied | Event | A media element whose networkState was previously not in the NETWORK_EMPTY state has just switched to that state (either because of a fatal error during load that’s about to be reported, or because the load() method was invoked while the resource selection algorithm was already running, in which case it is fired synchronously during the load() method call). | networkState is NETWORK_EMPTY; all the IDL attributes are in their initial states.
stalled | Event | The user agent is trying to fetch media data, but data is unexpectedly not forthcoming. | networkState is NETWORK_LOADING.
play | Event | Playback has begun. Fired after the play() method has returned. | paused is newly false.
pause | Event | Playback has been paused. Fired after the pause() method has returned. | paused is newly true.
loadedmetadata | Event | The user agent has just determined the duration and dimensions of the media resource. | readyState is newly equal to HAVE_METADATA or greater for the first time.
loadeddata | Event | The user agent can render the media data at the current playback position for the first time. | readyState newly increased to HAVE_CURRENT_DATA or greater for the first time.
waiting | Event | Playback has stopped because the next frame is not available, but the user agent expects that frame to become available in due course. | readyState is newly equal to or less than HAVE_CURRENT_DATA, and paused is false. Either seeking is true, or the current playback position is not contained in any of the ranges in buffered. It is possible for playback to stop for two other reasons without paused being false, but those two reasons do not fire this event: maybe playback ended, or playback stopped due to errors.
playing | Event | Playback has started. | readyState is newly equal to or greater than HAVE_FUTURE_DATA, paused is false, seeking is false, or the current playback position is contained in one of the ranges in buffered.
canplay | Event | The user agent can resume playback of the media data, but estimates that if playback were to be started now, the media resource could not be rendered at the current playback rate up to its end without having to stop for further buffering of content. | readyState newly increased to HAVE_FUTURE_DATA or greater.
canplaythrough | Event | The user agent estimates that if playback were to be started now, the media resource could be rendered at the current playback rate all the way to its end without having to stop for further buffering. | readyState is newly equal to HAVE_ENOUGH_DATA.
seeking | Event | The seeking IDL attribute changed to true and the seek operation is taking long enough that the user agent has time to fire the event. | (none)
seeked | Event | The seeking IDL attribute changed to false. | (none)
timeupdate | Event | The current playback position changed as part of normal playback or in an especially interesting way, for example discontinuously. | (none)
ended | Event | Playback has stopped because the end of the media resource was reached. | currentTime equals the end of the media resource; ended is true.
ratechange | Event | Either the defaultPlaybackRate or the playbackRate attribute has just been updated. | (none)
durationchange | Event | The duration attribute has just been updated. | (none)
volumechange | Event | Either the volume attribute or the muted attribute has changed. Fired after the relevant attribute’s setter has returned. | (none)

Of all the events in Table 4, perhaps the most useful are the canplay and canplaythrough events. The first, canplay, fires when enough data has been loaded for the video player to start rendering content, even if not all of the data has been loaded. The canplaythrough event, on the other hand, fires when the data has essentially finished loading into the browser’s buffer, making it possible to play the video all the way through without needing to pause and retrieve more content.

Listing 2 provides an example of this: a person enters the URL to an OGV movie and then presses the “Load” button.

When the video reaches a point where it has buffered enough to play, the page handles the canplay event (through its oncanplay handler), which enables the Play button. In turn, when the user presses the Play button, the newly downloaded video actually plays.
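
A minimal sketch of that flow might look something like the following (the element ids and markup structure here are assumptions, not taken from Listing 2):

  <video id="viewer" width="480" height="360" controls></video>
  <p>
    <input type="text" id="mediaUrl" size="60"
           value="http://www.example.com/movie.ogv">
    <button id="loadButton">Load</button>
    <button id="playButton" disabled>Play</button>
  </p>
  <script type="text/javascript">
    var viewer     = document.getElementById("viewer");
    var playButton = document.getElementById("playButton");

    // Pressing Load points the video element at the entered URL and loads it.
    document.getElementById("loadButton").onclick = function () {
      viewer.src = document.getElementById("mediaUrl").value;
      viewer.load();
    };

    // Once enough data has buffered to begin playback, enable the Play button.
    viewer.oncanplay = function () {
      playButton.disabled = false;
    };

    // Pressing Play starts the newly loaded video.
    playButton.onclick = function () {
      viewer.play();
    };
  </script>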

At least, that’s the theory. In practice, the <video> and <audio> tags are not fully implemented even in Firefox, so many of the events are not yet being passed back up to the JavaScript level. This means that, as crude as it is, calling load() and then play() still seems to be the best way of playing externally loaded video resources. This area is very likely to change as the implementations (and the XHTML standards) become more fully fleshed out.

Browser Vendors Very Interested

By all indications, the browser vendors view the multimedia aspects as perhaps the most crucial part of the developing HTML 5 standard. Given the complexity of both types of media and the prospect of being able to better promote video and audio usage within web browsers, that’s hardly surprising. However, before HTML 5 multimedia becomes ubiquitous, the state of the art for these browser implementations still has a ways to go technically.
