The media player (MP) reads the file from disk (or removable media, a network path, etc.). At minimum it reads the header details (the first part of the file, which holds things like user metadata and the encoding format), and it usually reads the media payload in chunks. Streaming is really only different here in that the chunks are smaller and arrive more frequently. If the file can’t be read, or the MP can’t determine the format/encoding, it will error out.
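A rough sketch of that read pattern (the header/chunk sizes and the sample bytes here are made up for illustration):

```python
import io

def read_media(stream, header_size=12, chunk_size=4096):
    """Read the header first, then stream the payload in fixed-size chunks."""
    header = stream.read(header_size)           # enough for a magic number / format details
    if len(header) < header_size:
        raise ValueError("file too short to identify")  # the "error out" case
    chunks = []
    while True:
        chunk = stream.read(chunk_size)         # streaming just uses smaller chunks, more often
        if not chunk:
            break
        chunks.append(chunk)
    return header, b"".join(chunks)

# Fake "file": an ID3-tagged MP3-ish blob
header, payload = read_media(io.BytesIO(b"ID3\x03\x00rest-of-header" + b"\xff" * 100))
```

A real player would hand each chunk to the decoder as it arrives rather than collecting them all, but the read pattern is the same.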
The MP determines the file’s encoding and loads the relevant software library to decode it (decompress the data and convert it to pixels/audio samples). E.g. if it detects the file is MP3 it will load the MP3 decoder. The file extension might be used as a hint to pick the right decoder, but any decent media player will also look at the file header to check it. At this point, if the decoding library doesn’t understand the data it will throw an error, which will probably be escalated to the MP to handle.
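Checking the header means looking at the "magic bytes" at the start of the file. A minimal sketch (these particular signatures are real ones, but an actual player recognises far more formats and parses the container properly):

```python
def detect_format(header: bytes) -> str:
    """Identify the real format from magic bytes, ignoring the file extension."""
    if header.startswith(b"ID3") or header[:2] in (b"\xff\xfb", b"\xff\xf3", b"\xff\xf2"):
        return "mp3"   # ID3 tag, or a raw MP3 frame sync
    if header.startswith(b"RIFF") and header[8:12] == b"WAVE":
        return "wav"
    if header.startswith(b"\x89PNG"):
        return "png"
    if header.startswith(b"GIF8"):
        return "gif"   # covers GIF87a and GIF89a
    raise ValueError("unknown format")   # escalated to the player to handle

print(detect_format(b"ID3\x04\x00..."))   # -> mp3, even if the file is named .wav
```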
Depending on the type of file detected, the MP will show the appropriate controls. Images and video will be displayed on a canvas control. Video and audio (and possibly animated images, e.g. GIFs) will show player controls. Anything with audio will show/enable a volume control, etc.
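That logic is essentially a type-to-controls mapping; here is a toy version (the control names are hypothetical, not any real toolkit's API):

```python
def controls_for(media_type: str) -> set:
    """Pick which UI controls to enable for a detected media type."""
    ui = set()
    if media_type in ("image", "video", "gif"):
        ui.add("canvas")            # anything visual gets drawn on a canvas
    if media_type in ("video", "audio", "gif"):
        ui.add("player_controls")   # play/pause/seek for anything timed
    if media_type in ("video", "audio"):
        ui.add("volume")            # anything with audio gets a volume control
    return ui
```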
The decoding library decodes the data as it’s played; what you end up with is raw image/video or audio data. This gets routed through the operating system and hardware drivers to the relevant hardware components (GPU for graphics, sound card/audio processor for audio) and ultimately to the output device (monitor, speakers).
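The decode-and-play loop can be sketched like this (every name here is hypothetical; real players use callbacks from the OS audio/video subsystems rather than a plain loop):

```python
def play(chunks, decode, submit_video, submit_audio):
    """Decode compressed chunks and route raw frames/samples to output."""
    for chunk in chunks:
        frame = decode(chunk)              # raw pixels or PCM audio samples
        if frame["kind"] == "video":
            submit_video(frame["data"])    # -> OS -> driver -> GPU -> monitor
        else:
            submit_audio(frame["data"])    # -> OS -> driver -> sound card -> speakers
```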
The above is for software decoding. With hardware acceleration, the encoded data is instead sent straight to the relevant hardware component (via APIs exposed by the operating system/drivers), which decodes and outputs it directly.
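In practice a player typically asks whether a hardware decoder exists for the format and falls back to software if not; a tiny sketch of that choice (hypothetical names, not a real API):

```python
def pick_decoder(fmt, hw_decoders, sw_decoders):
    """Prefer a hardware decoder for this format, fall back to software."""
    if fmt in hw_decoders:
        return hw_decoders[fmt]    # compressed data goes straight to the chip
    return sw_decoders[fmt]        # CPU decodes, raw output is sent to hardware
```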