The 4a audio framework does not handle the audio data itself; applications still need other APIs, such as PulseAudio or ALSA, for actual playback. An application compatible with 4a follows these steps to start playback:
- Call the "open" action of the 4a API.
- Receive a device URI in the response.
- Pass the device URI to any compatible multimedia framework to play the audio content.
Using the mediaplayer service as an example, the relevant code is in afm-mediaplayer-binding.c, specifically in mediaplayer_set_role_state(). When the state is GST_STATE_PLAYING, it follows the steps above to obtain an ALSA device URI, then uses GStreamer and passes the received URI to the ALSA sink.
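A rough command-line equivalent of what the binding does is the pipeline below; the device URI is a placeholder, since in practice it comes from the 4a "open" response:

```shell
# Placeholder device URI; the real one is returned by 4a's "open" action.
gst-launch-1.0 audiotestsrc ! audioconvert ! alsasink device=hw:2,0
```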
In Chromium, audio-related code lives under //media/audio, which includes implementations for both ALSA and PulseAudio. If we follow what the mediaplayer service does, we should customize the ALSA code to interact with 4a and obtain the device URIs.
There is also a 4a module for PulseAudio. It acts as a bridge between PulseAudio and 4a: when new PulseAudio sinks are created, it reports them to 4a, obtains the ALSA device URI, and then creates an ALSA PulseAudio sink properly configured with the 4a data. We understand that this is a fallback option for non-core AGL software and should not be considered for Chromium/WAM. Still, if we wanted to try it, we would need to compile Chromium with `use_pulseaudio=true` (currently disabled).
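For reference, enabling PulseAudio in a Chromium GN build would look roughly like this (the output directory name is arbitrary):

```shell
# In a Chromium checkout: enable the PulseAudio audio backend.
gn gen out/agl --args='use_pulseaudio=true'
ninja -C out/agl chrome
```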