We are using JavaScript to call the API via speakSsmlAsync on the SpeechSynthesizer, and we are expecting MP3 files. When we try to play the resulting files, most software (QuickTime, for example) won't play them.
We are setting the audioConfig like this:
const audioConfig = AudioConfig.fromAudioFileOutput(filename);
where filename is something like my-file.mp3
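For context, this is roughly what our setup looks like (a stripped-down sketch; the subscription key, region, voice, and SSML below are placeholders):

const { SpeechConfig, AudioConfig, SpeechSynthesizer } = require("microsoft-cognitiveservices-speech-sdk");

// Placeholder credentials and voice, substitute your own.
const speechConfig = SpeechConfig.fromSubscription("<subscription-key>", "<region>");
const audioConfig = AudioConfig.fromAudioFileOutput("my-file.mp3");
const synthesizer = new SpeechSynthesizer(speechConfig, audioConfig);

const ssml = `<speak version="1.0" xml:lang="en-US">
  <voice name="en-US-JennyNeural">Hello world.</voice>
</speak>`;

synthesizer.speakSsmlAsync(
  ssml,
  () => synthesizer.close(),   // success: the output file has been written
  (error) => {                 // failure: log the error and clean up
    console.error(error);
    synthesizer.close();
  }
);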
When I try to open the file in HandBrake, I get errors like this:
Input #0, wav, from 'my-file.mp3':
Duration: 00:00:02.31, bitrate: 256 kb/s
Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 16000 Hz, 1 channels, s16, 256 kb/s
[16:24:03] hb_stream_open: open my-file.mp3 failed
[16:24:03] scan: unrecognized file type
[16:24:03] libhb: scan thread found 0 valid title(s)
[16:24:03] macgui: ScanCore scan done
This makes me think the file is not actually being encoded as MP3: the header reports a WAV container with 16 kHz, 16-bit mono PCM audio. If I change the extension to .wav, the file will play (although it is still reported as invalid).
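For what it's worth, a quick way to check this suspicion in Node.js (file name as above) is to look at the first four bytes of the output:

// MP3 files start with "ID3" (or a 0xFF frame-sync byte); WAV files start with "RIFF".
const fs = require("fs");
const header = fs.readFileSync("my-file.mp3").subarray(0, 4).toString("ascii");
console.log(header); // prints "RIFF" if the file is really a WAV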
So
- what are we doing wrong?
- is there a way to specify the output format / sample rate explicitly, for example by creating our own AudioConfig? We couldn't figure out how to do that (see the rough sketch after this list for the kind of API we were hoping for).
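For reference, this is the kind of thing we were expecting to be able to do; the property and enum names below are our guesses from the docs, and we couldn't confirm whether this is the right approach:

const { SpeechConfig, SpeechSynthesisOutputFormat } = require("microsoft-cognitiveservices-speech-sdk");

// Guessed property / enum names: is this the intended way to request MP3 output?
const speechConfig = SpeechConfig.fromSubscription("<subscription-key>", "<region>");
speechConfig.speechSynthesisOutputFormat = SpeechSynthesisOutputFormat.Audio16Khz32KBitRateMonoMp3;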
Thanks!