I'm working on a script to add a working RSTM header, but I'm having trouble calculating the sample count. The variables I have are:
- frequency
- file size
- channels
- data rate (kbps)
How do I calculate the sample count from these? I thought it was just (file size * freq) / data rate, but apparently I'm wrong. :-\
For example: 44,100 Hz stereo, 100 kbps, 2,665,472 bytes. How many samples does this file have?
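For reference, here's a minimal sketch of what my script currently computes (the variable names are just mine, and the actual RSTM header writing is left out), using the example numbers from above:

```python
freq = 44100         # sample rate in Hz
file_size = 2665472  # total data size in bytes
channels = 2         # stereo
data_rate = 100      # kbps

# My current (apparently wrong) formula:
samples = file_size * freq / data_rate
print(samples)  # -> 1175473152.0, which at 44,100 Hz would be
                # ~7.4 hours of audio, so this can't be right
```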
Sorry for this n00by question...
Thanks in advance for your answers!