Got a question about technology? Ask it here. Discussion of hardware, software, TiVos, multi-region DVDs, Windows, Macs, Linux, hand-helds, iPods, anything tech related. Better than any helpdesk!
Can anyone think of a reason why QuickTime 10.1 would be telling me a video has a bitrate of 4.3Mbps (double what it should be) while QuickTime 10.0 tells me 2.3Mbps (what I'd encoded it at)???
I need to give a client an answer more technical than "Quicktime 10 is fucked up, I don't even."
I am living this xkcd right now: [link]
Can anyone think of a reason why QuickTime 10.1 would be telling me a video has a bitrate of 4.3Mbps (double what it should be) while QuickTime 10.0 tells me 2.3Mbps (what I'd encoded it at)???
One's a mono-rate and the other stereo?
One's a mono-rate and the other stereo?
That would only bump it up by the audio bitrate, which is 128kbps.
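To put numbers on that: even if one version were counting the stereo audio track and the other weren't, the gap would be tiny compared to the doubling described. A quick sanity check using the figures from the thread (2.3 Mbps video, 128 kbps audio):

```python
# Sanity check: could a mono-vs-stereo (audio counted vs. not) difference
# explain 2.3 Mbps vs. 4.3 Mbps? Figures taken from the thread.
video_only = 2.3        # Mbps, the rate QuickTime 10.0 reports
audio = 128 / 1000      # 128 kbps audio track, converted to Mbps

with_audio = video_only + audio
print(round(with_audio, 3))  # 2.428 -- nowhere near the 4.3 Mbps that 10.1 reports
```

So at most the audio track explains a ~5% difference, not a factor of two.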
I am living this xkcd right now: [link]
I have SO been there, Jessica.
What is the size of the file in megabytes? What length of time is it?
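The point of asking for size and length is that you can compute the true average bitrate yourself and see which QuickTime agrees with it. A minimal sketch of that arithmetic (the 18.21 MB / 60 s figures below are hypothetical, chosen to match a ~2.43 Mbps total stream):

```python
def overall_bitrate_mbps(file_size_mb: float, duration_s: float) -> float:
    """Average bitrate implied by file size and duration.

    Uses decimal units (1 MB = 8,000,000 bits). A tool mixing binary
    mebibytes (2**20 bytes) with decimal megabits would report a slightly
    different number, but never a 2x difference.
    """
    bits = file_size_mb * 8_000_000
    return bits / duration_s / 1_000_000  # bits per second -> Mbps

# Hypothetical example: a 60-second clip totalling ~2.43 Mbps
# (2.3 Mbps video + 128 kbps audio) works out to about 18.2 MB.
print(round(overall_bitrate_mbps(18.21, 60), 2))  # ~2.43
```

Whichever QuickTime version's figure lands near the hand-computed number is the one reporting the real stream rate.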
I am living this xkcd right now: [link]
Yeah, I've been there a few times....
Me too. I swear this fucking thing happens to me at least once a month.
Usually I find I have to change up my search terms to get the solution I need.