One of my great weaknesses is technology, especially anything involving movies. While I wouldn't go as far as calling myself a cinephile (e.g., I don't know every random fact about my favorite movies), I do love watching films, which means I like trying to make my viewing experience at home rather nice (which honestly has not been that great, thanks to my long-lasting student budget).
One of the latest debates about the quality of cinema is over the frame rate of films. Currently, movies project at 24 fps. ATSC (the digital broadcast successor to NTSC) is typically 30 fps at 1080p, although it can also do 24 fps and the funky fractional rates that film and old NTSC video actually broadcast at (PAL is 25 fps, but I'm North American, so I am going to stick with what I know in this blog post). Blu-ray can do 24 fps at 1080p. So everything these days can handle 35 mm film's native projection rate of 24 fps.
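To keep those combinations straight, here's a tiny Python sketch. Note the table only covers the resolution/frame-rate pairs I mention in this post, not the full ATSC or Blu-ray specs:

```python
# Resolution/frame-rate combinations mentioned in this post (not the full specs).
SUPPORTED = {
    "ATSC":    {("1080p", 24), ("1080p", 30), ("720p", 60)},
    "Blu-ray": {("1080p", 24), ("720p", 60)},
}

def can_play(medium: str, resolution: str, fps: int) -> bool:
    """Check whether a resolution/fps pair appears in the table above."""
    return (resolution, fps) in SUPPORTED.get(medium, set())

print(can_play("Blu-ray", "1080p", 24))  # True: a standard film transfer
print(can_play("Blu-ray", "1080p", 48))  # False: no 48 fps support
```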
But really, 24 fps?!? Any hardcore gamer is going to tell you that frame rate is unacceptable. You want at least 30, if not 60 fps. As it turns out, so do people watching movies. This is why most decent TVs these days output at least 120 Hz video, which artificially raises the frame rate and leads to a smoother picture, whether the original material is 24 or 30 fps (or even 60, but that is typically reserved for video game output). And even newer TVs can do 240 Hz, which leads to an even smoother experience (supposedly; I have not experienced it myself).
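For a rough idea of what that "artificial" smoothing does, here's a toy sketch in Python/NumPy. It just blends adjacent frames linearly; real TVs do motion-compensated interpolation, so treat this purely as an illustration of taking 24 fps material up to a 120 Hz panel:

```python
import numpy as np

def interpolate(frames, factor):
    """Naive frame-rate upconversion: insert (factor - 1) linearly blended
    frames between each pair of source frames. Real TVs use motion
    estimation rather than plain blending; this is only an illustration."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append((1 - t) * a + t * b)
    out.append(frames[-1])  # hold the final frame
    return out

# One second of hypothetical 24 fps footage, as tiny dummy frames.
source = [np.full((2, 2), float(i)) for i in range(24)]
smooth = interpolate(source, 120 // 24)  # 5 output frames per source frame
print(len(source), "->", len(smooth))    # 24 -> 116
```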
But what about in the theater? Everything is still filmed and made for 24 fps, and that is what projectors (digital and analog) use. Well, two of the more cutting-edge directors currently working want to change this. James Cameron has talked about filming Avatar 2 & 3 at 48 or 60 fps. Peter Jackson has gone beyond talking and is actually doing it, filming The Hobbit at 48 fps. According to Jackson, the higher frame rate not only looks better, but it also makes 3D something that is actually bearable to watch (i.e., the nasty flicker goes away). And it turns out most digital projectors can have their firmware upgraded to project films at these higher frame rates. And since 48 fps is a multiple of 24, these directors could easily film at 48 fps and make 35 mm prints where every other frame is dropped so old projectors can still show the film at 24 fps.
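That 48-to-24 conversion is about as simple as frame-rate math gets: keep every other frame. A quick sketch (the frame numbers here are just stand-ins for real footage):

```python
def drop_to_24fps(frames_48fps):
    """Convert 48 fps footage to 24 fps by keeping every other frame."""
    return frames_48fps[::2]

# One second of hypothetical 48 fps footage, represented by frame numbers.
second_at_48 = list(range(48))
second_at_24 = drop_to_24fps(second_at_48)
print(len(second_at_24))  # 24 frames: what a traditional 35 mm print would carry
```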
But what about watching at home? While it's great that the select films I want to see in theaters (e.g., the Hobbit) will be projected at a higher frame rate, what about when I want to watch at home? This is where that 48 fps gets sticky. Neither Blu-ray nor ATSC supports 48 fps at any resolution. Both do, however, support 60 fps at 720p. And I am willing to bet that Netflix streaming won't handle either 48 or 60 fps until we all have fiber connections. Another issue with 48 fps is that it does not divide evenly into 120 Hz, making it potentially nasty to artificially raise the frame rate; luckily, 48 does divide evenly into 240 Hz.
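The refresh-rate arithmetic here is just divisibility; a quick check makes it obvious why 48 fps is the awkward one on a 120 Hz panel:

```python
def repeats_per_frame(source_fps, panel_hz):
    """How many refreshes each source frame gets, if it divides evenly."""
    return panel_hz // source_fps if panel_hz % source_fps == 0 else None

for fps in (24, 30, 48, 60):
    for hz in (120, 240):
        r = repeats_per_frame(fps, hz)
        note = f"{r} refreshes/frame" if r else "does not divide evenly"
        print(f"{fps:>2} fps on {hz} Hz: {note}")

# 24, 30, and 60 fps all divide evenly into 120 Hz; 48 fps only fits 240 Hz.
```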
So the question becomes: how will Hollywood handle this theater/home frame rate discrepancy that is coming our way? Well, this being Hollywood, I am willing to bet they do it in the way that screws over the home user the most, forcing you to go to the theater for the best experience. That would support the idea of going with 48 fps for theater projection. That way they still get a higher frame rate that helps out 3D projection, they can easily cut 24 fps versions for 35 mm prints, and they can stick with Blu-ray and such at 24 fps as well. All of this while keeping a nice hook to get people who care about video quality to come to the theater and pay for an expensive movie ticket. They can then come out with a new home video standard that supports 48 fps and force us all to buy our movie collections yet again (you know George would be happy to sell you Star Wars for the bazillionth time on a new format, although unless he lets Han shoot first I ain't going for it). If they go with 60 fps they can at least do 720p60, but then again, do we really want the drop in resolution?