Turned off the "feature" as soon as the TV was on. I've been advising friends and adjusting their expensive OLED TVs as well. In every case they were grateful for me "fixing" their TVs, which only took a few minutes.
People just don't want to muddle around in the settings. And herein lies the problem. It would be great if manufacturers would open up their interface settings to be influenced more easily by media players or set-top boxes. Those, in turn, could have a ruleset: if 24p, disable motion blur; else enable motion blur. If sports, enable motion blur. And so on.
It's not even limited to motion blur; I have low-light and normal-light profiles I'd like to have changed automatically based on the actual light in my room.
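Roughly the kind of ruleset I have in mind, sketched in Python; the settings call and field names are made up for illustration, since no such interface exists today:

    # Hypothetical glue code a media player or set-top box could run if TVs
    # exposed their picture settings. The API and field names are invented.
    def choose_settings(frame_rate_hz, content_type, room_lux):
        """Pick TV settings from what the player already knows about the content."""
        smoothing = not (content_type == "film" and frame_rate_hz in (23.976, 24.0))
        if content_type == "sports":
            smoothing = True
        profile = "low_light" if room_lux < 50 else "normal"
        return {"motion_smoothing": smoothing, "picture_profile": profile}

    # A 24p film in a dim room -> smoothing off, low-light profile.
    print(choose_settings(24.0, "film", room_lux=20))
    # A 59.94 Hz sports broadcast in daylight -> smoothing on, normal profile.
    print(choose_settings(59.94, "sports", room_lux=300))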
Motion smoothing is designed to reduce motion blur. It doesn't always work, because it's just heuristic processing that attempts to estimate the missing information, but when the motion is easily predictable (pans are an obvious case) and the exposure time (shutter angle) is low, it can do an excellent job of interpolating the missing frames. This eliminates the sample-and-hold blur[0] that would otherwise appear. It's motion sharpening, not motion blur.
[0] Blur Busters has a good introduction that's relevant to all non-flickering displays, not just OLED: https://www.blurbusters.com/faq/oled-motion-blur/
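To illustrate what "interpolating the missing frames" means for the easy case of a global pan, here's a toy sketch in Python/NumPy; real processors estimate motion per block and handle occlusions, so treat this as a sketch of the idea, not what any TV actually runs:

    import numpy as np

    def estimate_pan(prev, nxt, max_shift=8):
        """Brute-force search for the global (dy, dx) shift that best maps prev onto nxt.
        Toy version: real interpolators estimate motion per block, not per frame."""
        prev = prev.astype(np.float64)
        nxt = nxt.astype(np.float64)
        best, best_err = (0, 0), np.inf
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                err = np.mean((np.roll(prev, (dy, dx), axis=(0, 1)) - nxt) ** 2)
                if err < best_err:
                    best, best_err = (dy, dx), err
        return best

    def midframe(prev, nxt):
        """Synthesize the frame halfway between prev and nxt by moving each half of
        the way along the estimated pan and blending. Works well on smooth pans,
        which is also where TV motion smoothing does its best work."""
        dy, dx = estimate_pan(prev, nxt)
        a = np.roll(prev.astype(np.float64), (dy // 2, dx // 2), axis=(0, 1))
        b = np.roll(nxt.astype(np.float64), (-(dy - dy // 2), -(dx - dx // 2)), axis=(0, 1))
        return (a + b) / 2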
I work in the lighting control industry. Your final paragraph piques my interest. I’m curious, why automatically change the TV if you could automatically change the room? Shades drop, light intensity lowers, CCT warms, etc...?
This whole thread and article are like they're from an alternate universe to me. All of my tech-savvy friends go to great lengths to turn motion smoothing on everywhere they can ("Smooth Video Project" is the best I know of for PC) so that panning shots don't look awful. They wouldn't play video games at 24Hz, so why should they watch a film at 24Hz?
It does introduce artefacts. The solution to all of this is for filmmakers to move on from the 1930s and film at 120 fps or higher. If people only want to watch a quarter of the frames, it can be downsampled easily.
The argument reminds me of musicians who didn't want their albums chopped up and sold as individual tracks, to be consumed however listeners desired.
Panning shots probably look awful if you're watching 24 fps content on a device running at 60Hz (many consumer devices, computers, and Blu-ray players do this by default). Since 24 frames can't be evenly spread out over 60 frames, you'll see motion judder, and it does look terrible.
However, 24 fps running at a true 24Hz is perfectly fine, and you will not see flicker - at least not if produced by competent filmmakers. That's because the camera's shutter angle is normally set such that motion blur makes the motion look smooth. Real cinematographers also know exactly how fast they can pan with a given lens without causing any noticeable strobing motion.
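To put a number on the shutter-angle point: with the common 180-degree shutter, each 24 fps frame is exposed for about 1/48 s, and that exposure is what blurs a pan into smooth-looking motion. A quick back-of-the-envelope calculation (illustrative only):

    def exposure_time(fps, shutter_angle_deg):
        """Per-frame exposure: the shutter is open for (angle / 360) of each frame period."""
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_time(24, 180))  # ~0.0208 s (1/48 s): the standard film look, enough blur to smooth a pan
    print(exposure_time(24, 90))   # ~0.0104 s (1/96 s): less blur, motion starts to strobe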
Comparisons to video games completely miss the point. Most video games are unable to properly simulate motion blur (some try, but it doesn't usually work well), so you have to have high frame rates for things to look smooth. Since you need to react quickly in video games, high frame rates (combined with the lack of motion blur) are also helpful in preserving crisp edges on moving objects (so it's a practical advantage). And finally, video games generally try to simulate reality for the player, so players are more concerned that the technology makes the experience believable rather than intentionally suspending disbelief (as in film or theater) to passively unpack the narrative nuance of artwork unfolding on a screen.
For those same reasons, sports also do well with higher frame rates, but when fiction (which is obviously disconnected from the present reality) tries to use high frame rates, it falls into "uncanny valley" territory - much like the computer-generated humans of movies like "The Polar Express". As others have noted, a few directors have really tried to push HFR film to the public, but it has never been received well (whether or not HFR is recognized as being responsible for the uneasiness felt by audiences watching HFR content).
As for HFR content being "easily" downsampled - it really isn't, at least not with respect to motion blur (which is an essential point that many people miss in these discussions).
The ultimate solution here, I think, is to allow creators to encode the desired setting, frame rate, or content category in the metadata so that a user can benefit from the technology when watching sports without it detracting from a film later on. Then the default setting could be "auto" and diehards could override one way or the other.
I'm glad to see a brief tip to that solution in the article and hope it becomes part of an industry specification.
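Purely as a sketch of what such metadata might look like; the field names below are invented for illustration and aren't part of any existing standard:

    # Invented per-title metadata a streaming app or disc could carry, and how a TV
    # in "auto" mode might honor it. No real standard defines these fields.
    title_metadata = {
        "native_frame_rate": 23.976,
        "content_category": "film",  # e.g. "film", "sports", "news", "animation"
        "creator_intent_motion_smoothing": "off",
    }

    def auto_motion_smoothing(meta, user_override=None):
        """Respect an explicit user override first, then creator intent, then category."""
        if user_override is not None:
            return user_override
        intent = meta.get("creator_intent_motion_smoothing")
        if intent in ("on", "off"):
            return intent == "on"
        return meta.get("content_category") == "sports"

    print(auto_motion_smoothing(title_metadata))        # False: a film whose creator says off
    print(auto_motion_smoothing(title_metadata, True))  # True: a diehard overrides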
"looks like a soap opera" exposes everything you need to know about the ignorance behind this position.
Reality does not judder.
If you want judder for some artistic purpose, you can have that, just like any other reality-distorting filter.
Some shots are intentionally blurry for valid artistic purposes. Should all movies and TVs be blurry at all times? Some shots are intentionally monochrome for valid artistic purposes. Should all movies and all TVs be B&W or red-tinted at all times? Some shots are intentionally too bright or too dark, for valid artistic purposes... You want stroboscopic stop-motion, go ahead and use it.
Superior reproduction technology in no way prevents you from creating a scene that has that and any other kind of artificially applied crappiness, like static interference or low resolution or analog scan lines... But crappy reproduction technology does preclude faithful reproduction the other 99.9% of the time, when you don't want an artificially crappy result.
Video tape was cheaper than film, and TVs and video tape ran at higher frame rates than film, and soap operas were cheaper than movies, and so the one superior aspect of soap operas became associated with "cheap".
It's a freaking ignorant association is all. It's obvious as hell, and you have to be a moron not to recognize why you think "this looks like a soap opera" and see past that.
> Superior reproduction technology in no way prevents you from creating a scene that has that and any other kind of artificially applied crappiness...
The technology to produce movies at high frame rates has been around for a long time (and projection/display technology could have supported it much sooner if the content were there), and yet directors have deliberately, almost universally chosen what you categorize as "artificially applied crappiness" (24 fps). Practically speaking, it is artificially applied, and that's exactly it - it's an intentionally distorted version of reality, and it's that way on purpose. You're free to attempt to remove that filter (by smoothing motion on your TV), but that doesn't represent what the director intended.
Even with "superior reproductive technology", directors are still choosing to produce films at 24 fps, so can you at least appreciate and respect that that's how most of them intend their work to be displayed? True, "reality does not judder" (proper 24 fps content doesn't either, by the way), but reality isn't what filmmaking is about. To argue otherwise misses the point of film as an artistic medium. For other types of content, I agree - reality is the target, but not in film.
Seriously, if people are this determined to tie themselves into knots trying to defend their ridiculous bias that smooth motion = soap opera, it kind of makes me despair about whether it's actually possible to change anybody's mind about anything, ever.
Luckily, the fact that TVs with motion smoothing are selling suggests that the rest of us have spent the two minutes necessary to get used to it and realized that it's superior.
Neither do 24fps movies, except when you play them back on home equipment that doesn't display at an integer multiple of 24fps.
Judder is an artifact of the difference between home and theatrical display and preserving it (and giving alternatives a bad rep) is a deliberate attempt to keep the theatrical experience as a premium venue for viewing studio output.
Fighting for judder isn't about preserving anyone's artistic vision. And the people saying it is aren't even good liars.
The premise of the 24 fps movie rate is that the human eye can't detect the difference at higher frame rates. But the higher frame rate TV demos look amazing, and movies that are resampled look like crap to people, because they CAN see at more than 24 fps.
I’m not saying that the resampling is right, but I am super questioning this film rate in a digital age.
I'm also questioning the algorithm that just creates a messy blur of nonsense between two frames. Surely we can create a smarter algorithm.
This isn't really true; it's been known since before the dawn of cinema that humans can appreciate higher frame rates.
24FPS was chosen as much for economic reasons as anything else - it’s one of the slowest frame rates cinema could get away with while still delivering a satisfactory experience.
Remember that higher frame rates meant higher consumption of expensive film before the digital era. Lowering the FPS to the lowest you could reasonably get away with reduced the cost of materials significantly.
That it's still popular today is at least partly a testament to the conditioning effect of decades of 24FPS movie consumption. Audiences often find the higher-FPS look and "feel" strange, at least in a cinema context; witness the backlash against the 48 FPS cinema presentation of the Hobbit movies.
I strongly believe that if the industry had settled on, say, 48FPS as the standard, we'd be conditioned to "expect" that look and complain that 24FPS looks strange.
The best software for this I've seen is Tachyon, but it's way above consumer technology level in cost and compute resources. https://cinnafilm.com/product/tachyon/
As with the interface and OS, I assume the TV manufacturers are mostly using the cheapest implementations they think they can get away with without pushing up their costs.
Agreed. It might seem weird at first, but I think that in the end it'll be like transitioning from B&W to color. Our brains will adjust and we'll realize that we were missing out on all that fidelity.
60fps changes your perception of a movie. For some reason, at ~24fps you get the cinematic feeling, while at 60fps you get an "amateurish" videographer feeling. Not sure what the science is behind that.
What TV manufacturers are doing is unacceptable. It introduces strange artifacts and the motion looks very artificial compared to a native 60fps video. So it really doesn't solve anything.
The argument that it's the filmmakers' fault for using 24fps is flawed. What about animation? What about stop motion? Those may never be produced at a higher framerate. Movies with high VFX budgets will be much more expensive due to the additional frames when rotoscoping and compositing.
So if it is inferior and only makes sense for a subset of productions, why is it the default?
In many cases the bulk of the VFX budget is artist wages, not CPU time. If you already have the motion paths set up then it doesn't take much more human effort to render more intermediate frames. You might even be able to reduce the cost per frame by exploiting the smaller differences between frames, e.g. with temporal noise reduction.
Hand drawn or hand posed animation is the rare case where increased frame-rate really would take more human effort, but it's often already produced at extra-low framerate (e.g. 12fps or 8fps), with frames duplicated for playback. The same duplication can happen with higher playback framerates. Modern video codecs handle duplicated frames well.
One problem is that TVs run at 60fps and don't support adaptive sync. 60 is not evenly divisible by 24. Without some kind of trick you're padding to 30fps by duplicating every 4th frame, or trimming to 20fps by dropping every 6th frame.
Padding is usually what's done. You can see it in NTSC (30fps) DVDs which come from something originally produced for PAL (25fps). Every 5th frame is duplicated.
It makes me curious what framerate movie projectors run at. Maybe the "cinema experience" includes goofy frame duplication and nobody realizes it.
Traditionally, each film frame is projected 3 times. The psycho-perceptual reason is that the phi phenomenon ("light chaser effect") kicks in at about 12–15 Hz, so the camera frame rate must exceed that to give a perception of smooth motion. The critical flicker frequency is around 50 Hz for most people, so the projected frame rate must exceed that to avoid the appearance of flickering.
Silent movies were shot at 18 fps and projected at 54 fps. The frame rate for talkies was boosted to allow enough bandwidth for the audio. The reason silent movies look jerky today is that every third frame is projected twice. They are smooth when properly projected.
My kids aren’t bothered by high frame rates, so personally I suspect it only bothers adults who are used to 24fps.
> And an entire cinematic language has developed around the rate of 24 frames per second — the way actors perform, the way shots are composed and cut and cameras move. (This is why an awards show or a news broadcast shot on video at a higher frame rate looks and feels different from a film.)
This is a real stretch. Very, very few filmmakers are doing anything specific for 24fps that they wouldn’t do in 48 or 60. If they did, they’d slow things down so you could see them, but instead we have ever faster and faster fight sequences in Marvel and Transformer movies where you can’t even see the details during the action.
The “cinematic language” of a news show vs an action movie is different, but has almost nothing to do with frame rate, and if frame rate was the main issue, we’d be doing news in low frame rate and action movies in high frame rate.
Horizontal pans in 24fps have started to drive me crazy. Films do it all the time, and you can barely see anything while that's going on. Higher frame rate pans are so much easier to watch, even if they make the movie look like BBC TV.
So this is what made the new smart TVs at the showroom feel "too realistic" and "less cinematic" to me. I wrongly thought it was a side effect of super high resolution - I didn't even know this was a thing before I read this piece.
This has no value except as a gimmick which can be used to say "look how smooth/real it is". Worse is to set this to "on" by default. They should have just had a button on the remote with a marketing-inspired name, like "motion flow" or something, which people could have turned on if they wanted this.
This is exactly the reason, and it makes everything look like a horrible soap opera. 24fps and motion blur are a lot of what makes movies look like movies. Otherwise it turns into someone's home video.
For decades broadcasters have converted movies to NTSC video using three-two pull down, where each movie frame is converted to either three or two video fields in order to convert the 24 frame rate to the 60 field rate (well, actually 29.97 frames/second to 59.94 fields/second). https://en.wikipedia.org/wiki/Three-two_pull_down
How can the new smoothing conversion be worse than that?
That is copying the original frames pixel for pixel. Motion smoothing involves interpolation heuristics (if you're lucky) between two original frames, creating an artificial frame that the film creator never intended nor vetted.
To me it's like music creators telling people not to listen to their music on v-shaped headphones. Yeah, it's not what you intended. But that's what they've chosen to use.
Also, if they want the picture to be displayed in a certain way, couldn't they push the movie as 60 frames per second when in fact it's 24 frames per second with duplicated frames? Wouldn't that effectively disable motion smoothing?
Nobody’s telling anyone to do anything. The problem people are talking about is that this setting looks like absolute garbage and it’s inexplicably turned on by default on almost all TVs these days. The only people who think otherwise are people who just literally do not care at all about picture quality and probably also do stuff like stretching the picture to get rid of letterboxing. These are not the people that default settings should be based on.
And to me it is like a painter complaining about a museum which shines an RGB disco light onto a piece of work whose colors have been purposefully considered.
Music, as well as film, is a piece of work that only really exists in its projection, a playthrough, or a concert. Everything that mangles that projection without being a willful choice of those presenting it makes it harder for artists to speak to an audience in the way they intended.
There are perfectly fine reasons to change a projection, e.g. for the hearing or visually impaired, or because you just prefer it that way, but ultimately it should be a willful decision to deviate from the expected default.
That has its own issues - you can't completely "smoothly" convert 24FPS source material to 60FPS, and many viewers can detect the uneven frame cadence that results from the inconsistent introduction of extra frames needed to fit the 60FPS cadence. You can often see it in slow panning shots, where it introduces a perceptible judder to the video.
While the Wikipedia article on 3:2 pulldowns mainly discusses converting 24FPS film to 30/60 FPS broadcast, the same conversion is virtually always done on digital material as well, in software at playback time or by the video codec changing the frame rate.
As mrob also writes, you usually only get smooth frame rate conversion if the new frame rate is an integer multiple of the original one, as then the extra frames are introduced consistently. This is why 30FPS video doesn't have this problem at 60Hz, but 24FPS does.
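You can see the uneven cadence just by counting how many display refreshes each source frame is held for; a quick sketch of the arithmetic (nothing here is specific to any real TV, it just assumes the display repeats whichever source frame is current):

    import math

    def repeat_counts(source_fps, display_hz, n_frames=8):
        """How many consecutive display refreshes each source frame occupies."""
        edges = [math.ceil(i * display_hz / source_fps) for i in range(n_frames + 1)]
        return [edges[i + 1] - edges[i] for i in range(n_frames)]

    print(repeat_counts(24, 60))  # [3, 2, 3, 2, ...]: the uneven 3:2 cadence behind judder
    print(repeat_counts(30, 60))  # [2, 2, 2, 2, ...]: even cadence, no judder
    print(repeat_counts(25, 30))  # [2, 1, 1, 1, 1, 2, 1, 1]: roughly "every 5th frame duplicated"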
60 isn't an integer multiple of 24, so it would result in uneven frame timing. It would look worse than playing 24fps with every frame shown for the same time, which modern TVs can do.
I must admit, I might not even care that much if the processing weren't so flawed. What drives me mad is those weird patterns or glitches, e.g. when someone walks in front of a background with a high-contrast pattern on it while the camera pans along and zooms out slightly. It usually creates very visible artifacts around the person.
A year or two ago, the TV my roommates and I had would produce weird artefacts while watching sports. I wasn't sure if it was the TV or the broadcast, but the effects were things like people's eyes moving relative to their head, or the logo on a shirt moving relative to the shirt. They'd sort of bounce up and down. Is this motion smoothing?
My guess at the time was that it was an image stabilization algorithm applied by the studio, since sports broadcasts involve long-range videography. Presumably the logo or eyeballs were getting "stabilized" in place while the rest of the person wasn't, making them appear to jiggle up and down relative to the person's movement. It was eerie. But these were major sports streams, like World Series baseball, and I'd be surprised if they messed it up that badly.
I don't get it - what's the "debate"? According to the article there is literally no advantage to motion smoothing. When motion smoothing is off, if the video is shot at high frame rates, like TV or sports, it displays it that way, as people like, and if the video is a film at 24 fps, it displays it like that, as people also prefer. The article seems to try to engineer controversy by vaguely hinting that sales could be lost if smoothing weren't on for certain key edge cases, like in-store demos, but that's not the case. So the question is why it's the default at all - what's the issue?
I remember when a friend of mine got their first HD TV and Blu-ray; none of us had ever seen that stuff. We were still used to CRT screens, and at best DVD.
We borrowed a movie, fired it up - and everyone thought there was something wrong with it. The image quality looked almost like something from a handheld camera, just very non-cinematic. We returned the movie and went with another one.
Nope, same problem. We just figured that's the way HD looked.
Excuse me for being ignorant, I don't really watch much TV, but:
Why do they need this feature for sports? Couldn't they broadcast the sports at 60Hz? If they don't broadcast sports at 60Hz, then what frame rate are they broadcasting at, and what are the 60Hz TVs intended for in the first place?
60fps is still too low for very fast motion, of which sports is a notable example. And the focal point of most sports is a ball, which moves along predictable paths, so motion smoothing can do a good job of increasing the frame rate to the point where the sample-and-hold blur of 60fps isn't visible.
Filmmakers filming in 120fps would destroy the film industry overnight because NO ONE would watch what they produced.
I can't help myself; I end up hunting for artifacts to be annoyed by rather than enjoying the show.
Every TV that uses motion interpolation is the same: it's horrible.
https://frames-per-second.appspot.com