Another aspect of this is staffing. It doesn't immediately seem tech related, but it is.
There are fewer people in the background as extras, and fewer people making background scenes or setting up. It can mostly be filled in by production technology if necessary.
Some movies just seem so empty, and they look "cheap" if you see them from this angle. Take a big Hollywood production like Passengers, where barely more than three actors are visible in the whole film. I've noticed a lot of these hollow/empty productions with very few actual actors visible, and they are rather lifeless.
That can't be the result of using LED lighting, because modern LEDs are really good and don't require much correction in post, yet movies still look like that. In fact, I don't think you can pinpoint this to a single technical reason.
It's just that the industry converged on the same recipe book - some of that is cost cutting, some is just massive amounts of A/B testing over the years.
Demand leads to optimization, which in turn leads to convergence and a less varied supply. Which is kind of paradoxical, considering the low barrier to entry and the creative freedom available with today's tools. And yes, it really started to happen somewhere in the early 2000s, not just with movies and their look but with everything related to visual entertainment, from games to illustrations.
The number one thing that bothers me about the "netflix aesthetic" is the weird-looking background bokeh that every scene seems to have.
It looks like a cheap Gaussian blur filter applied in post-production, and it adds to the fakeness. This is probably due to the optics in some of the latest RED digital cameras, but it's still not convincing. Kind of like 48 vs 24 FPS.
I will never accept the hate for 48 FPS. 24 FPS looks like garbage. Panning shots are a stuttery mess. 24 FPS looks somewhat okay when the camera is still and only the people in it are moving, but the ridiculous choppiness ruins any moment with camera movement.
A higher refresh rate is simply an objective improvement. I don't care that you associate it with bad soap operas.
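For a sense of why 24 FPS pans stutter, the arithmetic is simple: divide the distance the image travels by the number of frames it has to cover it in. A quick sketch (the pan duration and resolution are made-up example numbers):

```python
def frame_jump_px(frame_width_px: int, pan_seconds: float, fps: int) -> float:
    """Pixels the image shifts between consecutive frames during a
    pan that crosses the full frame width in pan_seconds."""
    return frame_width_px / (pan_seconds * fps)

# A 5-second pan across a 1920-pixel-wide frame:
jump_24 = frame_jump_px(1920, 5, 24)  # 16 px jump between frames
jump_48 = frame_jump_px(1920, 5, 48)  # 8 px jump: half the stutter
```

A 16-pixel jump between frames is well within what the eye tracks as discrete steps rather than motion, which is the choppiness being complained about.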
This trend started when the first video-capable full-frame DSLRs appeared on the scene about a decade ago, and they led to a mixing of the visual languages of photography and filmmaking. It's compounded by lens makers working hard to remove flaws from their products, making their images more and more perfect, but also removing any character a lens can have.
The classic Super35 frame size is more akin to APS-C in digital photography, whereas the new top-tier cameras use full-frame sensors, whose image diagonal is about 1.5x larger.
The larger the sensor, the shallower the depth of field at the same framing and aperture, and so the more background bokeh you get.
Modern focus-pulling tech allows extra-shallow depth of field even when the actors and the camera are moving. Something that 30 years ago would have been mostly out-of-focus footage can now easily be shot with a small crew.
Large-aperture photography lenses and large sensors also mean much better low-light capability, which means you can shoot with fewer studio lights, in a much lighter setup, with a smaller crew, less planning, etc.
Blurry backgrounds also allow for cheaper sets.
Which means the cheaper the production budget is, the more likely they'll go with shallow depth-of-field shots.
And we the viewers will inevitably connect the cheapness of production with these visual cues.
In contrast, when an older movie had shallow depth of field scenes, those were expensive shots that had to be carefully planned, and painstakingly executed with "flawed" optics which gave them a lot of character.
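The sensor-size/depth-of-field relationship above can be sketched with the usual equivalence rule. A minimal sketch, assuming a ~1.5x Super35-to-full-frame crop factor (exact sensor dimensions vary by camera):

```python
# Depth-of-field equivalence between sensor sizes: at the same field of
# view and subject distance, a smaller sensor at f/N has roughly the
# depth of field of a larger sensor at f/(N * crop_factor).
CROP_SUPER35_TO_FF = 1.5  # approximate; varies slightly by camera

def ff_aperture_matching_s35(s35_f_number: float,
                             crop: float = CROP_SUPER35_TO_FF) -> float:
    """f-number a full-frame sensor must stop down to in order to
    match the Super35 depth of field at the same framing."""
    return s35_f_number * crop

# Super35 at f/2.8 looks roughly like full frame at f/4.2, so a
# full-frame camera left wide open at f/2.8 renders a visibly
# shallower, blurrier background for the same shot.
print(ff_aperture_matching_s35(2.8))
```

In other words, just moving productions from Super35 to full frame at the same f-stops buys about a stop of extra background blur for free.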
One aspect of this I’ve talked about before is the phenomenon of the “permanent golden hour”: what used to be a narrow sliver of time when you could get the most dramatic lighting is now easily replicated in post. As a result, tons of movies have all their outdoor shots set in this incredibly unnatural-looking light that really pulls me out of the movie. I remember the banker-in-Switzerland scenes in The Wolf of Wall Street, with the open window; it looked so hilariously fake I thought Scorsese was doing a bit. Nope, that’s just how outdoor scenes are lit now.
I've said it and I'll say it again: some if not most of the reasons it looks weird to us (people who grew up with a TV) come down to camera quality.
Most TV camera setups were about lighting and how much of it was needed.
For a few years now, cameras have been so good that we get almost real blacks and real whites, sometimes even in the same frame, and it's naturally filmed and doesn't look like total shit. We have crazy aperture sizes and we use them, hence the oft-criticized Gaussian-blurred backgrounds. We get very sharp images even in very bad lighting conditions, so some movies are very dark.
I think the younger ones are already so used to it that they wonder "how odd old films look".
Back in the 80s we had films that were in color. These days, films are all in blue and orange. The blue and orange look is as bad as 1930s two-strip Technicolor, where the palette was green and red.
Case in point: House of the Dragon. The beautiful sets and costumes are all hard to enjoy when everything is drab blue and orange. Take a look at the picture:
https://www.theguardian.com/tv-and-radio/2022/sep/15/house-o...
This happens in movie after movie after movie.
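For what it's worth, the grade itself is mechanically simple, which may be part of why it spread so widely. A toy per-pixel sketch of a teal-and-orange split-tone (the threshold, strength, and colour anchors here are arbitrary illustrative choices, not anyone's actual LUT; real grading works with lift/gamma/gain or 3D LUTs):

```python
# Push shadows toward teal and highlights toward orange, blended by
# luminance. This is why every skin tone drifts warm and every shadow
# drifts cyan under the look.
TEAL = (0, 128, 128)
ORANGE = (255, 128, 0)

def teal_orange(pixel, strength=0.25):
    r, g, b = pixel
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luminance
    target = ORANGE if luma > 127.5 else TEAL    # highlights warm, shadows cool
    return tuple(
        round(c * (1 - strength) + t * strength)
        for c, t in zip((r, g, b), target)
    )

print(teal_orange((200, 200, 200)))  # bright pixel pulled toward orange
print(teal_orange((50, 50, 50)))     # dark pixel pulled toward teal
```

Because it is a single global mapping, it flattens whatever palette the production design actually had, which is exactly the complaint about the sets and costumes above.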
That late-80s film is set in daylight in a brightly lit joint, so that the cameras could actually record it. The other scene is in a dimly lit dining room with directly visible light sources (that don't saturate). You can bet that 80s film could not have captured this kind of low-light, high-contrast scene.
But another important factor was that movies couldn't be too dark, because TVs couldn't reproduce them properly.
We all now have devices that can display the full SDR range, and HDR is becoming more common.
Filmmakers used to be forced to shoot and deliver bright images, so now the pendulum has swung to the other side. It's probably going to stabilize at some point.
It's hinted at here, but film produces a much better overall look than digital. It is just that digital is way cheaper and easier to work with.
There is certainly more versatility in what you can create in post production with digital but the base look of what is captured is more beautiful on film, and this is the same with photographs. Look at high quality old images captured on film, they blow most digital images out of the water. Much higher dynamic range and depth of colors and contrast.
This is why Tarantino, Nolan, and P.T. Anderson still shoot only on film. I think this is most glaring in the mid- to low-budget range of films, which can't make up the ground with production techniques and post. Old 80s and 90s mid-range films shot on film look so much better, especially in night scenes.
I'm highly skeptical of the technical claims you're making here.
Quality photographic paper has about 10 stops of dynamic range (which is why the zone system for composition and printing goes up to X). While the negatives themselves may have slightly more dynamic range, there's also the Clarkvision page [1] indicating they in fact have less. This makes sense: you have less area to work with, and need to get something done with bigger grains because of light and time constraints.
You may well think it looks better, but it does not capture more information. More noise, yes. Bleeding artifacts you like, sure. More information, no.
[1]: https://clarkvision.com/articles/dynamicrange2/
Tarantino uses film because he likes the artifacts film gives, not because it's objectively better. Same for PT Anderson. Indeed, both directors favor grungy 1970s aesthetics, which should give you an indication where they're coming from, stylistically. Nolan typically uses large film gauges like 65mm which are not comparable to 'regular' 35mm film.
Additionally, film does not have more dynamic range than digital. Take the latest sensors used in the Arri Alexa cameras, for instance: they have a range of 17 stops. That is better than regular 35mm film. [1]
[1] https://www.arri.com/en/camera-systems/cameras/alexa-35#2730...
Digital is vastly superior to film for photography. I shoot both, sometimes side by side. What differs is that film is pleasing to the eye without significant post-processing; digital requires it.
But in both cases, if you get your photo right in the viewfinder before it goes into the camera, digital always comes out better in the end. There’s simply so much more information captured, and you can in fact get more stops of DR with digital than with film.
Edit: To add to the above, I use a light meter and check DR in post processing. Film is usually good for 8-9 stops of DR (depending on film type) and is better in overexposed scenes with bright lights. Digital generally gets 12-14 stops of DR and handles underexposed scenes better at constant ISO.
Another factor is that in dark scenes, thanks to its finer "grain", digital is capable of running at higher ISO while still producing recoverably usable photos with denoise filters (Nik’s tools). On my current rig ISO 3200 is usable; on film, ISO 800 is typically the max I would run. There are a few good ISO 1200 films, but they are black and white. ISO isn’t directly analogous between film and digital, but it’s close enough for the purposes of this comparison.
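For reference, the "stops" being compared throughout this thread are just doublings of captured light, so the gaps compound quickly:

```python
import math

# One stop = one doubling of light. A medium's dynamic range in stops
# is the log2 of the contrast ratio between the brightest and darkest
# values it can usefully record.
def stops(contrast_ratio: float) -> float:
    return math.log2(contrast_ratio)

def contrast_ratio(n_stops: float) -> float:
    return 2.0 ** n_stops

# The gap between ~9-stop film and ~14-stop digital is not "5 better":
# it is a 32x difference in recordable scene contrast.
print(contrast_ratio(14) / contrast_ratio(9))
```

That exponential scale is why a few extra stops of sensor range make dimly lit, high-contrast scenes practical to shoot at all.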
None of this is true, nor does it need to be. As my photography professor says, “digital is better, but I don’t want better.”
Roma is as beautiful as anything shot on film, Amelie from 1995 is as teal-orange medium-shot photo filter as anything on Netflix. The Crown is often beautiful because of impressive practical lighting effects.
[EDIT: Amelie is from 2001, I mixed up the date with the same director’s City of Lost Children from 1995]
Film has become vastly inferior to digital both in stills and cinema if we're talking about technical aspects like dynamic range and colour accuracy.
The problem is, just as with CGI, which has gotten pretty good at modelling light transport, it feels off because it's too clean and clear. Digital is too good at capturing reality. Film introduces a bunch of artefacts and limitations that 'look' nicer than accurately captured reality. In part this might be because most of us grew up watching cinema styled that way; in part because it leaves more room for the brain to compensate for those limitations with imagination.
With the advent of easily available stylisation AI trained on film, we should see digital close that gap any time now.
https://www.imdb.com/title/tt11271038/fullcredits/?ref_=tt_c... It might be shot on film, but it's graded digitally, so the argument doesn't really apply here. The colours you see are a choice made in the DI stage.
> film produces a much better overall look than digital.
No, not anymore; that hasn't been the case since about 2010. (And no, ignore RED cameras - they are terrible at colour.) All the (big) films you've seen since about 2007, and all films since 2012, have been graded digitally (apart from maybe a Mike Lynch film). This means the colours are what they are not because it's film, but because of a choice made by the Director of Photography, or the producer, or the director. The thing you think is a filmic look is mostly digital. Yes, that includes films that were shot on film, like the Nolan Batmans.
Yes, that means even the film you saw at the cinema on a film projector was graded digitally; it was just lasered out to film and printed for distribution. The other bit to remember is that new prints look much nicer than old prints. Old prints look like shite: dirty, scratched, and sometimes washed out.
"Real" film shot now is nothing like the grainy stuff of 25 years ago; it has far higher dynamic range and much smaller grain. It's just that digital cinema cameras are now even better and more consistent. This crossover happened around 2012.
Those 80's and 90's night scenes aren't better because they're shot on film. They look better because they're better LIT.
Digital is easier to work with in low light, so some productions will try to save money by cutting back there. You can't really do that with film without it being painfully obvious.
This is utterly false and bordering religious beliefs.
Steve Yedlin has dedicated decades of his life to ultimately prove [1] that digital and film can achieve the same exact look, save for the fact that digital is infinitely easier to deal with.
I'm pretty sure at this point high end digital cinema cameras can straight up outperform film in all but the most extreme cases (like giant IMAX negatives maybe) but that's also a much more recent development than most people would think.
More freedom in post allows more freedom to do things audience may or may not like but I don't think it's technical limitations holding things back anymore.
Similarly, I'm a bit dubious about the LED lighting comment by the anonymous redditor.
Yes, LED color rendering used to be pretty horrible, but these days it really doesn't have to be. All the major LED manufacturers have high-end white parts for color-critical applications, and even color mixing is actually good now with the advent of phosphor-converted broad-spectrum colored LEDs.
All this stuff increases the available creative space, but at the same time it might not be easier to find the good stuff in a larger space.
There is some small truth to this - at least for image acquisition. The Bayer filters in modern digital sensors are way too broadband, which perhaps counterintuitively reduces saturation, but more importantly reduces color accuracy from a human perspective. This is especially apparent in the green-red spectrum.
The sensitivity curves of Kodachrome 64, and presumably 35mm motion picture stock, much more closely match the CIE 1931 effective bandpass curves, resulting in much more accurate color.
However, most of this can easily be fixed in post with modern tools. Almost all of what you are seeing today is an artifact of the director's/cinematographer's/editor's artistic choices, not the acquisition tech.
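A toy model of the broadband-filter trade-off described above: if the colour channels overlap heavily, the recorded image is desaturated, and the matrix that undoes the mixing in post also amplifies channel noise. Two channels only, with a made-up overlap figure; real sensors use a full 3x3 colour correction matrix:

```python
overlap = 0.4  # assumed fraction of green light leaking into the red
               # channel and vice versa (illustrative, not a real sensor)

def capture(true_r: float, true_g: float):
    """Broadband channels mix the true colours together (desaturation)."""
    return (true_r + overlap * true_g, true_g + overlap * true_r)

def correct(meas_r: float, meas_g: float):
    """Invert the 2x2 mixing matrix [[1, o], [o, 1]] to recover colour.
    The 1/det factor is what also scales up any sensor noise."""
    det = 1 - overlap * overlap
    return ((meas_r - overlap * meas_g) / det,
            (meas_g - overlap * meas_r) / det)

measured = capture(1.0, 0.0)   # a pure red patch comes off the sensor muddy
recovered = correct(*measured) # back to pure red, noise amplified by ~1/det
```

So the "fix it in post" path works, but it trades colour accuracy problems for a noise penalty, which is one reason narrower, more film-like filter responses still matter at acquisition time.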
Soderbergh has shot his last few films on iPhone and they look fantastic. Lighting and framing while shooting and good editing and grading afterwards are far more important factors.
Vignette is a useful and appealing effect. That doesn’t mean a lens with vignette is superior. The images straight out of the camera may be more appealing. Yet on the other hand, with a low vignette lens, you can add exactly as much or little as you want in post.
Film vs digital seems very similar, with a greater degree of “natural processing” happening in film.
> film produces a much better overall look than digital
To add one more data point to all the other sibling comments, Vince Gilligan (a self-described huge fan of film) talked on one of the Better Call Saul podcasts about how they did a blind test, and the consensus was that digital looked better. This was around five years ago.
Star Trek: The Next Generation was famously shot on film. The HD scans are so good that the props can be seen to be fake in many places. Later franchise shows were finished on standard-definition video, so they simply cannot be released in HD: there is no HD source material to go back to.
> This is why Tarantino, Nolan, and P.T. Anderson still only use film.
Or maybe they are just old and incapable of adapting anymore.
There is a saying: "Physics advances one funeral at a time", meaning that you need to get rid of the old-timers who are preventing the field from advancing.
Wasn't that the whole point of the movie? How horrifying the uncanny valley can be. Not just M3gan herself but the entire world that the movie takes place in. James Wan is a brilliant director.
I haven't seen this movie, but I could see that. There are definitely some movies that use this quality as a part of their setting. See: Gunpowder Milkshake.
Personally it's poor writing & bad camera work that bugs me more than anything. Sure this digital style is apparent, but I guess it doesn't fundamentally ruin a movie for me (whereas the other two factors do).
> Everyone is lit perfectly and filmed digitally on raw and tweaked to perfection. It makes everything have a fake feeling to it.
It wasn’t until I started watching “Stuntmen React” videos on Corridor’s channel recently that I realized people are even still doing stunts. The footage in modern movies is so massaged and strangely well lit that it feels fake to the point that it might as well be the CG as I’d assumed it was.
You’re not getting the visual benefit to justify injuring these poor stunt people when it doesn’t even feel realistic at the end. Needs some grit, some imperfections. The things that made Jackie Chan’s early work so amazing.
They just don’t feel like they’re really happening, so there is no weight to it.
> You’re not getting the benefit of injuring these poor stuntmen
As someone who knows a lot of stunt people, I'd just like to point out that the benefit is not in injuring them, it's in the effortless action scenes. But yes, a byproduct of the action are the injuries.
We really need to credit stunt people better. They're the most likely to actually lose their lives during filming. Hence why some are pushing for a stunt category in the Oscars: https://movieweb.com/best-stunt-oscars-needed-why/
I was watching through the Marvel movies just to catch up. Mostly I was surfing on my phone at the same time, as they're not that engaging. At one point during Winter Soldier though I caught myself having put down the phone and was sitting on the edge of my sofa, fully engaged in the action scene with an elevated heart rate.
Some time later I saw the "Stuntmen React" where they went through the scene, and it suddenly all made sense: that whole scene, the chase/fight on the overpass, was almost entirely practical stunts.
With all the special effects and video post-processing in photos and films today, it's no surprise they look unnatural. I think we're seeing in film and TV the same phenomenon that's already big in photography and music: digitally filtered and synthesized media feels unnatural. Well, duh. It is.
Video and audio are being compressed to exclude noise and 'purify' the primary signal to make the product pop (something that's also making TV ads feel very unnatural today). This isn't new, of course. For decades, post-production and direction of film and TV have been simplified and their primary signals boosted for more impact. The cost, IMHO, is that acting has taken a back seat to visuals and background audio, creating the zombified chess-piece actors of today, who are also a big part of this “uncanny valley” across all forms of video. The result is that the expressive subtleties present in our remaining great actors (now age 60+), like Judi Dench or Daniel Day-Lewis, are absent in nearly all actors under 40, unless they got their start on stage or outside American media, where the practice of overproduction has not been as rampant.
I think what we're seeing is simply the viral spread of special effects, digitization, and overproduction into all of media, with video simply suffering its effects last.
This is a great thread and a starting point for interesting discussions.
It’s quite interesting that Japanese anime is adding a more movie-like look and feel through complex scenes and backgrounds (see, e.g., the Dec 2022 series Chainsaw Man), whereas Netflix goes the other way. It’s said that anime is trying to gain mainstream recognition in overseas markets this way, although it runs against the relatively low budget of anime and the way it is made.
I think Netflix vs. traditional movies also has something to do with the expected screen size: TV vs. cinema. It’s about information density. There are media specifically tuned to be watched on smaller screens, i.e., phones.
I used to advocate simplicity in artmaking, largely influenced by East Asian art styles and by the tech advertising and interface design of the early 2010s. But as I grew up, I started to appreciate complexity, texture, and imperfection, especially after I moved to NYC. So much so that an empty-looking room now makes me uncomfortable: not organic, not homey. I wonder if this has something to do with age and audience as well.
Netflix shows are tuned to the younger generation, versus traditional movies tuned to the older. And there is also Netflix’s tuning to overseas markets that are unfamiliar to American audiences.
There is much to be discussed under this topic, but here are some sporadic thoughts of mine.
That’s an interesting point about phones being the viewing space, not TVs, which so many people commenting seem to be assuming. That would obviously affect production choices, and it wasn’t addressed in the Substack post, as far as I recall.
I want to talk about the blandness of the acting in Netflix shows. As part of the young-adultification of everything, they tend to have this po-faced life-and-death fake gravitas. They're full of scenes like "a vampire coven takes a vote on whether to excommunicate, kill, and devour one of their own because she fell in love with a human". But the characters seem like children or young teenagers, despite being played by, and even representing, adults. It's like watching a precocious, somewhat psychotic 10-year-old making her Barbies plot to murder each other.
Contrast that with, say, Beverly Hills, 90210, which aged remarkably well for a 30+ year old series. It too gets rather self-serious at times, but the acting talent was really good and well-directed and they play their parts with a sort of relaxed, open California swagger that makes them seem honest and likable. 90210 has some interesting cinematic choices as well: the colors appear saturated but both the video and audio are soft and fuzzy, for what I guess is sort of a wistful look that seems less like how things actually happened and more like how the main cast's older selves remember it happening.
Some exceptions exist, of course. Stranger Things is incredibly well acted and goes for 80s verisimilitude, right down to authentic Tandy props and making the bedrooms look like they came out of E.T. with the lighting and set decoration. I Am Not Okay With This has a real honest feeling to it as well. They shot on location in Brownsville, PA, and really tried to capture the feel of the town in the background.
I decided not to watch their Sandman adaptation despite generally positive reviews, because the unmistakable and glaring Netflix aesthetic of the trailers soured me on the prospect. It's like they ruined a graphic novel with an AI style transfer "in the style of Netflix".
At first I thought this was about how Netflix arranges movies for you to select, where each movie repeats down a horizontal line and you are forever scrolling right in cardinally equivalent but combinatorially different sets of movies. :D
I don't really care about technical specs of cameras or film or whatever, but I think the author is largely just wrong. At least on their comparison between Moonshot and When Harry Met Sally. Those are just different films. The Moonshot still is CLEARLY using color with great intention. It's not a realistic scene. It's a carefully designed shot. The characters, the coffee, and the table are yellow. The background is green. That's interesting. It makes the scene feel very small and intimate. The Harry diner scene is cluttered and chaotic. You have a bunch of background characters that add to the scene by looking at the main characters. You get a lot from this still. That's also interesting.
> The Moonshot still is CLEARLY using color with great intention. It's not a realistic scene. It's a carefully designed shot. The characters, the coffee, and the table are yellow. The background is green. That's interesting.
It's just lazy. There's nothing interesting about it. Because every movie has a digital color pass applied now, they feel obligated to do something with the color, but they have no idea what, so they just do the same thing over and over.
We've come full circle: 'black and white' -> 'color' -> 'blue and orange'.
Modern movies are too dark and badly lacking in color. All that to cheapen production, I suppose.
> But every time I watch something made before 2000, it looks so beautiful to me—not otherworldly or majestic, but beautiful in the way the world around me is beautiful.
Counterpoint: Refn’s “Copenhagen Cowboy”, recently released on Netflix, is a visually stunning masterpiece of neo-noir aesthetics.
However you might feel about Refn’s work - he is kind of polarizing - bland would be the last word you’d use to describe it.
So I think the problem is less about formats and technology and more about artistic vision, skill, and execution. There is a lot more content being produced now, and thus a dilution of top talent.
This week I read an article in Portuguese about how the largest Brazilian TV network is having difficulty creating new soap operas.
On social media I saw one interesting argument: the current generation producing the shows grew up watching GoT and lost touch with how soap operas used to look. Until recently, Brazil used to sell soap operas to other countries ("The Clone" was a big hit). By creating content that mimicked American TV shows, they lost the uniqueness of their content.
Maybe something like that is happening at Netflix as well? People watched GoT/CSI in college, and now we have blandness everywhere.
[+] [-] kzrdude|3 years ago|reply
There's less people in the background as extras, less people making background scenes or setting up. It can mostly filled in by production technology if necessary.
Some movies just seem so empty, and they look "cheap" if you see it in this angle. Some big hollywood production like for example Passengers where there are barely more than three actors visible in the whole film. I've noticed there's a lot of these hollow/empty productions with very few actual actors visible, and they are rather lifeless.
[+] [-] orbital-decay|3 years ago|reply
It's just the industry converged onto the same recipe book - some of that is cost cutting, some is just massive amounts of A/B testing throughout the years.
Demand leads to optimization, which in turn leads to convergence and less varied supply. Which is kind of paradoxical, considering the low barrier and creative freedom available with today's tools. And yes, it really started to happen somewhere in early 2000s, not just with movies and the looks but with everything related to visual entertainment, from games to illustrations.
[+] [-] gigaparsec|3 years ago|reply
It looks like a cheap gaussian blur filter applied in post production and adds to the fakeness. This is probably due to the optics in some of the latest Red digital cameras, but it's still not convincing. Kinda like 48 vs 24 FPS.
[+] [-] mort96|3 years ago|reply
A higher refresh rate is simply an objective improvement. I don't care that you associate it with bad soap operas.
[+] [-] bayjorix|3 years ago|reply
The classic Super35 frame size is more akin to APS-C in digital photography, whereas the new top tier cameras use full frame sensors, which are 1.5x larger in diameter.
The larger the sensor, the more bokeh you get, with a shallower depth of field.
Modern tech for focus pulling allows for extra shallow depth of field even if the actors and the camera are moving. Something that would have been mostly out of focus footage 30 years ago can be easily shot with a small crew.
Large aperture photography lenses and large sensor also means much better low-light capabilities. Which means you can shoot with less studio lights, in a much lighter setup with less crew, less planning etc.
Blurry backgrounds also allow for cheaper sets.
Which means the cheaper the production budget is, the more likely they'll go with shallow depth-of-field shots.
And we the viewers will inevitably connect the cheapness of production with these visual cues.
In contrast, when an older movie had shallow depth of field scenes, those were expensive shots that had to be carefully planned, and painstakingly executed with "flawed" optics which gave them a lot of character.
[+] [-] DoneWithAllThat|3 years ago|reply
[+] [-] entropie|3 years ago|reply
Most TV camera setups were about lighting and the needed amount.
Since a few years cameras are so good, we have almost real blacks and real whites sometimes even in the same frame and its naturally filmed and does not look like total shit. We have crazy aperture sizes and we use them and there is the often coitized gaussian/blur backgrounds. We have very sharp images even with very bad lightning conditions so some movies are very dark.
I think like the younger ones are already very used to it and wonder about "how odd old films look".
[+] [-] WalterBright|3 years ago|reply
Case in point: House of the Dragon. The beautiful sets and costumes are all hard to enjoy when it's all drab blue and orange. Take a look at the picture:
https://www.theguardian.com/tv-and-radio/2022/sep/15/house-o...
This happens in movie after movie after movie.
[+] [-] tuetuopay|3 years ago|reply
That late 80's film is set up at daylight in a brightly lit joint. So that cameras could actually record it. The other scene is in a dimly lit dining room with directly visible light sourcee (that don't saturate). You can bet that 80's film could not have captured this kind of low-light high-contrast scene.
[+] [-] tridao|3 years ago|reply
[+] [-] pier25|3 years ago|reply
But another important factor was that movies couldn't be too dark because tvs couldn't reproduce them properly.
We all now have devices that can display the full SDR and HDR is becoming more common.
Filmmakers were forced to shoot and deliver bright images. So now the pendulum has gone to the other side. It's probably going to stabilize at some point.
[+] [-] unknown|3 years ago|reply
[deleted]
[+] [-] PKop|3 years ago|reply
There is certainly more versatility in what you can create in post production with digital but the base look of what is captured is more beautiful on film, and this is the same with photographs. Look at high quality old images captured on film, they blow most digital images out of the water. Much higher dynamic range and depth of colors and contrast.
This is why Tarantino, Nolan, and P.T. Anderson still only use film. I think this is most glaring with the mid to low budget range of films that can't make up some of the ground with production techniques and post. Old 80's and 90's mid range films on film look so much better especially in night scenes.
[+] [-] kqr|3 years ago|reply
Quality photography paper has about 10 stops of dynamic range (which is why the zone system for composition and printing goes up to X). While the negatives themselves may have slightly more dynamic range, there's also the clarkvision page[1] indicating they, in fact, have less dynamic range. This makes sense: you have less area to work with, and need to get something done with bigger grains because of light and time constraints.
You may well think it looks better, but more information it captures not. More noise, yes. Bleeding artifacts you like, sure. More information, no.
[1]: https://clarkvision.com/articles/dynamicrange2/
[+] [-] royjacobs|3 years ago|reply
Additionally, film does not have more dynamic range than digital. Take the latest sensors used by the Arri Alexa cameras for instance, they have a range of 17 stops. This is better than regular 35mm film. [1]
[1] https://www.arri.com/en/camera-systems/cameras/alexa-35#2730...
[+] [-] tristor|3 years ago|reply
But in both cases, if you get your photo in the viewfinder before you get in the camera, digital always comes out better in the end. There’s simply so much more information captured, and you can in fact get more stops of DR with digital than with film.
Edit: To add to the above, I use a light meter and check DR in post processing. Film is usually good for 8-9 stops of DR (depending on film type) and is better in overexposed scenes with bright lights. Digital generally gets 12-14 stops of DR and handles underexposed scenes better at constant ISO.
Another factor is that in dark scenes due to finer grains, digital is capable of running higher ISO in producing recoverably usable photos with denoise filters (Nik’s Tools). On my current rig, ISO 3200 is usable, on film typically ISO800 is the max I would run, there are a few good ISO 1200 films though, but they are black and white. ISO isn’t directly analogous between film and digital, but it’s close enough for the purpose of this comparison.
[+] [-] antiterra|3 years ago|reply
Roma is as beautiful as anything shot on film; Amelie, from 2001, is as teal-orange medium-shot photo filter as anything on Netflix. The Crown is often beautiful because of impressive practical lighting effects.
[+] [-] masurus|3 years ago|reply
The problem is, just like with CGI, which has gotten pretty good at modelling light transport, it feels off because it's too clean and clear. Digital is too good at capturing reality. Film introduces a bunch of artefacts and limitations that 'look' nicer than accurately captured reality. In part this might be because most of us grew up watching cinema styled that way; in part because it leaves more room for the brain to compensate for those limitations with imagination.
With the advent of easily available stylisation AI trained on film, we should see digital close that gap any time now.
[+] [-] KaiserPro|3 years ago|reply
https://www.imdb.com/title/tt11271038/fullcredits/?ref_=tt_c... It might be shot on film, but it's graded digitally, so the argument doesn't really apply here. The colours you see are a choice made in the DI stage.
> film produces a much better overall look than digital.
No, not anymore; that hasn't been the case since about 2010. (And ignore RED cameras, they are terrible at colour.) All the (big) films you've seen since about 2007, and all films since 2012, have been graded digitally (apart from maybe a Mike Lynch film). This means the colours are not because it's film, but because of a choice made by the Director of Photography, or producer, or director. The thing you think is a filmic look is mostly digital. Yes, that includes films that were shot on film, like the Nolan Batmans.
Yes, that means even the film you saw at the cinema on a film projector was graded digitally; it was just lasered out to film and printed for distribution. The other bit to remember is that new prints look much nicer than old prints. Old prints look like shite: dirty, scratched and sometimes washed out.
"Real" film shot now is nothing like the grainy stuff of 25 years ago; it has a far higher dynamic range and much smaller film grain. It's just that digital film cameras are now even better and more consistent. This crossover happened in about 2012.
[+] [-] eschneider|3 years ago|reply
Digital is easier to work with in low light, so some productions will try to save money by cutting back there. Can't really do that with film without it being painfully obvious.
[+] [-] AstixAndBelix|3 years ago|reply
Steve Yedlin has dedicated decades of his life to ultimately prove [1] that digital and film can achieve the same exact look, save for the fact that digital is infinitely easier to deal with.
[1] https://yedlin.net/DisplayPrepDemo/index.html
[+] [-] candiddevmike|3 years ago|reply
[+] [-] iconosynclast|3 years ago|reply
More freedom in post allows more freedom to do things the audience may or may not like, but I don't think it's technical limitations holding things back anymore.
Similarly, I'm a bit dubious about the LED lighting comment by the anonymous redditor. Yes, LED color rendering used to be pretty horrible, but these days it really doesn't have to be: all the major LED manufacturers have high-end white parts for color-critical applications, and even color mixing is actually good now with the advent of phosphor-converted broad-spectrum colored LEDs.
All this stuff increases the available creative space, but at the same time it might not be easier to find the good stuff in a larger space.
[+] [-] kloch|3 years ago|reply
The sensitivity curves of Kodachrome 64, and presumably of 35mm film stock generally, much more closely match the CIE 1931 effective bandpass curves, resulting in much more accurate color.
However, most of this can easily be fixed in post with modern tools. Almost all of what you are seeing today is an artifact of the director's/cinematographer's/editor's artistic choices, not the acquisition tech.
[+] [-] atdrummond|3 years ago|reply
[+] [-] ip26|3 years ago|reply
Film vs digital seems very similar, with a greater degree of “natural processing” happening in film.
[+] [-] dmit|3 years ago|reply
To add one more data point to all the other sibling comments, Vince Gilligan (a self-described huge fan of film) talked on one of the Better Call Saul podcasts about how they did a blind test, and the consensus was that digital looked better. This was around five years ago.
[+] [-] dotancohen|3 years ago|reply
[+] [-] 323|3 years ago|reply
Or maybe they are just old and incapable of adapting anymore.
There is a saying: "Physics advances one funeral at a time". Meaning that you need to get rid of the old-timers which are preventing the field from advancing.
[+] [-] mise_en_place|3 years ago|reply
[+] [-] Clent|3 years ago|reply
Why should a movie about near future technology tie itself to scenes one recognizes today?
This edges on shaking fist at cloud.
[+] [-] KaiserPro|3 years ago|reply
[+] [-] Antrikshy|3 years ago|reply
[+] [-] ducharmdev|3 years ago|reply
[+] [-] donatj|3 years ago|reply
It wasn’t until I started watching “Stuntmen React” videos on Corridor’s channel recently that I realized people are even still doing stunts. The footage in modern movies is so massaged and strangely well lit that it feels fake, to the point that it might as well be the CG I’d assumed it was.
You’re not getting the visual benefit to justify injuring these poor stunt people when it doesn’t even feel realistic at the end. Needs some grit, some imperfections. The things that made Jackie Chan’s early work so amazing.
They just don’t feel like they’re really happening, so there is no weight to it.
[+] [-] faitswulff|3 years ago|reply
As someone who knows a lot of stunt people, I'd just like to point out that the benefit is not in injuring them, it's in the effortless action scenes. But yes, injuries are a byproduct of the action.
We really need to credit stunt people better. They're the most likely to actually lose their lives during filming. Hence why some are pushing for a stunt category in the Oscars: https://movieweb.com/best-stunt-oscars-needed-why/
[+] [-] magicalhippo|3 years ago|reply
Some time later I saw the "Stuntmen React" episode where they went through the scene, and it suddenly all made sense: that whole scene, the chase/fight on the overpass, was almost entirely practical stunts.
[+] [-] randcraw|3 years ago|reply
Video and audio are being compressed to exclude noise and 'purify' the primary signal to make the product pop (something that's also making TV ads feel very unnatural today). But this isn't new, of course. For decades, postproduction and direction of film and TV have been simplified and their primary signals boosted for more impact. The cost, IMHO, is that acting has taken a back seat to visuals and background audio, creating the zombified chesspiece actors of today, who are also a big part of this “uncanny valley” in all forms of video. The result is that the expressive subtleties present in our remaining great actors (now age 60+) like Judi Dench or Daniel Day-Lewis are absent in nearly all actors under age 40, unless they got their start on stage or outside American media, where the practice of overproduction has not been as rampant.
I think what we're seeing is simply the viral spread of special effects, digitization, and overproduction into all of media, with video simply suffering its effects last.
[+] [-] chazeon|3 years ago|reply
It’s quite interesting that Japanese anime is adding a more movie-like look and feel through complex scenes and backgrounds (see, e.g., the Dec 2022 series Chainsaw Man), whereas Netflix goes the other way around. It’s said that anime is trying to gain more mainstream recognition in overseas markets this way, although it runs against anime's relatively low budgets and the way it is created.
I think Netflix vs traditional movies has something to do with the expected screen size: TV vs cinema. It’s about information density. There are media specifically tuned to be watched on smaller screens, i.e., phones.
I used to advocate simplicity in artmaking, largely influenced by East Asian art styles and also by tech advertising and interface design of the early 2010s. But as I grew up I started to appreciate complexity, texture, imperfection, especially after I moved to NYC; so much so that an empty-looking room now makes me uncomfortable: not organic, not homey. I wonder if this has something to do with age and audience too.
Netflix is tuned to the younger generation, traditional movies to the older. And there is also Netflix's tuning to overseas markets that are unfamiliar to American audiences.
There is much to be discussed under this topic, but here are some sporadic thoughts of mine.
[+] [-] spacemadness|3 years ago|reply
It reminds me of David Lynch telling people to stop using their phones to watch films: https://www.youtube.com/watch?v=wKiIroiCvZ0
[+] [-] bitwize|3 years ago|reply
Contrast that with, say, Beverly Hills, 90210, which aged remarkably well for a 30+ year old series. It too gets rather self-serious at times, but the acting talent was really good and well-directed and they play their parts with a sort of relaxed, open California swagger that makes them seem honest and likable. 90210 has some interesting cinematic choices as well: the colors appear saturated but both the video and audio are soft and fuzzy, for what I guess is sort of a wistful look that seems less like how things actually happened and more like how the main cast's older selves remember it happening.
Some exceptions exist, of course. Stranger Things is incredibly well acted and goes for 80s verisimilitude, right down to authentic Tandy props and making the bedrooms look like they came out of E.T. with the lighting and set decoration. I Am Not Okay With This has a real honest feeling to it as well. They shot on location in Brownsville, PA, and really tried to capture the feel of the town in the background.
[+] [-] sho_hn|3 years ago|reply
[+] [-] why-el|3 years ago|reply
Things you might like:
A, B, C, D
GREAT MOVIES:
B, C, A, D
[+] [-] spywaregorilla|3 years ago|reply
I have not seen any of these films
[+] [-] munificent|3 years ago|reply
I would be much more inclined to agree with that if every fucking movie in the past twenty years wasn't also using the exact same gold-cyan two-color palette: https://tvtropes.org/pmwiki/pmwiki.php/Main/OrangeBlueContra...
It's just lazy. There's nothing interesting about it. Because every movie has a digital color pass applied now, they feel obligated to do something with the color, but they have no idea what, so they just do the same thing over and over.
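(For the curious, that two-color pass is mechanically simple, which is part of why it's everywhere. A hypothetical sketch in Python of a naive orange-teal split-tone: the function name and weights are mine for illustration, not any real grading tool's, and real DI suites work on log footage with far more control:)

```python
def grade_pixel(r, g, b, strength=0.15):
    """Naive orange-teal split-tone on one RGB pixel (channels in 0..1):
    push shadows toward teal, highlights toward orange, by luminance."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luminance
    if luma < 0.5:
        # Shadows: cut red, boost green a little and blue more (teal).
        w = (0.5 - luma) * 2 * strength
        r, g, b = r * (1 - w), g * (1 + w * 0.5), b * (1 + w)
    else:
        # Highlights: boost red and some green (orange), cut blue.
        w = (luma - 0.5) * 2 * strength
        r, g, b = r * (1 + w), g * (1 + w * 0.5), b * (1 - w)
    # Clamp back into the displayable 0..1 range.
    return tuple(min(1.0, max(0.0, c)) for c in (r, g, b))
```

Run over a whole frame, neutral grays split into cool shadows and warm skin-tone-adjacent highlights, which is roughly the look the trope page describes.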
[+] [-] betaby|3 years ago|reply
[+] [-] unknown|3 years ago|reply
[deleted]
[+] [-] smitty1e|3 years ago|reply
For a contemporary example that underscores this point, check out "All Creatures Great and Small" https://www.pbs.org/wgbh/masterpiece/shows/all-creatures-gre...
[+] [-] pfisherman|3 years ago|reply
However you might feel about Refn’s work - he is kind of polarizing - bland would be the last word you would use to describe it.
So I think the problem is less about formats and technology and more about artistic vision, skill, and execution. There is a lot more content being produced now, and thus a dilution of top talent.
[+] [-] frozenlettuce|3 years ago|reply
On social media I saw one interesting argument: the current generation producing these shows grew up watching GoT and lost touch with how soap operas used to look. Until recently, Brazil used to sell soap operas to other countries ("The Clone" was a big hit). By creating content that mimicked US TV shows, they lost the uniqueness of their content. Maybe something like that is happening on Netflix as well? People watched GoT/CSI while in college and now we have blandness everywhere.
Google translated article: https://translate.google.com/translate?sl=auto&tl=en&hl=pt-B...