The 'Toy Story' You Remember (animationobsessive.substack.com)
1098 points by ani_obsessive 22 hours ago | 316 comments




This topic is fascinating to me. The Toy Story film workflow is a perfect illustration of intentional compensation: artists pushed greens in the digital master because 35 mm film would darken and desaturate them. The aim was never neon greens on screen, it was colour calibration for a later step. Only later, when digital masters were reused without the film stage, did those compensating choices start to look like creative ones.

I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising that choice only exists because we specifically wanted neutral greens in the final output. The scaffold becomes the building.

In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.

I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?


I worked at DreamWorks Animation on the pipeline, lighting and animation tools for almost ten years. All of this information is captured in our pipeline process tools, although I am sure there are edits and modifications that are done that escape documentation. We were able to pull complete shows out of deep storage, render scenes using the toolchain that produced them and produce the same output. If the renders weren't reproducible, madness would ensue.

Even with complete attention to detail, the final renders would be color graded using Flame, or Inferno, or some other tool and all of those edits would also be stored and reproducible in the pipeline.

Pixar must have a very similar system and maybe a Pixar engineer can comment. My somewhat educated assumption is that these DVD releases were created outside of the Pixar toolchain by grabbing some version of a render that was never intended as a direct to digital release. This may have happened as a result of ignorance, indifference, a lack of a proper budget or some other extenuating circumstance. It isn't likely John Lasseter or some other Pixar creative really wanted the final output to look like this.


Amazing. Your final point seems to make the most sense - the problem wasn't with the original team itself.

There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.

I first heard about this when reading an article or book about Jimi Hendrix making choices based on what the output sounded like on AM radio. Contrast that with the contemporary recordings of The Beatles, in which George Martin was oriented toward what sounded best in the studio and home hi-fi (which was pretty amazing if you could afford decent German and Japanese components).

Even today, after digital transfers and remasters and high-end speakers and headphones, Hendrix’s late 60s studio recordings don’t hold a candle to anything the Beatles did from Revolver on.


And now we have the Loudness War where the songs are so highly compressed that there is no dynamic range. Because of this, I have to reduce the volume so it isn't painful to listen to. And this makes what should have been a live recording with interesting sound into background noise. Example:

https://www.youtube.com/watch?v=3Gmex_4hreQ

If you want a recent-ish album to listen to that has good sound, try Daft Punk's Random Access Memories (which won the Best Engineered Album Grammy award in 2014). Or anything engineered by Alan Parsons (he's in this list many times)

https://en.wikipedia.org/wiki/Grammy_Award_for_Best_Engineer...


> now

Is this still a problem? Your example video is from nearly twenty years ago, RAM is over a decade old. I think the advent of streaming (and perhaps lessons learned) has made this less of a problem. I can't remember hearing any recent examples (but I also don't listen to a lot of music that might fall victim to the practice); the Wikipedia article lacks any examples from the last decade https://en.wikipedia.org/wiki/Loudness_war

Thankfully there have been some remasters that have undone the damage. Three Cheers for Sweet Revenge and Absolution come to mind.


Certified Audio Engineer here. The Loudness Wars more or less ended over the last decade or so due to music streaming services using loudness normalization (they effectively measure what each recording's true average volume is and adjust them all up or down on an invisible volume knob to have the same average)

Because of this it generally makes more sense these days to just make your music have an appropriate dynamic range for the content/intended usage. Some stuff still gets slammed with compression/limiters, but it's mostly club music from what I can tell.
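If it helps to see the mechanics of that "invisible volume knob", here's a minimal sketch in Python. It is not any service's actual algorithm: real loudness normalization measures LUFS per ITU-R BS.1770 (frequency weighting plus gating); plain RMS stands in for it here, and the -14 dBFS target is just an illustrative figure.

    import numpy as np

    TARGET_DBFS = -14.0  # illustrative target, roughly where streaming services sit

    def normalize_loudness(samples: np.ndarray) -> np.ndarray:
        """Scale a track (floats in [-1, 1]) so its average level hits TARGET_DBFS.

        RMS is a stand-in for LUFS here; the point is that the whole track
        gets one gain applied, so relative dynamics are untouched."""
        rms = np.sqrt(np.mean(samples ** 2))
        current_dbfs = 20 * np.log10(max(rms, 1e-12))
        gain_db = TARGET_DBFS - current_dbfs
        return samples * (10 ** (gain_db / 20))

The key consequence is the one described above: slamming a master into a limiter no longer makes it play louder, because the normalizer just turns it back down.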


This goes along with what I saw growing up. You had the retail mastering (with RIAA curve for LP, etc.) and then the separate radio edit which had the compression that the stations wanted - so they sounded louder and wouldn't have too much bass/treble. And also wouldn't distort on the leased line to the transmitter site.

And of course it would have all the dirty words removed or changed. Like Steve Miller Band's "funky kicks going down in the city" in Jet Airliner

I still don't know if the compression in the Loudness War was because of esthetics, or because of the studios wanting to save money and only pay for the radio edit. Possibly both - reduced production costs and not having to pay big-name engineers. "My sister's cousin has this plug-in for his laptop and all you do is click a button"...


> I still don't know if the compression in the Loudness War was because of esthetics,

Upping the gain increases the relative "oomph" of the bass at the cost of some treble, right?

As a 90s kid with a bumping system in my Honda, I can confidently say we were all about that bass long before Meghan Trainor came around. Everyone had the CD they used to demo their system.

Because of that, I think the loudness wars were driven by consumer tastes more than people will admit (because then we'd have to admit we all had poor taste). Young people really loved music with way too much bass. I remember my mom (a talented musician) complaining that my taste in music was all bass.

Of course, hip hop and rap in the 90s were really bass heavy, but so was a lot of rock music. RHCP, Korn, Limp Bizkit, and Slipknot come to my mind as 90s rock bands that had tons of bass in their music.

Freak on a Leash in particular is a song that I feel like doesn't "translate" well to modern sound system setups. Listening to it on a setup with a massive subwoofer just hits different.


> Korn

It wasn't the bass, but rather the guitar.

The bass player tuned the strings down a full step to be quite loose, and turned the treble up which gave it this really clicky tone that sounded like a bunch of tictacs being thrown down an empty concrete stairwell.

He wanted it to be percussive to cut through the monster lows of the guitar.


Music, as tracked by Billboard, cross genre, is as loud as ever. Here’s a survey of Billboard music:

https://www.izotope.com/en/learn/mastering-trends?srsltid=Af...

I have an Audio Developer Conference talk about this topic if you care to follow the history of it. I have softened my stance a bit on the criticism of the 90’s (yeah, people were using lookahead limiting over exuberantly because of its newness) but the meat of the talk may be of interest anyway.

https://www.youtube.com/watch?v=0Hj7PYid_tE


That makes sense, thanks for the reply!

It's still a problem, although less consistently a problem than it used to be for the reason entropicdrifter explained.

There's a crowdsourced database of dynamic range metrics for music at:

https://dr.loudness-war.info/

You can see some 2025 releases are good but many are still loudness war victims. Even though streaming services normalize loudness, dynamic range compression will make music sound better on phone speakers, so there's still reason to do it.
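For a rough sense of the kind of number that database tracks, a crest-factor check (peak vs. average level) can be computed from any track. This is only a stand-in for the site's DR metric, which uses a more involved measurement over the loudest sections, but the trend is the same: heavily limited masters score low.

    import numpy as np

    def crest_factor_db(samples: np.ndarray) -> float:
        """Peak-to-RMS ratio in dB, a rough proxy for dynamic range.

        Brickwalled 'loudness war' masters tend to land in the single digits;
        dynamic masters land noticeably higher. Illustration only."""
        peak = np.max(np.abs(samples))
        rms = np.sqrt(np.mean(samples ** 2))
        return 20 * np.log10(peak / max(rms, 1e-12))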

IMO, music production peaked in the 80s, when essentially every mainstream release sounded good.


I was obsessed with Tales of Mystery & Imagination, I Robot, and Pyramids in the 70s. I also loved Rush, Yes, ELP, Genesis, and ELO, but while Alan Parsons' albums weren't better in an absolute musical sense, his production values were so obviously in a class of their own I still put Parsons in the same bucket as people like Trevor Horn and Quincy Jones, people who created masterpieces of record album engineering and production.

A voice on the radio sounded better with vibrato, so that’s what they did, even before recordings were made. Same when violins played.

These versions were for radio only and thought of as cheap when done in person.

Later this was recorded, and since these were the only versions recorded, later generations thought that this was how the masters of the time did things, when really they would have been booed off stage (so to speak).

It’s a bit of family history that passed this info on due to being multiple generations of playing the violin.


Interesting!

I've noticed this with lots of jazz from the 50s and 60s. Sounds amazing in mono but "lacking" in stereo.

That’s more due to mono being the dominant format at the time so the majority of time and money went to working on the mono mix. The stereo one was often an afterthought until stereo became more widespread and demand for good stereo mixes increased.

Because it's mono?

> There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.

In the modern day, this has one extremely noticeable effect: audio releases used to assume that you were going to play your music on a big, expensive stereo system, and they tried to create the illusion of the different members of the band standing in different places.

But today you listen to music on headphones, and it's very weird to have, for example, the bassline playing in one ear while the rest of the music plays in your other ear.


That's with a naive stereo split. Many would still put the bass on one side, but with binaural processing so it's also heard on the other side, just quieter and with a tiny delay.

Hard panning isn't naive. It's just a choice that presumes an audio playback environment.

If you're listening in a room with two speakers, having widely panned sounds and limited use of reverb sounds great. The room will mix the two speakers somewhat together and add a sense of space. The result sounds like a couple of instruments playing in a room, which it sort of is.

But if you're listening with a tiny speaker directly next to each ear canal, then all of that mixing and creating a sense of space must be baked into the two audio channels themselves. You have to be more judicious with panning to avoid creating an effect that couldn't possibly be heard in a real space and add some more reverb to create a spatial environment.
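For what that baking-in can look like, here's a toy crossfeed sketch in Python. It is my own illustration, not any product's actual DSP; the level and delay values are guesses, not a tuned design. Each ear gets a quieter, slightly delayed copy of the opposite channel, which roughly mimics how two room speakers reach both ears.

    import numpy as np

    def crossfeed(left, right, rate=44100, level=0.3, delay_ms=0.3):
        """Feed an attenuated, slightly delayed copy of each channel into the
        other, softening hard-panned mixes for headphone listening."""
        delay = max(1, int(rate * delay_ms / 1000.0))
        pad = np.zeros(delay)
        left_delayed = np.concatenate([pad, left])[:len(left)]
        right_delayed = np.concatenate([pad, right])[:len(right)]
        out_left = left + level * right_delayed
        out_right = right + level * left_delayed
        # Rescale so the summed channels don't clip
        peak = max(np.max(np.abs(out_left)), np.max(np.abs(out_right)), 1.0)
        return out_left / peak, out_right / peak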


Maybe I'm misunderstanding him but I think he says the music track can have hard panning, and it's the headphone playback system that should do some compensatory processing so that it sounds as if it was played on two speakers in a room.

Don't ask me how it works but I know gaming headsets try to emulate a surround setup.


Yes, these sorts of compensation features have become common on higher end headphones.

One example:

> The crossfeed feature is great for classic tracks with hard-panned mixes. It takes instruments concentrated on one channel and balances them out, creating a much more natural listening experience — like hearing the track on a full stereo system.

https://us.sennheiser-hearing.com/products/hdb-630


No, they just didn't put much time into stereo because it was new and most listeners didn't have that format. So they'd hard pan things for the novelty effect. This paradigm was over by the early 70s and they gave stereo mixes a more intentional treatment.

> decent German and Japanese components

Whoa there! Audio components were about the only thing the British still excelled at by that time.


I wasn't aware of home hi-fi but British gear for musicians was widespread when I was growing up (Marshall, Vox, etc).

I was specifically thinking of the components my father got through the Army PX in the 60s and the hi-fi gear I would see at some friends' houses in the decades that followed ... sometimes tech that never really took hold, such as reel-to-reel audio. Most of it was Japanese, and sometimes German.

I still have a pair of his 1967 Sansui speakers in the basement (one with a blown woofer, unfortunately) and a working Yamaha natural sound receiver sitting next to my desk from about a decade later.


Wharfedale (1920s) and Cambridge Audio (1960s) were there, and are still making great home hifi.

British music of the 60s and 70s was pretty great to listen to on that hifi.

The same goes for movie sound mixing, where directors like Nolan are infamous for muffled dialogue in home setups because they want the sound mixed for large, IMAX-scale theater setups.

I've always been a fan of repos that I come across with ARCHITECTURE.md files in them, but that's a pretty loose framework and some just describe the what and not the why.

Otherwise, I wish I worked at a place like Oxide that does RFDs. https://rfd.shared.oxide.computer Just a single place with artifacts of a formal process for writing shit down.

In your example, writing down "The greens are oversaturated by X% because we will lose a lot of it in the transfer process to film" goes a long way toward making people aware of the decision and why it was made. Then the "hey, actually the boosted greens look kinda nice" can prompt a "yeah, but we only did that because of the medium we were shipping on, it's wrong"


You're assuming people RTFM, which does not happen at all in my case. Documentation exists for you to link to when someone who already lost days on something finally reaches out.

Culture changes under the impact of technology, but culture also changes when people deliberately teach practices.

(Cough) Abstraction and separation of concerns.

In Toy Story's case, the digital master should have had "correct" colors, and the tweaking done in the transfer to film step. It's the responsibility of the transfer process to make sure that the colors are right.

Now, counter arguments could be that the animators needed to work with awareness of how film changes things; or that animators (in the hand-painted era) always had to adjust colors slightly.

---

I think the real issue is that Disney should know enough to tweak the colors of the digital releases to match what the artists intended.


Production methodologies for animated films have progressed massively since 1995, and Pixar may not have found the ideal process for the color grading of the digital-to-film step. Heck, they may not have color graded at all! This has been suggested. I agree that someone should know better than to just take a render and push it out as a digital release without paying attention to the result.

> In Toy Story's case, the digital master should have had "correct" colors

Could it be the case that generating each digital master required thousands of render hours?


That's an invalid argument: Digitally tweaking color when printing film has nothing to do with how long it takes to render 3d.

They had a custom built film printer and could make adjustments there.


But the compensation for film should be a cheap 2-D color filter pass, not an expensive 3-D rendering pass.

I know you're looking for something more universal, but in modern video workflows you'd apply a chain of color transformations on top of the final composited image to compensate for the display you're working with.

So I guess try separating your compensations from the original work and create a workflow that automatically applies them
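A minimal sketch of that separation, with made-up function names and placeholder numbers rather than any real tool's API: the creative grade and the device compensation live in distinct steps, so the compensation can be dropped or swapped when the output target changes.

    import numpy as np

    def creative_grade(img: np.ndarray) -> np.ndarray:
        """The look the artists actually want (placeholder: a slight warm-up)."""
        return np.clip(img * np.array([1.05, 1.0, 0.97]), 0.0, 1.0)

    def film_out_compensation(img: np.ndarray) -> np.ndarray:
        """Compensation for one specific output step (placeholder: boost greens
        because a hypothetical film stock darkens them). Scaffolding, not intent."""
        return np.clip(img * np.array([1.0, 1.15, 1.0]), 0.0, 1.0)

    def render_for(target: str, img: np.ndarray) -> np.ndarray:
        graded = creative_grade(img)
        if target == "35mm_print":
            graded = film_out_compensation(graded)  # only applied for this target
        return graded

The design point is simply that the compensation is keyed to the output target, so reusing the master for a digital release never drags the film-out tweak along with it.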


Theory: Everything is built on barely functioning ruins with each successive generation or layer mostly unaware of the proper ways to use anything produced previously. Ten steps forward and nine steps back. All progress has always been like this.

I’ve come to similar conclusions, and further realized that if you feel there’s a moment to catch your breath and finally have everything tidy and organized, that’s possibly an early sign of stagnation or decline in an area. Growth/progress is almost always urgent and overwhelming in the moment.

Do you have some concrete or specific examples of intentional compensation or purposeful scaffolding in mind (outside the topic of the article)?

Not scaffolding in the same way, but, two examples of "fetishizing accidental properties of physical artworks that the original artists might have considered undesirable degradations" are

- the fashion for unpainted marble statues and architecture

- the aesthetic of running film slightly too fast in the projector (or slightly too slow in the camera) for an old-timey effect


Isn’t the frame rate of film something like that?

The industry decided on 24 FPS as something of an average of the multiple existing company standards and it was fast enough to provide smooth motion, avoid flicker, and not use too much film ($$$).

Over time it became “the film look”. One hundred-ish years later we still record TV shows and movies in it that we want to look “good” as opposed to “fake” like a soap opera.

And it’s all happenstance. The movie industry could’ve moved to something higher at any point; nothing other than inertia stopped it. With TV being 60i it would have made plenty of sense to go to 30p for film to allow them to show it on TV better once that became a thing.

But by then it was enshrined.


Great examples. My mind jumps straight to audio:

- the pops and hiss of analog vinyl records, deliberately added by digital hip-hop artists

- electric guitar distortion pedals designed to mimic the sound of overheated tube amps or speaker cones torn from being blown out


- Audio compression was/is necessary to get good SNR on mag tape.

true - but are you implying audio engineers are now leaning into heavy compression for artistic reasons?

Not necessarily heavy (except sometimes as an effect), but some compression almost all the time for artistic reasons, yes.

Most people would barely notice it as it's waaaay more subtle than your distorted guitar example. But it's there.

Part of the likeable sound of albums made on tape is the particular combination of old-time compressors used to make sure enough level gets to the tape, plus the way tape compresses the signal again on recording by its nature.


Motion blur. 24fps. Grain. Practically everything we call cinematic

I wouldn't call it "fetishizing" though; not all of them anyway.

Motion blur happens with real vision, so anything without blur would look odd. There's cinematic exaggeration, of course.

24 FPS is indeed entirely artificial, but I wouldn't call it a fetish: if you've grown with 24 FPS movies, a higher frame rate will paradoxically look artificial! It's not a snobby thing, maybe it's an "uncanny valley" thing? To me higher frame rates (as in how The Hobbit was released) make the actors look fake, almost like automatons or puppets. I know it makes no objective sense, but at the same time it's not a fetishization. I also cannot get used to it, it doesn't go away as I get immersed in the movie (it doesn't help that The Hobbit is trash, of course, but that's a tangent).

Grain, I'd argue, is the true fetish. There's no grain in real life (unless you have a visual impairment). You forget fast about the lack of grain if you're immersed in the movie. I like grain, but it's 100% an esthetic preference, i.e. a fetish.


>Motion blur happens with real vision, so anything without blur would look odd.

You watch the video with your eyes so it's not possible to get "odd"-looking lack of blur. There's no need to add extra motion blur on top of the naturally occurring blur.


In principle, I agree.

In practice, I think the kind of blur that happens when you're looking at a physical object vs an object projected on a crisp, lit screen, with postprocessing/color grading/light meant for the screen, is different. I'm also not sure whatever is captured by a camera looks the same in motion than what you see with your eyes; in effect even the best camera is always introducing a distortion, so it has to be corrected somehow. The camera is "faking" movement, it's just that it's more convincing than a simple cartoon as a sequence of static drawings. (Note I'm speaking from intuition, I'm not making a formal claim!).

That's why (IMO) you don't need "motion blur" effects for live theater, but you do for cinema and TV shows: real physical objects and people vs whatever exists on a flat surface that emits light.


I suspect 24fps is popular because it forces the videography to be more intentional with motion. Too blurry, and it becomes incomprehensible. That, and everything staying sharp at 60fps makes it look like TikTok slop.

24fps looks a little different on a real film projector than on nearly all home screens, too. There's a little time between each frame when a full-frame black is projected (the light is blocked, that is) as the film advances (else you'd get a horrid and probably nausea-inducing smear as the film moved). This (oddly enough!) has the effect of apparently smoothing motion—though "motion smoothing" settings on e.g. modern TVs don't match that effect, unfortunately, but look like something else entirely (which one may or may not find intolerably awful).

Some of your fancier, brighter (because you lose some apparent brightness by cutting the light for fractions of a second) home digital projectors can convincingly mimic the effect, but otherwise, you'll never quite get things like 24fps panning judder down to imperceptible levels, like a real film projector can.


> (which one may or may not find intolerably awful).

"Motion smoothing" on TVs is the first thing I disable, I really hate it.


Me at every AirBnB: turn on TV "OH MY GOD WTF MY EYES ARE BLEEDING where is the settings button?" go turn off noise reduction, upscaling, motion smoothing.

I think I've seen like one out of a couple dozen where the motion smoothing was already off.


I think the "real" problem is not matching shutter speed to frame rate. With 24fps you have to make a strong choice - either the shutter speed is 1/24s or 1/48s, or any panning movement is going to look like absolute garbage. But, with 60+fps, even if your shutter speed is incredible fast, motion will still look decent, because there's enough frames being shown that the motion isn't jerky - it looks unnatural, just harder to put your finger on why (whereas 24fps at 1/1000s looks unnatural for obvious reasons - the entire picture jerks when you're panning).

The solution is 60fps at 1/60s. Panning looks pretty natural again, as does most other motion, and you get clarity for fast-moving objects. You can play around with different framerates, but imo anything more than 1/120s (180 degree shutter in film speak) will start severely degrading the watch experience.
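The relationship being used here is just shutter_angle = 360 * frame_rate * exposure_time; a quick check of the numbers above (this is only the arithmetic, not a claim about what looks best):

    def shutter_angle(fps: float, exposure_s: float) -> float:
        """360 degrees means the shutter is open for the whole frame interval."""
        return 360.0 * fps * exposure_s

    print(shutter_angle(24, 1/48))    # 180.0 -- the classic film look
    print(shutter_angle(60, 1/60))    # 360.0 -- the setting recommended above
    print(shutter_angle(60, 1/120))   # 180.0
    print(shutter_angle(24, 1/1000))  # 8.64  -- very choppy pans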

I've been doing a good bit of filming of cars at autocross and road course circuits the past two years, and I've received a number of compliments on the smoothness and clarity of the footage - "how does that video out of your dslr [note: it's a Lumix G9 mirrorless] look so good" is a common one. The answer is 60fps, 1/60s shutter, and lots of in-body and in-lens stabilization so my by-hand tracking shots aren't wildly swinging around. At 24/25/30fps everything either degrades into a blurry mess, or is too choppy to be enjoyable, but at 60fps and 1/500s or 1/1000s, it looks like a (crappy) video game.


Is getting something like this wrong why e.g. The Hobbit looked so damn weird? I didn't have a strong opinion on higher FPS films, and was even kinda excited about it, until I watched that in theaters. Not only did it have (to me, just a tiny bit of) the oft-complained-about "soap opera" effect due to the association of higher frame rates with cheap shot-on-video content—the main problem was that any time a character was moving it felt wrong, like a manually-cranked silent film playing back at inconsistent speeds. Often it looked like characters were moving at speed-walking rates when their affect and gait were calm and casual. Totally bizarre and ruined any amount of enjoyment I may have gotten out of it (other quality issues aside). That's not something I've noticed in other higher FPS content (the "soap opera" effect, yes; things looking subtly sped-up or slowed-down, no).

[EDIT] I mean, IIRC that was 48fps, not 60, so you'd think they'd get the shutter timing right, but man, something was wrong with it.


Another example: pixel art in games.

Now, don't get me wrong, I'm a fan of pixel art and retro games.

But this reminds me of when people complained that the latest Monkey Island didn't use pixel art, and Ron Gilbert had to explain the original "The Secret of Monkey Island" wasn't "a pixel art game" either, it was a "state of the art game (for that time)", and it was never his intention to make retro games.

Many classic games had pixel art by accident; it was the most feasible technology at the time.


I work in vfx, and we had a lecture from one of the art designers that worked with some formula 1 teams on the color design for cars. It was really interesting on how much work goes into making the car look "iconic" but also highlight sponsors, etc.

But for your point, back during the pal/ntsc analog days, the physical color of the cars was set so when viewed on analog broadcast, the color would be correct (very similar to film scanning).

He worked for a different team but brought in a small piece of ferrari bodywork and it was more of a day-glo red-orange than the delicious red we all think of with ferrari.


Isn't the entire point of "reinventing the wheel" to address this exact problem?

This is one of the tradeoffs of maintaining backwards compatibility and stewardship -- you are required to keep track of each "cause" of that backwards compatibility. And since the number of "causes" can quickly become enumerable, that's usually what prompts people to reinvent the wheel.

And when I say reinvent the wheel, I am NOT describing what is effectively a software port. I am talking about going back to ground zero, and building the framework from the ground up, considering ONLY the needs of the task at hand. It's the most effective way to prune these needless requirements.


enumerable -> innumerable

(opposite meaning)


Thanks, you are right. Wish I could edit it.

> (opposite meaning)

Funnily enough, e- means "out" (more fundamentally "from") and in- means "in(to)", so that's not an unexpected way to form opposite words.

But in this case, innumerable begins with a different in- meaning "not". (Compare inhabit or immiserate, though.)


Yeah, English has so many quirks. As a software dev, the "enum" type came to mind, making this one easier to spot. (shrug)

> Yeah, English has so many quirks.

Arguably true in general, but in this specific case everything I said was already true in Latin.


In some projects I work on I've added a WHY.md at the root that explains what's scaffolding and what's load bearing, essentially. I can't say it's been effective at preventing the problem you outlined, but at least it's cathartic.

It seems pretty common in software - engineers not following the spec. Another thing that happens is the pivot. You realize the scaffolding is what everyone wants and sell that instead. The scaffold becomes the building and also product.

"Cargo cult"? As in, "Looks like the genius artists at Pixar made everything extra green, so let's continue doing this, since it's surely genius."

That’s a great observation. I’m hitting the same thing… yesterday’s hacks are today’s gospel.

My solution is decision documents. I write down the business problem, background on how we got here, my recommended solution, alternative solutions with discussion about their relative strengths and weaknesses, and finally an executive summary that states the whole affirmative recommendation in half a page.

Then I send that doc to the business owners to review and critique. I meet with them and chase down ground truth. Yes it works like this NOW but what SHOULD it be?

We iterate until everyone is excited about the revision, then we implement.


There are two observations I've seen in practice with decision documents: the first is that people want to consume the bare minimum before getting started, so such docs have to be very carefully written to surface the most important decision(s) early, or otherwise call them out for quick access. This often gets lost as word count grows and becomes a metric.

The second is that excitement typically falls with each iteration, even while everyone agrees that each is better than the previous. Excitement follows more strongly from newness than rightness.


Eventually you'll run into a decision that was made for one set of reasons but succeeded for completely different reasons. A decision document can't help there; it can only tell you why the decision was made.

That is the nature of evolutionary processes and it's the reason people (and animals; you can find plenty of work on e.g. "superstition in chickens") are reluctant to change working systems.


Chesterton’s Fence is a related notion.

Aha! I used to work in film and was very close to the film scanning system.

When you scan in a film you need to dust bust it, and generally clean it up (because there are physical scars on the film from going through the projector. There's also a shit ton of dust that needs to be physically or digitally removed, ie "busted")

Ideally you'd use a non-real time scanner like this: https://www.filmlight.ltd.uk/products/northlight/overview_nl... which will collect both colour and infrared. This can help automate dust and scratch removal.
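As a rough illustration of how that infrared channel gets used (my own sketch with OpenCV, not the Northlight's actual pipeline): dust and scratches block infrared as well as visible light, so the IR channel works as a defect mask, and the flagged pixels are filled in from their surroundings.

    import cv2
    import numpy as np

    def ir_dustbust(rgb: np.ndarray, ir: np.ndarray) -> np.ndarray:
        """rgb: 8-bit colour scan; ir: 8-bit infrared channel of the same frame.

        Clean film is nearly transparent to IR, so dark IR pixels mark dust or
        scratches. Those pixels are inpainted from neighbouring image data."""
        # Threshold the IR channel: anything well below the norm is a defect.
        _, mask = cv2.threshold(ir, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        mask = cv2.dilate(mask, np.ones((3, 3), np.uint8))  # cover defect edges
        return cv2.inpaint(rgb, mask, 3, cv2.INPAINT_TELEA)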

If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster, but has less time to dustbust and properly register the film (so it'll warp more)

However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading. Ie, a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.

the mechanics are written out here: https://www.secretbatcave.co.uk/film/digital-intermediary/


How much of the colour change is also dependent on the film printer and also film scanner/telecine?

It just seems like there’s a lot of variability in each step to end up with an unintended colour that will be taken as the artist’s intent.


> is also dependent on the film printer

The printers deffo make a difference to colour, but I came from VFX world where we put a macbeth chart in for each shot so we could adjust the colour afterwards. (https://en.wikipedia.org/wiki/ColorChecker) We'd have a whole team working on making sure that colour was accurate.

The scanners we used (northlight) were calibrated a lot so my understanding is that if you scanned the same film twice it was meant to be pixel perfect (We did rescans for various reasons and it supposedly matched up enough to do effects work. but that might have been proxies, ie low resolution scans that were done for speed)

Also the printers should, if they are good, match it properly; that's what you're paying them for. I know that we did have a person that calibrated film projectors for colour, but I never asked them _how_ they did it.

For Toy Story it's a bit harder because you are digitising a whole finished movie; you don't have the colour chart in every shot to keep the colour consistent. I know for adverts the telecine people did loads of fiddling to make the colour consistent, but I assumed that was because the spirit 4k was a bit shit.

I never dealt with actual finished prints, because the colourist/DI people sent the finished grade off to someone like Deluxe to print out


Thank you! I would love to know the process of the colour guy. Like how much of it is the same as a chef tasting a dish and realizing on the spot that it needs x ingredient.

Any digitizing is done before color grading.

All steps before try to not affect the color and keep as much dynamic range as possible to give as much leeway as possible for the colorist.

Realistically, for Pixar and Disney (not people with limited funds, say), the color grade is much much more relevant to the final color than the specifics of digitizing.


Doesn't wet scanning "automatically" get rid of the dust and scratches issue?

> However! that doesnt affect the colour.

That has been something I've wondered about since seeing frame comparisons of (probably) telecine'ed prints of The Matrix vs. the myriad home video releases.


I'm a colorist and it absolutely does affect color. Every telecine is different and will create a different looking scan. Telecine operators will do a one light pass to try and compensate but any scan needs to be adjusted to achieve what the artist's original vision was.

> Every telecine is different and will create a different looking scan.

I mean they should be calibrated, so they have a different feel, but they shouldn't be wildly different like the screen shots.

I know the spirit operators did magic, but they were in the advertising team, and I was in film so I was never allowed to visit the sexy telecine room.


I was going to mention the Noodle video on that

https://www.youtube.com/watch?v=lPU-kXEhSgk

TL;DW, different physical rolls of film sent to different movie theaters can have slightly different coloring, if they were done by different people or different companies or even if someone just did their job differently that day. Film color was not an exact science and not always perfectly repeatable, and depended on chemistry.


How did you dust bust it? Wipe it by hand with a microfiber cloth or something?

In optics & film usually blowing air is employed, as wiping runs the risk of further scratches in the case of an abrasive particle (e.g. sand)

There are handheld tools (google hand blower bulb), but I would imagine film scanning uses something less manual


> There's also a shit ton of dust that needs to be physically or digitally removed, ie "busted"

Is that because you're just leaving the film out in a big pile, or because it decays rapidly?

I would have expected film to be stored in containers.


It's normally stored in sealed boxes, but every time it's taken out to be projected the whole reel is unspooled and exposed to the environment. Most projectors are pretty good at not blowing unfiltered air on the film, but there is a surprising amount of dust that is just floating in the air.

The room that the scanners used to be in was temperature and dust controlled, and everyone was supposed to wear dust jackets when entering.


Someone correct me if I'm wrong, but I believe it builds a static charge as it runs through the projector and attracts dust. I say this because I remember holding my hand near moving film in our (home) movie projector, and as a kid enjoying feeling the hairs on my arm standing up from the static. Maybe professional gear protects against that somehow, but if not that'd be why.

Yep, most film (also photo) attracts dust like a magnet. Kodak made a Static Eliminator to mitigate that with high voltage to an extent:

https://mcnygenealogy.com/book/kodak/static-eliminator.pdf


Film is only 35 mm or 70 mm wide, and the picture itself is slightly smaller than that. Even a tiny amount of dust adds a lot of noise.

There's a similar issue with retro video games and emulators: the screens on the original devices often had low color saturation, so the RGB data in those games were very saturated to compensate. Then people took the ROMs to use in emulators with modern screens, and the colors are over-saturated or just off. That's why you often see screenshots of retro games with ridiculously bright colors. Thankfully now many emulators implement filters to reproduce colors closer to the original look.

Some examples:

https://www.reddit.com/r/Gameboy/comments/bvqaec/why_and_how...

https://www.youtube.com/watch?v=yA-aQMUXKPM
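For the correction filters mentioned above, here's a simplified take on what they do (an assumed form, not any specific emulator's shader, and the constants are illustrative guesses rather than measurements of a real screen): pull the saturation and contrast back down toward what the original LCD could actually show.

    import numpy as np

    def old_lcd_filter(rgb: np.ndarray, saturation=0.7, lift=0.1, gamma=1.1):
        """Approximate a washed-out handheld LCD: desaturate toward luma,
        raise the black level slightly, and soften contrast.

        rgb is float in [0, 1], shape (H, W, 3)."""
        luma = rgb @ np.array([0.299, 0.587, 0.114])
        desat = luma[..., None] + saturation * (rgb - luma[..., None])
        lifted = lift + (1.0 - lift) * desat
        return np.clip(lifted ** gamma, 0.0, 1.0)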


With the GBA, the original GBA screen and the first gen GBA SP had very washed out colors and not saturated at all. The Mario ports to the GBA looked doubly washed out since they desaturated their colors and were shown on a desaturated screen. I've heard that the real reason the colors were desaturated was because the first GBA model didn't have a backlight so the colors were lightened to be more visible, but I'm not quite sure that's the case. Lots of other games didn't do that.

And with the second version of the GBA SP and the GB Micro, colors were very saturated. Particularly on the SP. If anything, cranking up the saturation on an emulator would get you closer to how things looked on those models, while heavily desaturating would get you closer to the look on earlier models.


> With the GBA, the original GBA screen and the first gen GBA SP had very washed out colors and not saturated at all. The Mario ports to the GBA looked doubly washed out since they desaturated their colors and were shown on a desaturated screen. I've heard that the real reason the colors were desaturated was because the first GBA model didn't have a backlight so the colors were lightened to be more visible,

That's certainly the case. The super low screen brightness of the first GBA was a major problem, because you often literally couldn't see things properly under less than perfect ambient light. So compensating for low brightness was more important than compensating for low color saturation, which is merely an aesthetic issue.


The most egregious example is old CGA games that were written to work on composite monitors. Without the composite display, they appear monochrome or cyan and magenta.

https://user-images.githubusercontent.com/7229541/215890834-...

It blew my mind when I finally learnt this, as I spent years of my childhood playing games that looked like the examples on the left, not realising the colours were due to the RGB monitor I had.


Oh you’re blowing my mind right now, played lots of CGA games with neon colours as a kid. What did they look like on a composite monitor?

Also, are you able to tell me the name of the game in the second row in that screenshot?


Ah yes, we often get folks in the nesdev community bickering over which "NES Palette" (sourced from their favorite emulator) is the "best" one. The reality is extraordinarily complicated and I'm barely qualified to explain it:

https://www.nesdev.org/wiki/PPU_palettes#2C02

In addition to CRTs having variable properties, it turns out a lot of consoles (understandably!) cheat a little bit when generating a composite signal. The PPU's voltages are slightly out of spec, its timing is weird to work around a color artifact issue, and it generates a square wave for the chroma carrier rather than an ideal sine wave, which produces even more fun problems near the edges. So we've got all of that going on, and then the varying properties of how each TV chooses to interpret the signal. Then we throw electrons at phosphors and the pesky real world and human perception gets involved... it's a real mess!
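For a feel of why no single "NES palette" exists, here's the grossly simplified core of decoding one palette entry: the PPU emits a luma level plus a chroma phase, and the TV turns that into RGB via the NTSC YIQ matrix. Everything this sketch leaves out (the square-wave chroma, the out-of-spec voltages, each TV's own decoder quirks) is exactly where emulator palettes diverge, so treat it as an illustration only.

    import math

    # Textbook NTSC YIQ -> RGB matrix; real TVs each deviated from it.
    def yiq_to_rgb(y, i, q):
        r = y + 0.956 * i + 0.621 * q
        g = y - 0.272 * i - 0.647 * q
        b = y - 1.106 * i + 1.703 * q
        return tuple(min(max(v, 0.0), 1.0) for v in (r, g, b))

    def nes_color(hue: int, luma: float, saturation: float = 0.25):
        """hue 1-12 picks a chroma phase; luma and saturation are normalized
        levels. Illustrative only: the real 2C02 outputs a square-wave chroma
        with odd timing, so this is not an accurate palette generator."""
        phase = 2 * math.pi * (hue - 1) / 12
        return yiq_to_rgb(luma,
                          saturation * math.cos(phase),
                          saturation * math.sin(phase))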


Final Fantasy Tactics on Game Boy Advance had a color mode for television.

https://www.youtube.com/shorts/F29nlIz_tWo


Nice, and two LCD modes to adapt to different GBA screens! (presumably the GBA and GBA SP first model, vs the GBA SP second model with backlight)

https://www.youtube.com/watch?v=2sxKJeYSBmI

This video is related to that issue


This is shockingly almost exactly the same conversation that goes on in the retrogaming community regarding pre-HD consoles. "The artists meant for this art to be viewed on a CRT".

The challenge is that everybody's memory is different, and sometimes those memories are "I wish the graphics were rock sharp without the artifacts of the CRT". Other times our memories are of the crappy TV we were given as kids that was on its last legs and went black & white and flickered a lot.

The reality is that no matter what the intentions of the original animation teams were, the pipeline of artwork through film transfer to projection to reflection to the viewer's own eyeballs and brain has enough variety to it that it's simply too variable -- and too personal -- to really say what is correct.

Anecdote: one of the local theaters I grew up with was extremely poorly maintained, had a patched rip on one of the several dirty screens, and had projectors that would barely get through an hour of film without needing a "bump" from the projectionist (allowing the audience to go out and get more refreshments halfway through most films). No amount of intentionality by the production companies of the many films I saw there could have accounted for any of that. But I saw many of my favorite movies there.

I've settled on the opinion that these things are like wine. A good wine is the one you enjoy. I have preferences for these things, but they sometimes change, and other people are allowed to enjoy things in their own way.


I have always felt certain media just looks better (to my own personal tastes) on VHS and on CRTs. I know that technically it isn't the highest possible definition or quality and that both have significant drawbacks in terms of fidelity or whatever. But my taste likes what it likes. Just like how some people think vinyl records can sound more appealing and warmer than the equivalent digital media, even though the digital version has a higher bitrate and other advantages.

I do in fact still have Toy Story on VHS and recently watched a bit of it with my toddler. And while I'm sure the Blu-ray or streamed version is higher resolution, wide screen, and otherwise carries more overall video and audio data than our tape I personally got a bit of extra joy out of watching the tape version on our old TV.

I never considered the color differences pointed out in the article here, and I'm not sure how they appear on home VHS vs on 35mm. Maybe that is a small part of what makes the tape more appealing to me although I don't think it's the full reason. Some feelings are difficult to put into words. Tapes on a full aspect ratio CRT just give me a certain feeling or have a specific character that I still love to this day.


Most 8/16 bit consoles I like the "clean RGB into a much better CRT than most people owned at the time" scanline look. NES has gotta be dirty composite into an average consumer CRT though otherwise it just aint right (and I didn't even grow up with a NES).

It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical... Besides whatever the provenance of these prints is (this gets complicated), the reality is that these were also made to look as good as they could on typical movie theater projector systems in the 90s. These bulbs were hot and bright and there were many other considerations around what the final picture would look like on the screen. So yeah, if you digitize 35mm film today, it will look different, and different from how it's ever been displayed in a movie theater.

Agreed. It's a fine article but leaves half the story on the table. It is supposedly comparing what these movies looked like in the theater to the modern streaming and bluray versions, but is actually comparing what a film scan (scanner settings unspecified) projected on a TV (or other unspecified screen) looks like compared to the digital versions on (presumably) the same screen. And then we can ask: how were the comparison images captured, rendered to jpeg for the web, before we the readers view them on our own screens? I'm not arguing Catmull and company didn't do a great job of rendering to film, but this comparison doesn't necessarily tell us anything.

Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.

Ps - and don't get me started on how my 60-year old eyes see color now compared to what they perceived when I saw this in the theater


It’s an interesting and valid point that the projectors from the time would mean current scans of 35mm will be different too. However, taking for example the Aladdin screenshot in particular, the sky is COMPLETELY the wrong colour in the modern digital edition, so it seems to me at least that these 35mm scans whilst not perfect to the 90’s are closer to correct than their digital counterparts.

And as someone who is part of those conservation communities that scan 35mm with donations to keep the existing look, a lot of the people doing those projects are aware of this. They do some color adjustment to compensate for print fading, for the type of bulbs that were used in movie theatres back then (using a LUT), etc...
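For what "using a LUT" means concretely, a minimal 1D-LUT application is just a per-channel lookup with interpolation; real restoration work typically uses 3D LUTs inside a grading tool, but the idea is the same. This is my own sketch, not any particular project's pipeline.

    import numpy as np

    def apply_1d_lut(img: np.ndarray, lut: np.ndarray) -> np.ndarray:
        """img: float RGB in [0, 1]; lut: shape (N, 3) mapping input level to
        corrected output per channel (e.g. compensating print fading or the
        colour of a period projector bulb). Linear interpolation between entries."""
        n = lut.shape[0]
        x = np.linspace(0.0, 1.0, n)
        out = np.empty_like(img)
        for c in range(3):
            out[..., c] = np.interp(img[..., c], x, lut[:, c])
        return out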

I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.


What's wrong with Terminator 2?

Are there like multiple digital releases, one with better colour than the other?


There's a 4K version out that does interesting things with colour grading, here's a post I found: https://www.reddit.com/r/Terminator/comments/d65pbi/terminat.... The one on the left is the remaster.

There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there's several other examples where they did a Thing with colour grading / filtering in a remaster.


To me, that just looks like what happens when I try to play HDR content on a system that doesn't know about HDR. (It looks like you're watching it through sunglasses.)

I own the blu-ray of Terminator 2, and briefly owned the 4k as well. The 4k looks like dogshit, with or without HDR enabled. This is largely due to the film using DNR to remove film grain, which they did when the 4k transfer was created for the 2017 3D release (film grain is bad in 3D I guess). The transfer is also much, much more blue.

There are multiple versions of the Matrix on the trackers and the internet that I know of. The official releases all look kinda different to each other:

https://www.youtube.com/watch?v=1mhZ-13HqLQ

There's a 35mm scan floating around from a faded copy with really weird colors sometimes

https://www.youtube.com/watch?v=Ow1KDYc9XsE

And there's an Open Matte Version, which I don't know the Origin of.

https://www.youtube.com/watch?v=Z2eCmhBgsyI

For me, it's the Open Matte that I consider the ultimate best version.


See my top level comment for more info on this, but the Aladdin scan used in the article was from a 35mm trailer that's been scanned on an unknown scanner, and had unknown processing applied to it. It's not really possible to compare anything other than resolution and artefacts in the two images.

And it was made by a lab that made choices on processing and developing times that can quite easily affect the resulting image. You hope that labs are reasonably standard across the board and calibrate frequently, but even processing two copies of the same material in a lab, one after the other, will result in images that look different if projected side by side. This is why it's probably impossible to make new prints of 3-strip-cinerama films now; the knowledge and number of labs that can do this are near zero.

This reminds me of how pre-LCD console games don't look as intended on modern displays, or how vinyl sounds different from CDs because mixing and mastering targeted physical media with limitations.

Wasn't CD more a case of cheaping out? Doing the work once, mostly for radio, where the assumed listening scenario was the car or background music, and so less dynamic range allowed it to be louder on average.

CD itself can reproduce the same dynamic range and more, but well, that doesn't sell extra copies.


The loudness war was a thing in all media. In the 80s most of us didn't have CD players but our vinyl and tapes of pop and rock were all recorded overly loud. Compared to the classical and jazz recordings, or perhaps the heyday of audiophile 70s rock, it was annoying and sad.

> It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical

Same applies for people buying modern vinyl records believing them to be more authentic than a CD or (god-forbid) online streaming.

Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that is not part of the original recording. Additionally, the vinyl is not catching more overtones because it's analogue, there is no true analogue path in modern music any more.


I don't know if this is still true, but I know that in the 2000s the vinyls usually were mastered better than the CDs. There even was a website comparing CD vs vinyl releases, where the person hosting it was lamenting this fact because objectively CDs have a much higher dynamic range than vinyls, although I can't find it now. CDs were a victim of the loudness war[0].

Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hifi system that can make use of it) is an early 80s CD release, because it sits in a sweet spot: it predates the loudness war, and producers were actually using the dynamic range of the CD.

[0] https://en.wikipedia.org/wiki/Loudness_war


The loudness wars were mostly an artifact of the 90s-2010s, because consumers were listening on horrible plasticky iPod earbuds or cheap Logitech speakers and the music had to sound good on those.

Once better monitors became more commonplace, mastering became dynamic again.

This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.

[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).


Right, that makes sense. And it also makes sense that vinyls didn't suffer from this because the people who would buy those would use them at home with better speakers. Or that classical music CDs throughout the entire period made great use of the dynamic range, since that also is more likely to be listened to on high quality speakers.

Vinyl literally cannot be brickwalled because the needle can't handle it. That's also why vinyl can't handle heavy bass, it'll make the needle vibrate out of the groove. It has nothing to do with the speakers.

It was sort of a happy coincidence that vinyl's limitations forced more dynamic (but less bass-y) masters. Although if your artist didn't do vinyl releases -which really was a dying medium until hipsters brought it back in the 2010s- you were hosed.


I dunno about authentic but for a while (as another commenter pointed out) they didn't have the loudness maxed out and / or had better dynamic range. That said, music quality aside, vinyls have IMO better collectability value than CDs. They feel less fragile, much more space for artwork and extras, etc.

At the same time, I think the nostalgia people feel for those versions isn't necessarily about accuracy, it's about emotional fidelity

I think the entire premise of the article should be challenged. Not only is 35mm not meant to be canonical, but the 35mm scans the author presented are not what we saw, at least for Aladdin.

I've watched Aladdin more than any as a child and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the velvia look.

> Early home releases were based on those 35 mm versions.

Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA

Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8


Famously CRT TVs didn't show as much magenta so in the 90s home VHS releases compensated by cranking up the magenta so that it would be shown correctly on the TVs of the time. It was a documented practice at the time.

So, yes the VHS is expected to have more magenta.

Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.


The author here is asserting that the VHS releases were based on the 35mm scans, and that the oversaturation is a digital phenomenon. Clearly, that's not true.

I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.


For sure, the author simplified things for the article. Anyway, in the case of VHS, they were indeed based on the 35mm scan but then had additional magenta added (as well as pan and scan to change the aspect ratio).

The author is not wrong that oversaturation is a source-transfer phenomenon (which will always be different unless special care is taken to compare with the source material).

On most TVs that magenta wouldn't have shown as much as the youtube video shows because TVs tended to have weaker magentas. Of course, it's not like TVs were that uniformly calibrated back then and there were variations between TVs. So depending on the TV you had, it might have ended up having too much magenta but that would have usually been with more expensive and more accurate TVs.

TLDR: Transfers are hard, any link in the chain can be not properly calibrated, historically some people in charge of transferring from one source to another compensated for perceived weak links in the chain.


The magenta thing is interesting. I learned something new. Reading the other comments, this seems to be as much a tale of color calibration as anything.

Regarding my memory, it becomes shakier the more I think about it. I do remember the purples but me having watched the cartoon could have affected that.


NTSC: Never Twice the Same Color

It sounds like in the case of Toy Story, the Pixar team were working toward a 35mm print as the final product, so that probably should be considered canonical: it's what the creative team set out to make.

This makes so much more sense now. After having kids I've been watching my fair share of Pixar, and I never remembered everything looking this flat and bland, but I would always chalk it up to my brain misremembering how it looked at the time. Good to know I guess that it wasn't just entirely nostalgia, but sad that we continue to lose some of this history, and so soon.

It's kind of sad that what felt like a defining aesthetic at the time is now basically an accidental casualty of progress

Things like this are being preserved, you just have to sail the high seas.

And it's for things like this that people just don't understand the value of a healthy alternative way to archive things.

A true fan who wants to preserve it and be faithful in their scan is going to dedicate their life to getting it just right, while a mega corp will just open the original, click "Export as..." and call it a day.


Yeah I clicked this link going “oh god it’s because they printed to film, I bet, and man do I hope it looks worse so I don’t have to hunt down a bunch of giant 35mm scans of even more movies that can’t be seen properly any other way”

But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.

And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.

Piracy: saving our childhoods one frame at a time.


When it comes to Star Wars, people are literally spotting them in Photoshop frame by frame. :)

I recently watched the despecialized edition. During the lightsaber duel between Obi-Wan Kenobi and Darth Vader, there are moments where Obi-Wan's lightsaber has no glow and looks more like a very thick metal antenna, or like he's play-dueling with a short curtain rod.

I can't figure out how to determine if that's intentional.


The ignited sabers used spinning sticks covered with reflective tape, plus some rotoscoping applied in post to get the final effect. There are spots in the un-retouched Star Wars where the effect is missed, or something, and you can indeed see the raw stick. Something similar's also behind that one weird shot where Obi-Wan's saber seems to almost fizzle out (the tip is pointed too much toward the camera, which messes up the effect).

The careful eye may also notice they almost never strike the sabers against one another in that scene... because it'd break the spinning sticks. Apparent contact is usually gently done, or a trick of perspective.


I'm not sure why you're getting downvoted. What you're hinting at is that a lot of original 35mms are now getting scanned and uploaded privately, especially where all the commercial releases on Blu-ray and streaming are based on modified versions of the original movies, or over-restored versions.

These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.


> I see them mostly traded on short-lived gdrives and Telegram.

Someone tell this community to share over BT. Aint nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.


The main reason they are not shared as widely is that there's a bit of conflict within the community between those who really want to stay under the radar and not risk being targeted by copyright owners (and so try to keep things very much private among the donors who funded the $600-900 cost of the scans) and those who want to open up a bit more, and so use Telegram, Reddit, and uploads to private trackers.

I would be surprised if they didn't end up on the prestigious private trackers

> with low compression to keep things like grain.

But you have algorithmic grain in modern codecs, so no need to waste so much space for noise?
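
For what it's worth, the idea behind that kind of codec-side grain (a toy sketch in Python/NumPy below, not how any particular codec actually implements it, and with made-up parameters) is to ship a clean frame plus a small noise model and re-generate grain at playback rather than spend bits encoding the noise itself:

    # Toy sketch: re-synthesize grain at playback from a tiny parameter set,
    # instead of encoding the noise itself. Parameters here are made up.
    import numpy as np

    def add_synthetic_grain(frame, strength=0.04, seed=None):
        """frame: float array in [0, 1], shape HxWx3."""
        rng = np.random.default_rng(seed)
        noise = rng.normal(0.0, 1.0, frame.shape[:2])[..., None]
        luma = frame.mean(axis=2, keepdims=True)
        # Scale the noise with brightness so flat dark areas aren't swamped.
        return np.clip(frame + strength * noise * (0.25 + luma), 0.0, 1.0)

    clean = np.random.default_rng(0).random((4, 4, 3))  # stand-in for a decoded frame
    print(add_synthetic_grain(clean, strength=0.05, seed=1).shape)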


This grain looks extremely fake.

Because one is genuine physics and the other is fake crap?

the calculations and the photons sent to your eyes are all genuine physics

One’s an accurate recording of how a real thing looked.

The other’s fake noise.

One’s a real photo from 1890. The other’s an old-timey instagram filter.

It makes sense that some folks might care about the difference. Like, I love my old family Polaroids. I would not want a scanned version of those to have the “noise” removed for compression’s sake. If that had been done, I’d have limited interest in adding fake noise back to them. By far my favorite version to have would be the originals, without the “noise” smoothed out at all.

Lots of folks have similar feelings about film. Faked grain isn’t what they’re after, at all. It’s practically unrelated to what they’re looking for.


> One’s an accurate recording of how a real thing looked.

> The other’s fake noise

But since there is no such thing as the real thing, it could just as well match one of the many real noise patterns in one of the many real things floating around, or a real thing at a different point in time with more/less degradation. And you wouldn't even know the difference, thus...

> It makes sense that some folks might care about the difference

Not really, it doesn't make sense to care about identical noise you can't tell apart. Of course, plenty of people care about all kinds of nonsense, so that won't stop those folks, but let's not pretend there is some 'real physics' involved


But… of course there is? A record of a real thing is different from a statistical simulation of it.

I think you missed the "a" vs "the": you can encode different sources that would have different grain, or the same source could have different grain at different times.

But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed


I just feed AI the IMDB summary and let it re-create the movie for me. Just as “pure” as high-bitrate h.265, after all.

You've chosen your argumentative perch very well, it's indeed right down there with the AI slop where you can't see any difference in reality

Well film grain doesn't matter because compression exists, apparently, and may as well be simulated because it's already failed the "purity test" and may as well be algo-noise. That holds for everything else! May as well maximize the compression and simulate all of it then.

[EDIT] My point is "film grain's not more-real than algo noise" is simply not true, at all. An attempt to represent something with fidelity is not the same thing as giving up and faking it entirely based on a guess with zero connection to the real thing—its being a representation and not the actual-real-thing doesn't render it equally as "impure" as a noise-adding filter.


You're still dancing there in the slop, hallucinating the arguments thinking it's a pretty dress!

It may as well be simulated because you won't see the difference! So now you've imagined some purity test which was never true, so you have nothing and start hallucinating some hyperbolic AI thing


> But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed

Quoted: the introduction of “purity test” to the conversation, from not one of my posts.


> Akshually

One is the real deal and another one is a simulation. End of story.


You can’t trust corporations to respect or protect art. You can’t even buy or screen the original theatrical release of Star Wars. The only option is as you say. There are many more examples of the owners of IP altering it in subsequent editions/iterations. This still seems so insane to me that it’s not even for sale anywhere…

I don't understand why you're getting downvoted. So many beautiful things have been lost to perpetual IP, e.g. old games that could be easily ported by volunteers given source code, which can never be monetised again.

Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.


> You can’t even buy or screen the original theatrical release of Star Wars

You can, actually: the 2006 Limited Edition DVD is a double-disc version, with one disc being the original version.

However, they are not up to DVD quality, because they were transferred from LaserDisc and not the original film stock


Even those aren’t accurate to the 1977 film.

To pick an arguably-minor but very easy to see point: the title’s different.


Self-reply because I’m outside the edit window: the dvd “original” releases are based on the laserdisc, but supposedly they modify it to restore the pre-1981-re-release title, so I’m actually wrong!

I can’t find out if they fix the 3% speed-up from the laser disc. The audio mix, at any rate, will be a combination of the three (stereo, mono, 70mm) original mixes, like on the laser disc, so identical to none of them. The source should predate the replacement of Latin script with made-up letters (not conceived until ROTJ then retrofitted on some releases of SW and Empire) so that’ll be intact unless they “fixed” it.

Still stuck with sub-ordinary-dvd-quality picture, as far as official releases go, so that’s too bad. Oh well, fan 35mm scan projects solved that problem.


I have these DVDs and you are right. But still the closest thing to the OG theatrical version officially available.

Would be annoying, but I suppose you could also recalibrate your display to turn down the greens?

VLC has a lot of image manipulation options.

What sort of terms might one search for?

"toy story film scan" on Kagi led me to a reddit page that may or may not contain links that might help you, but don't dawdle those links may not work forever.

Another one that's been hard to find is the 4K release of The Matrix with the original color grading. Ping me if you have it! (Not the 1080p release)


I'm surprised they can't just put a filter on the digital versions to achieve a similar look and feel to the 35mm version.

It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.


They could, but it would require some work to get it right. This is very similar to conversations that happen regularly in the retro game scene regarding CRT monitors vs modern monitors for games of a certain era. The analog process was absolutely factored in when the art was being made, so if you want similar visuals on a modern screen you will need some level of thoughtful post-processing.

Disney 100% has access to colorists and best in class colour grading software. It must have been a business (cost cutting) decision?

I’m reminded of the beginning of the movie Elf, where the book publisher is informed that a printing error means their latest book is missing the final two pages. Should they pulp and reprint? He says,

> You think a kid is going to notice two pages? All they do is look at the pictures.

I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.

With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.


I'm a 39 year old man who ground his VHS of Aladdin to dust in the 90s, and bought the Blu Ray because I can't say I can rely on streaming to always exist.

> With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.

I agree it was likely Disney being cheap, but there are tons of people who'll buy up Disney movies on physical media in the age of streaming. Not only are there Disney fans who'd rival the obsessiveness of Star Wars fans, but, like Lucas, Disney just can't leave shit alone. They go back and censor stuff all the time, and you can't get the uncensored versions on their streaming platform. Aladdin is even an example where they've made changes. It's not even a new thing for Disney. The lyrics to one of the songs in Aladdin were changed long before Disney+ existed.


Steve Jobs' type attitude vs Bill Gates type attitude (in the 90s). Or, Apple vs Microsoft.

The Disney of yesterday might have been a bit more Jobs than Gates, compared to the Disney of today.


They care very deeply about this and devoted a lot of resources to (re)grading the digital versions that you see today on Disney+. The versions you see are intentional and not the result of cost cutting. (I was not directly privy to this work but I worked on Disney+ before its launch and I sat in on some tech talks and other internal information about the digital workflows that led to the final result on the small screen and there was a lot of attention on this at the time)

I think there's a discussion to be had about art, perception and devotion to the "original" or "authentic" version of something that can't be resolved completely but what I don't think is correct is the perception that this was overlooked or a mistake.


The vast majority of people will not care nor even notice. Some people will notice and say, hey, why is it "blurry." So do you spend a good chunk of time and money to make it look accurate or do you just dump the file onto the server and call it a day?

To say nothing of the global audience for these films. I'm guessing most people's first experience seeing these movies was off a VHS or DVD, so the nostalgia factor is only relevant to a small percentage of viewers, and only a small percentage of that percentage notices.

VHS resolution is total crap... yet: it's not uncommon for the colors and contrast on VHS (and some early DVD) to be much better than what is available for streaming today.

This is totally bonkers, because the VHS format is crippled, also color wise. Many modern transfers are just crap.


It’s really amazing how some Blu-rays do in fact manage to be net-worse than early DVD or even VHS, but it’s true.

An infamous case is the Buffy the Vampire Slayer tv show. The Blu-ray (edit: and streaming copies) went back to the film source, which is good, but… that meant losing the color grading and digital effects, because the final show wasn’t printed to film. Not only did they get lazy recreating the effects, they don’t seem to have done scene-by-scene color grading at all. This radically alters the color-mood of many scenes, but worse, it harms the legibility of the show, because lots of scenes were shot day-for-night and fixed in post, but now those just look like they’re daytime, so it’s often hard to tell when a scene is supposed to be taking place, which matters a lot in any show or film but kinda extra-matters in one with fucking vampires.

The result is that even a recorded-from-broadcast VHS is arguably far superior to the blu ray for its colors, which is an astounding level of failure.

(There are other problems with things like some kind of ill-advised auto-cropping seeming to have been applied and turning some wide shots into close-ups, removing context the viewer is intended to have and making scenes confusing, but the colors alone are such a failure that a poor VHS broadcast recording is still arguably better just on those grounds)


How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."

There's a fucking lot of things that are not worth it monetarily, but worth it for the sake of itself. Because it's a nice gesture. Or because it just makes people happy. Not to sound like some hippie idealist, but it's just so frustrating that everything has to be commoditized.


It’s really been the driving force of modern life for centuries at this point.

Centuries is stretching it. It’s central to industrialisation, Taylor, Ford, etc. The relentless pursuit of efficiency and technique. Its antithesis is art for art’s sake.

In modern tech circles, the utilitarian mindset is going strong, now that the hacker ethos is dead and it’s all about being corporate friendly and hireable.


Yeah the industrialised world wasn't maligned by Blake as 'dark Satanic mills' or as Mordor by Tolkien because they found it an artistically fulfilling place.

> How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."

Honestly, by weakening copyright protections. People who love the works will do the work to protect them when they don't have to fear being sued into bankruptcy for trying to preserve their own culture.


You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy. It won't be worth it monetarily but since you're annoyed it doesn't exist and money isn't a factor...

It's always easy to complain about others not being generous enough with their time, but we always have an excuse for why we won't do it ourselves.


You can't do that, since besides time you also need knowledge/skill. So the final difference could be between "an extra 1% of the budget" at a corporate level vs "an extra 10% of your life to become a professional and fix a video, and also break the law in the process". Pretty easy to see how it's not just "an excuse", but a more fundamental issue

In this particular instance though it's not really about time, it's studios not wanting to pay what I imagine would be a relatively small amount to do the conversion. It's not going to be a frame-by-frame laborious process.

> You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy.

You can't, at least not if you want an acceptable result.

In photography, if you have a JPEG photo only, you can't do post-facto adjustments of the white balance, for that you need RAW - too much information has been lost during compression.

For movies it's just the same. To achieve something that actually looks good with a LUT (that's the fancy way for re-coloring, aka color grading), you need access to the uncompressed scans, as early in the processing pipeline as you can get (i.e. before any kind of filter is applied).
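
To make that concrete, here's a minimal NumPy sketch (the values are made up) of why you want data from early in the pipeline: once highlights have been clipped and quantized into an 8-bit master, no later LUT can bring them back.

    # Once highlights are clipped and quantized to 8 bits, a later regrade
    # cannot recover them - the information is simply gone.
    import numpy as np

    linear = np.linspace(0.0, 2.0, 9)           # scene values, some above display white
    graded = np.clip(linear, 0.0, 1.0)          # a baked-in grade clips the highlights
    eight_bit = np.round(graded * 255) / 255    # quantize like an 8-bit master would

    recovered = eight_bit * 2.0                 # try to "undo" the grade afterwards
    print(linear)                               # original values run smoothly up to 2.0
    print(recovered)                            # everything above 1.0 is now flat at 2.0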


Just dialing down the red and blue channels a bit makes it much closer for several of the early '90s releases (look at that Aladdin example from TFA)

They could reduce the saturation with 1 mouse click if they wanted, but they didn't. They must have intentionally decided that high saturation is desirable.

Disney do pay for industry-leading colorists. They chose to favour a more saturated look for Aladdin et al. It is reasonable to prefer either. I can't imagine what happened to the greens in the Toy Story examples if they are accurate.

And ultimately, what you need to achieve acceptable CRT effects is resolution. Only now, with 4K and above, can we start to portray the complex interactions between the electron beam and the image produced by your console. But the colour banding that caused the hearts in The Legend of Zelda to show a golden sheen is still unreachable.

Reminded me of this article about some retro games on CRT vs LCD:

https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-a...


You can, that's what Pixar did while creating the film. From the article:

> During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.

But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question- should the digital release reflect their intention as they were targeting these calibrated monitors or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.


> Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor

Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.

If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that after all the thought and effort that went into the animation, they allowed that putrid pea shade to go through.


The best option for me would be to release it in whatever format preserves the most of the original colour data without modification, then let the viewer application apply colour grading. Give me the raw renders in a linear 16bpc colour space with no changes. Sadly, I don't think we have digital movie formats that can handle that.

It is doable and you can get presets designed to mimic the look of legendary photography film stock like Velvia. But what they did back then was very much an analog process and thus also inherently unstable. Small details start to matter in terms of exposure times, projectors used etc. There’s so many frames and it took so much time, that it’s almost guaranteed there’d be noticeable differences due to process fluctuations.

It's not just about slapping on some grain and calling it a day; it's about honoring a whole set of artistic decisions that were made with that specific medium in mind

It's even worse with The Matrix where nobody is even sure any more how it was supposed to look, except definitely not as green as the DVD release.

Noodle made a charming video about going mad researching this: https://www.youtube.com/watch?v=lPU-kXEhSgk


The original DVD was way, way less green than later releases, which were changed to match the more extreme greens used in the sequels. IDK if it was as subtle as in the theater (I did see it there, but most of my watches were the first-run DVD), but it was far subtler than later DVD printings and all but (IIRC) one fairly recent Blu-ray, which finally dialed it back to something less eye-searing and at least close-ish to the original.

The Matrix is an interesting one because it really caught on with the DVD release. So that was most people's first exposure to it, not the theatrical release. Even if incorrect, if that was the first way you saw it, it is likely how you consider it "should" look.

It's a bit disingenuous to imply The Matrix did not catch on until the DVD release. The Matrix broke several (minor) box office records, was critically hailed, and was an awards darling for the below-the-line technical awards.

Having said all that, one of the most interesting aspects of conversations around the true version of films and such is that, just because of the way time works, the vast majority of people's first experience with any film will definitely NOT be in a theater.


I thought the "green" was symbolic of the computer world because of the green code rain motif. Like how yellow = Mexico.

This is shocking to say the least.


The original has a green tint to the matrix scenes, it's just relatively subtle and blends into a general coolness to the color temp. The heightened green from later home printings is really, in-your-face green, to the point you don't really notice the coolness, just greeeeen.

The more I read about the history of computer animation in movies, the more I am amazed at George Lucas for being such a visionary.

To be the man responsible for the creation of Pixar, Industrial Light and Magic, Skywalker Sound, Lucasfilm Games, THX, and Kerner Optical is a very impressive accomplishment for him, and that's secondary to the main accomplishment he's known for in Star Wars.


Light and Magic is an amazing documentary that covers lots of this history

Neat! The Youtube channel Noodle recently did a related deep dive into the differences in the releases of The Matrix [0]. The back half of the video also touches on the art of transferring from film/video to digital.

[0]: https://www.youtube.com/watch?v=lPU-kXEhSgk


I always felt the old Matrix had a colder blue, and that it changed drastically when the second and third hit cinemas. At least that was my memory, because I watched a double feature when the second one hit the theatre and complained then that The Matrix somehow looked weird. But it could also be my memory, since I also own the Blu-ray release.

Another movie with the same / similar problem is the Lord of the Rings Extended editions: compared to the DVD release, both the Blu-ray and 4K versions look off. As far as I remember, they fixed it for the theatrical version in 4K but not the extended.


At least they messed around with the green.

https://www.youtube.com/watch?v=XR0YBqhMtcg


That's weird. If the color difference is caused entirely by some peculiar chemical properties of the film, it should be trivial to calculate the filter that you need to apply to the digital source material so you get the "authentic" colors. So why not just do that and distribute the movie as it was designed? Or, heck, even 2 versions if you insist that some audience likes that new more vibrant look.

Yeah I also had this thought. Despite popular belief, the film look is not magic. Especially since the article mentions they already had monitors calibrated for the film look during development. Why couldn't they replicate that for the digital release? The digital release is so blazingly different, I have a hard time believing that is the best they can do with modern colour pipelines. I mean, the colour grading is completely different! Surely it was either intentional or a fairly simple case of poor colour management.
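
If anyone wants to experiment, one rough way to approximate such a "filter" is to take a pair of aligned frames (digital master vs. a matched 35mm scan) and fit per-channel tone curves between them. Here's a hedged sketch in Python/NumPy, assuming you already have the two frames as float arrays in [0, 1]; real film also has cross-channel and spatial effects this ignores, so it's a starting point rather than an emulation:

    # Sketch: fit per-channel tone curves mapping a digital frame toward a
    # matched 35mm scan, using a low-order polynomial per channel.
    import numpy as np

    def fit_channel_curves(digital, scan, degree=3):
        curves = []
        for c in range(3):
            coeffs = np.polyfit(digital[..., c].ravel(), scan[..., c].ravel(), degree)
            curves.append(np.poly1d(coeffs))
        return curves

    def apply_curves(frame, curves):
        out = np.stack([curves[c](frame[..., c]) for c in range(3)], axis=-1)
        return np.clip(out, 0.0, 1.0)

    # Hypothetical usage, with the two aligned frames loaded elsewhere:
    # curves = fit_channel_curves(digital_frame, film_scan_frame)
    # approx_film_look = apply_curves(digital_frame, curves)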

The texture of the film grain makes Mulan and Aladdin really look better. The large simple filled sections look like they have so much more to them.

The one frame they showed from the Lion King really stood out. The difference in how the background animals were washed out by the sunlight makes the film version look significantly better.

I'm not sure if I'm just young enough to be on the other side of this despite seeing all three of those Disney movies as a millennial kid (Lion King and Aladdin were VHS mainstays in my house, and I remember seeing Mulan in theaters), but I honestly don't find the film grain to look better at all and think all three of those bottom images are much more appealing. For the Toy Story ones, I think I'm mostly indifferent; I can see why some people might prefer the upper film images but don't really think I'd notice which one I was watching. I'd definitely think I'd notice the difference in the 2D animation though and would find the film grain extremely distracting.

To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches. Whenever I watch a film-based movie, my immersion always gets broken by all the little specks that show up. Digital is a much more immersive experience for me.

> To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches.

In the lion king example you weren't meant to see all of the detail the artists drew. In the Army men example the color on the digital version is nothing like the color of the actual toys.

They originally made those movies the way they did intentionally because what they wanted wasn't crystal clear images with unrealistic colors, they wanted atmosphere and for things to look realistic.

Film grain and dust can be excessive and distracting. It's a good thing when artifacts added by dirt and age get cleaned up for transfers so we can have clear images, but the result of that clean-up should still show what the artists originally intended, and that's where Disney's digital versions really miss the mark.


To me this just sounds like cope over the poor process of transferring digital to film. Destroying detail and color because it couldn't be accurately captured is a deviation from what the artists produced. Even if it looks better to you, it was a mistake. They would have tried to copy it as best they could, but it's not perfect and the colors can change over time.

This is an interesting take when you look at the gas station Toy Story example and consider the night sky. In the digital version the stars are very washed out but in the film version the sky is dark and it's easy to appreciate the stars. Perhaps it's unrealistic when you realize the setting is beneath a gas station canopy with fluorescent lights, but that detail, along with some of the very distinct coloring, stuck out to me.

> You can't see all of the detail the artists drew

That's the point in that Lion King frame, though. They drew it planning for it to get washed out by the sunlight effect, and when it's not it absolutely ruins the "vast crowd" effect they were going for because you can clearly see there's no more animals in the background and it's just 20 guys standing there.


If that were true, why didn't they draw the crowd thicker in the half that is not as washed out?

Which is of course highly subjective; you could argue that film grain is an unwanted but unavoidable side-effect from the medium used, just like other artifacts from film - vertical alignment issues, colour shifting from "film breath", 24 frames per second, or the weird sped-up look from really old films.

I don't believe these were part of the filmmaker's vision at the time, but unavoidable. Nowadays they are added again to films (and video games) on purpose to create a certain (nostalgic) effect.


I don't think this comment demonstrates an understanding of the argument. An unavoidable side-effect of the medium is part of the medium. You will consider unavoidable side-effects when you are building something for a particular medium, unless you are stupid. If that unavoidable side-effect were not part of the medium, you would have made different choices.

Colorizing a black-and-white film, for example, is not ever restoring the original intention or vision, even "subjectively." If the makers of a black-and-white film had been making a color film, they would have made different choices.

This does not mean that you should not colorize black-and-white films, you should do whatever makes you happy. I honestly can't wait until AI is recreating missing scenes or soundtracks from partially lost films, or even "re"creating entire lost films from scripts or contemporary reviews and cast lists, and expanding films to widescreen by inventing contents on the edges. But this will not be restoring a vision, this will be original work.


It does, but much more important to me is the color grading. The white point in the film versions is infinitely better.

The same is true of home video game hardware:

If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.

Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.

Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:

https://www.youtube.com/watch?v=wcym2tHiWT4


Film weave could also be worth mentioning.

Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."

(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")


Film weave is also the bane of the VFX world. If a shot is going to have, say, a matte painting added in post, then a pin registered camera must be used. These cameras have a precisely machined pin that centers the film stock in the gate after the pull down claw retracts. Later post processing stages also use pin registered movements, so each frame is in exactly the same place every time it's used. Otherwise, the separate elements would weave against each other and give away the effect.

Film weave is such an underrated part of that analog feel

I'm stunned so many people here can remember details as fine as the colour grading of a film. I couldn't remember specifics like that from 6 months ago, let alone 30 years ago when I was a child and wouldn't have had the thought to watch for cinematographic touches.

Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.


I doubt many people 'remember' this to any significant extent, but there are probably many cases of media giving the 'wrong' vibe with a new release, and you just assume it's because you've gotten older, but then when you get access to the original you experienced, the 'good' vibe is back, and you can easily compare between the two.

Although some people do infact remember the differences, but I'd guess a lot of those incidents are caused by people experiencing them in fairly quick succession. It's one thing to remember the difference between a DVD 20 years ago and a blu-ray you only watched today, and another to watch a DVD 15 years ago and a blu-ray 14 years ago.


At least for me it's not so much details like color grading over the entire film, it's more like a specific scene got burned into memory. Movie looks pretty much fine until reaching that scene and it's so different it's like something got shaken loose and you start to see the larger issues.

For an example people here might be more familiar with, it's like how you can't even see bad kerning until you learn what it is, then start to notice it a lot more.


I am not a huge gamer - maybe a dozen hours a year. But I feel that, say, Mario responds differently to controls in an emulator than how I remember Mario responding on an NES with a CRT.

But I was never very good, and it has been decades, so I don't know how much of this is just poor memory - I actually don't think I'm good enough/play enough that the latency of modern input/displays makes a difference at my level.

I would love to try both side-by-side to see if I could pick out the difference in latency/responsiveness.


I've played Mario in emulators where I could play just fine, and others where I kept falling down holes and such on friggin' level 1, which at times I could probably have beaten literally blindfolded, and had a hard time progressing at all. The latter might not (or, if extremely bad, might) be laggy enough for me to be able to tell you by looking, but plainly the feel is off enough to be a problem.

I find a good test is Punch Out!!! If it's much trouble at all for me to reach at least Great Tiger, the latency is really bad (even if I couldn't tell you just by looking). If I can get to Great Tiger without much trouble but struggle to do much damage to him before getting taken out, the latency's still unacceptably high for some games, but not totally awful.

Another good one's High Speed. If I can't land the final multi ball shot at least a decent percentage of the time (the game pauses the ball a couple times while police chatter plays, when you're set up for a multi ball, and after the last pause you can land the shot to initiate multi ball immediately and skip all the flashing-lights-and-sirens crap if you're very precise with your timing, it's like very-small number of milliseconds after the ball resumes its motion) then the latency is high enough to affect gameplay.

If I can land that shot at least 60-70% of the time, and if I can reach Bald Bull in Punch Out!!!, then probably any trouble I have in other games is my own damn fault :-)

I suppose as I age further these tests will become kinda useless for me, because my reflexes will be too shot for me to ever do well at these no matter how many hours of practice I've had over the decades :-(

Anyway, even in the best case you're always going to have worse display and input latency on a digital screen with a digital video pipeline and USB controllers than an early console hooked up over composite or component to a CRT. I've found it's low enough (even on mediocre TVs, provided they have a "game mode", and those are a ton worse than most PC monitors) for me not to mind much if the emulation itself is extremely snappy and is adding practically no extra latency, and there are no severe problems with the software side of the video & input pipelines, but otherwise... it can make some already-tough games unplayably hard in a hurry.

I do wonder about the experience of people who try these games for the first time in an emulator. They'll come to the game with no built-in way to tell if they keep slipping off ledges because the latency's like six frames instead of the ~ one it was originally, or because they just suck at it.


Different people just remember different things. I bet most people don't remember either and only going "ah yes of course!" after reading this blogpost (which means they didn't remember at all).

Anecdata here, but I played Zelda Ocarina of Time on CRT when I was a child, and have since replayed it many times via emulator. The game never looked quite as good as I remembered it, but of course I chalked it up to the fact that graphics have improved by several orders of magnitude since then.

Then a few years ago I was throwing out my parent's old CRT and decided to plug in the N64 one last time. Holy crap was it like night and day. It looked exactly as I remembered it, so much more mysterious and properly blended than it does on an LCD screen.

I don't see why the same wouldn't apply to films, sometimes our memories aren't false.


They probably can't.

There is a solution for Toy Story specifically -- the Laserdisc version. If I'm not mistaken, the laserdisc was digitized from the 35mm print.

Toy Story is the only Pixar movie ever released on Laserdisc (along with all their shorts, in the same box set). Disney also released a lot of their 90s animation on Laserdisc.

So if you're a true cinephile, seek out the Laserdisc versions.


I've grown very fond of having shaders available for my retro games.

I suspect having shader plugins for TV and movie watching will become a thing.

"The input is supposed to be 24 FPS, so please find those frames from the input signal. Use AI to try to remove compression artifacts. Regrade digital for Kodak 35mm film. Then, flash each frame twice, with blackness in-between to emulate how movie theaters would project each frame twice. Moderate denoise. Add film grain."

I don't actually know what kind of filters I'd want, but I expect some people will have very strong opinions about the best way to watch given movies. I imagine browsing settings, like browsing user-contributed Controller settings on Steam Deck...
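
As a sketch of what such a filter stack could look like under the hood, here's a toy per-frame pipeline in Python/NumPy; every stage and number is a placeholder for whatever a real preset would do (a measured film LUT, proper grain synthesis, and so on), it's just to show how user-selectable pipelines like the one above could compose:

    # Toy "film look" pipeline: each stage is a stand-in for a real filter.
    import numpy as np

    def regrade(frame, gamma=1.1):
        # Placeholder for a measured film-emulation LUT.
        return np.power(np.clip(frame, 0.0, 1.0), gamma)

    def desaturate(frame, amount=0.15):
        luma = frame.mean(axis=-1, keepdims=True)
        return frame + amount * (luma - frame)

    def film_look(frame, stages=(regrade, desaturate)):
        for stage in stages:
            frame = stage(frame)
        return frame

    frame = np.random.default_rng(0).random((2, 2, 3))  # stand-in for a decoded frame
    print(film_look(frame))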


Would it be possible today, playing from a computer using VLC or similar + plugins?

It certainly should be possible, but I don't know if it's actually implemented; at the very least you should be able to implement it as a filter plugin for ffmpeg.

Some of the more advanced CRT shaders actually attempt to mathematically model how the video gets distorted by the CRT and even the component video. If the effects of converting to film are so well understood that Pixar could adapt their film for the process, then it ought to be possible to post-process the video in a way that reproduces those artifacts.

I don't think it's possible for it ever to be exactly the same, since the display technology of a monitor is fundamentally different from a film projector (or a CRT), but it should be possible to get it good enough that it's indistinguishable from a photo of the film being displayed on a modern monitor (i.e. the colors aren't completely different like in the comparisons in the article).

BTW, TFA didn't mention this, but about 15 years ago they re-rendered Toy Story and Toy Story 2 for a new theatrical run when those gimmicky 3D glasses were popular. If that's the version that's being distributed today on Disney+ and Blu-ray (IDK, but I feel like it probably is), then that could potentially be a more significant factor in ruining the color balance than not having been converted to film.


ShaderGlass probably gets you pretty decently far down the path.

https://mausimus.itch.io/shaderglass


How well does 35mm hold up over time? Could these movies be said to "no longer exist" in some sense, if the film elements have decayed noticeably?

Playing them, handling them, and poor storage all degrade them. Most original prints will have been played many times, and often haven’t been consistently stored well.

The 4K77 etc. fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in e.g. bad frames; used references like (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the (always faded, sometimes badly) color grading and contrast and such; and had to extensively hand-correct things like scratches, with some reels or parts of reels requiring a lot more of that kind of work than others. Even Jedi required a lot of that sort of work, and those reels would have been only something like 30-35 years old when they started working on them.


Hollywood does store original prints in underground salt mines (at least I am aware of a place in Kansas where they do this). Of course who knows where the frames we are being shown from the 35mm film version are coming from. Likely not these copies that are probably still in halite storage.

Personally, I prefer film versions in every example listed.

Is it possible to replicate the digital->film transition with tone mapping? (I assume the answer is yes, but what is the actual mapping?)

Generally yes, but we're still working on it all these years later! This article by Chris Brejon offers a very in-depth look into the differences brought about by different display transforms: https://chrisbrejon.com/articles/ocio-display-transforms-and...

The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon


Wow, those links are a goldmine, thanks!

I went down the tonemapping rabbit hole for a hobby game engine project a while ago and was surprised at how complex the state-of-the-art is.


> He [David DiFrancesco] broke ground in film printing — specifically, in putting digital images on analog film.

> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.

While I have no doubt that this hadn't been done before at that scale and resolution, it struck me that I'd heard about this concept in a podcast episode [1] in which very early (1964) computer animation was discussed alongside the SC4020 microfilm printer, which used a Charactron CRT that could display text for exposure to film or plot lines.

[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...


For Flight of the Navigator (1986) they sent renders of the ship directly to film because the system could only hold one frame at a time.

https://www.youtube.com/watch?v=tyixMpuGEL8


That was excellent, thanks for sharing it! I enjoyed the appearance of the Cray X-MP.

It's wild to realize that the version of the movie most of us remember emotionally is not the one that's currently streaming. There's something bittersweet about that... like your memory has a certain warmth and texture that modern restorations just can't quite recreate.

Beauty and the Beast on Bluray looks completely different from what I remember; I had assumed that they had just regraded it, but given that it was developed with CAPS, maybe this is part of the effect?

Film is magical. We should preserve and incentivise it.

“So, we have five or six extremely high-resolution monitors that have better color and picture quality.“ I would love to know what qualifies as extremely high resolution for computer monitors circa 1995

I dunno if it was manufactured as early as 1995 (though it couldn't have been more than four years later, absolute max) but some time around 2002 I picked up a pair of used 19" (IIRC) IBM flat-screen (that is, the screen itself was flat, not flat-panel) CRT monitors in a flea market that had both more total pixels and greater pixel density than any screen in my house until I started picking up devices with Apple Retina displays. I believe the short side of the 4x3 on that thing maxed out at 1920.

My boring, 17" consumer Trinitron monitor in 1995 could do 1600x1200 IIRC.

Max resolution and pixel density (plus, for a long time, color gamut, contrast/depth-of-black, latency, et c) on typical monitors took a huge dive when LCDs replaced CRTs. "High res" in 1995 would probably qualify as fairly high-res today, too, if not quite top-of-the-heap. Only expensive, relatively recent displays in the consumer space really beat those '90s CRTs overall to a degree where it's not at least a close call (though they may still be a little worse in some ways)


I remember those monitors, but I forget what resolution they were. For what it's worth, Toy Story was rendered at 1536 x 922. I believe they re-rendered the whole thing from the RIB files for the bluray release.

I think it’s important to remember it’s not just the quantity of pixels but the quality of them that makes the difference between consumer and professional equipment

1280×1024

This reminds me of the pains taken to reproduce original Game Boy Advance fidelity in the Chromatic by ModRetro: https://modretro.com/blogs/blog/display-the-hard-way

Man, this makes me want to watch original 35mm releases of all these films. It is unfortunate that they are so hard to get your hands on these days.

This thread has a download link for a Toy Story 35mm scan. Not sure if it works, but maybe worth trying.

https://www.reddit.com/r/toystory/comments/1hhfuiq/does_anyo...


So it's fascinating reading this and looking at the screengrabs of the "original" versions... not so much because they are "how I remember them" but indeed because they have a certain nostalgic quality I can't quite name - they "look old". Presumably this is because, back in the day, when I was watching these films on VHS tapes, they had come to tape from 35mm film. I fear I will never be able to look at "old looking" footage with the same nostalgia again, now that I understand why it looks that way - and, indeed, that it isn't supposed to look that way!

35mm (and even more so with 70mm) looks so much better than digital IMO. The colour looks so much more alive and thorough.

And I don't think I'm even being bitten by the nostalgia bug per se, because it was already a nostalgic fad long gone from any cinema near me by the time I grew up.


Excellent article, really enjoyed it.

Yeah, this is the kind of thing that makes me really enjoy the internet.

I call it the Newgrounds animation effect. Digital-to-digital always looked a bit unserious to me.

I’m sure many young people feel the exact opposite.


> "Even so, it’s a little disquieting to think that Toy Story, the film that built our current world, is barely available in the form that wowed audiences of the ‘90s."

Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.

Or you could slap a film-look LUT on, but you don't need to go that far.


You'd think there was a LUT you could apply to the digital copies during playback to make it look (more) like the original...

What an excellent piece! I thoroughly enjoyed reading it, brought my childhood memories flooding back. I have so many fond recollections of that 90s era, including "A Bug's Life." I remember gathering with my cousins at my grandmother's house to watch these films on VHS. Time flies.

This is insane to me. Hundreds of professionals work on these rereleases and nobody realized.

It reminds me also of the 24 FPS discussion. As far as I know, 24 FPS is still the standard for cinema, even though 48 or 60 FPS are pretty standard for series; the 24 FPS gives it a more cinematic feeling.

https://www.vulture.com/2019/07/motion-smoothing-is-ruining-... https://www.filmindependent.org/blog/hacking-film-24-frames-...


To add, when it comes to video games sometimes people go "baww but 24 fps is enough for film". However, pause a film and you'll see a lot of smearing, not unlike cartoon animation frames I suppose, but in a video game every frame is discrete so low framerate becomes visually a lot more apparent.

I think it was The Hobbit that had a 48 FPS (HFR) version, and people just... weren't having it. It's technologically superior I'm sure (as would higher frame rates be), but it just becomes too "real" then. IIRC they also had to really update their make-up game, because at higher frame rates and/or resolutions people can see everything.

Mind you, watching older TV shows nowadays is interesting; I think they were able to scan the original film for e.g. the X Files and make a HD or 4K version of it, and unlike back in the day, nowadays you can make out all the fine details of the actor's skin and the like. Part high definition, part watching it on a 4K screen instead of a CRT TV.


Regarding your first paragraph, the other thing worth mentioning is that in addition to smear/motion blur in film, it's also a passive medium. Video games are interactive. You're responding to and making decisions based on what you're seeing.

In a FPS, trying to track movement at only 24 fps is pretty much impossible unless your target's movement is entirely predictable.

In a flight simulator, trying to land a plane in gusty weather conditions is a lot harder with only 24 fps.

Lower framerates don't just make motion choppy; they increase latency. At 24 fps, any change in movement could be up to 42 ms behind. At 120 fps, that's down to 8.3 ms. And those numbers assume that you can notice the difference in only a single frame.

I'm convinced that people claiming 24 fps is fine for games just because it's fine for film don't actually play games. At least, nothing that requires quick reaction times.


I prefer the 35mm version, at least when viewing the scene with the soldiers.

It’s fascinating to me how many of these discussions boil down to dialing in dynamic range for the medium in question.

As the Aladdin still shows with its wildly altered colors clearly other aspects matter/are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR. It’s just so interesting to me.

Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.


If you're interested in these 35mm film scans, I recommend watching this excellent YouTube video "SE7EN & How 35mm Scans Lie to You" https://www.youtube.com/watch?v=uQwQRFLFDd8 for some more background on how this works, and especially how these comparisons can sometimes be misleading and prey on your nostalgia a bit.

If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/


I remember it grainier and with occasional 'VHS zebra stripes'(?) too, though.

I have blu-rays for all the movies mentioned in the article. Today, there are some very powerful tools for processing digital images to make them look like film, going as far as letting you mimic specific film stocks.

It might be a fun experiment to make custom rips of these movies that look more like their theatrical releases. I'm curious how close one can get without needing to source an actual 35mm print.


Now there is the problem where many of my friends will take one look at a movie I started on the TV and say "ew, I don't like this movie, it's old". They don't realize that the reason they feel that way, viscerally, is because it's shot on film. How do I get people to watch film movies with me? They are far better on average than many modern movies anyway (from a storytelling, moviemaking POV, to say nothing of the picture quality).

Make them into a drinking game. We watched The Princess Bride the other day (never watched it), I think it's aged pretty well but then I'm old. But if they get bored, make it a game to have a drink or get a point or whatever for every sexual innuendo, lol.

Some films didn't age well though.

And finally, accept it and move on, ultimately it's their loss.


That's a good one for my younger friends. Most people my age don't really drink anymore other than 1-2 a night

Yes not only have the best movies aged well, but the old movies that you've never heard of (not the bad ones, there are plenty of those, but the good ones, which there are still plenty of) are BETTER than the new movies of today. It's unbelievable and it's creatively inspiring to me personally, even if it requires "time travel."

It's like visiting a museum, or a grand old architectural building. If you abstract even a tiny bit from the viewer immersion (which is great), you can't help but think "What craftsmanship! We don't get stuff like this anymore" And of course you also start to pick up on where everything came from, such that even the new stuff you love looks like a slightly sleeker yet baser, cheaper copy of the old.


Show them a Tarantino movie

Happy to have set my television to less than half saturation

Hmmm. I saw the first three Toy Story films; the fourth one I never saw, even though people said it is not bad, but to me Toy Story was over with the third one.

Now, while I liked the first three, the first one always has a special place, because at the time it was really quite new-ish. Fully computer animated movies were quite rare. Pixar did several short videos before Toy Story, and I think there were some other movies too, give or take, but Toy Story kind of changed everything past that. Unfortunately many other computer-generated movies are absolute garbage nowadays. The big movie makers want money and don't care about anything else, so they ruin the interest of people who are not super-young anymore, because let's face it: older people are less likely to watch the latest marvel 3D animated zero-story movie that is a clone of prior clones.

It would be nice if AI, despite it also sucking to no ends, could allow us to produce 3D movies with little effort. I have a fantasy game world my local pen and paper RPG group built. Would be interesting to feed it a ton of data (we have generated all that already over decades) and come up with an interesting movie that relates the story of a part of this. This is just one example of many more. Quality-wise I still like Toy Story - I find it historically important, and it was also good at the respective time (all first three actually, although the storylines got progressively weaker; I did like Ken and Barbie though, but too much of the story seemed to go into wanting to milk out more money selling toys rather than telling a story. Tim Allen as Buzz Lightyear was always great though.)


Those comparisons were strangely jarring. It's odd to see (on the internet awash with "Mandela Effect" joke conspiracies) direct photo/video evidence that things we remember from our childhood have indeed been changed; sometimes for the worse!

Interesting, I think the film versions feel like they have more gravitas, especially the Lion King and Mulan scenes.

Tangential thought, but I wonder if this had any effect on the Toy Story level in Kingdom Hearts 3. I suppose that would depend on the reference materials Square Enix received from Pixar/Disney.

I find a lot of the stuff I remember from decades ago looks worse now. Toy Story in particular I watched when I got a projector after I'd seen Toy Story 4 and it looked bad, almost to the point I wish I hadn't tarnished my memory of it. Similar things have happened with N64 games that I cherished when I was little.

I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.

It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.


> Same with CRTs. Sure, it's a different look, but my memory still looks better.

I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.


Ever since raytracing graphics cards came out, I've wondered why no game studio has ever tried to recreate the look and feel of Toy Story. There's something about the combination of detailed lighting, high-poly models, but low-res textures and primitive animations that really appeals to me.

I don't think Toy Story used raytracing. I agree it's a great look though.

Don't tell me what I remember.

The changes in the Aladdin and Lion King stills surely can’t be accidental side effects? The Aladdin shot especially looks like they deliberately shifted it to a different time of day. Could there have been a continuity reason?

wtf happened to Simpsons on Disney+? looks like it's zoomed in.

There's an option to switch back to the original 4:3 ratio.

The Simpsons was originally made in 4:3. Many people don't like watching with large black bars on the left and right, so they show a cropped 16:9 version. People complained because the crop occasionally ruins a joke, so I believe you can now opt into either.
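For a rough sense of how much of the picture that crop throws away, here is a back-of-envelope sketch (the 720x540 source size is just illustrative):

    # Cropping a 4:3 frame to 16:9 keeps the full width but only
    # width * 9/16 of the height; the rest is cut from the top and bottom.
    src_w, src_h = 720, 540                 # illustrative 4:3 frame
    kept_h = src_w * 9 / 16                 # height surviving a 16:9 centre crop
    lost = 1 - kept_h / src_h
    print(f"kept {kept_h:.0f} of {src_h} lines, lost {lost:.0%} of the image")  # ~25%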

A similar crime against taste as the pan-and-scan "fullscreen" DVDs of the early 2000s. If I want to pay to watch something, don't crop out a chunk of what the cinematographer wanted me to see...

David Simon talked about this for the HD release of The Wire:

https://davidsimon.com/the-wire-hd-with-videos/

It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.


There's a (much less severe) instance of that peeve with computer video player apps that have slightly rounded corners on the windows.

Pan and scan wasn't a DVD innovation. Most VHS releases were pan and scan too; DVDs at least commonly had widescreen available (many early discs came with widescreen on one side and full screen on the other... good luck guessing whether "widescreen" on the hub means the side you're reading is the widescreen side, or that the other side is, so you should have the widescreen label facing up in your player).

I believe this is the best example of the problems that can be caused:

https://x.com/TristanACooper/status/1194298167824650240

Open both images and compare. The visual joke is completely ruined with the cropping.


>Computer chips were not fast enough, nor disks large enough, nor compression sophisticated enough to display even 30 minutes of standard-definition motion pictures.

This is not true at all. Being compatible with outdated, film-based projectors was much more important for being able to show it in as many theaters as possible. If they had wanted to do a digital screening, it would have been technologically possible.


Right. Movies don’t fit onto a single 35mm film reel, and theatres deal with that as a matter of course.

It was possible, but much too expensive to get it into wide release that way.


I bumped on this too, since 1994-1995 was about the time when multi-gigabyte hard drives were readily available and multiple full motion video codecs were being used in games, albeit for cut scenes. Theater projector compatibility makes complete sense.

In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors. The Phantom Menace was shown digitally... on two screens. By the end of 2000, there were 31 digital cinema screens in theaters.

Digital cinema went with Motion JPEG 2000 at high quality settings, which leads to very large files, but also much better fidelity than a contemporary video codec would likely have delivered.

https://en.wikipedia.org/wiki/Digital_cinema


> In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors.

I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.


In that period I was somewhat in charge of the render queue at a small animation company. I had to get rendered images onto tape, as in Sony Digibeta or better. Before that I had to use film.

We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx InfiniteReality boxes with the specialist video break-out boards that did digital video or analogue with genlock. Disks were 2 GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD, it was 720 x 576 interlaced PAL.
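To put those disk stacks in perspective, here is a rough sketch of the numbers (assuming uncompressed 8-bit 4:2:2 PAL, which is roughly what those break-out boards pushed):

    # Uncompressed 8-bit 4:2:2 PAL averages 2 bytes per pixel.
    width, height, fps = 720, 576, 25
    bytes_per_second = width * height * 2 * fps      # ~20.7 MB/s sustained
    disk_bytes = 2 * 1024**3                         # one 2 GB SCSI disk
    print(f"data rate: {bytes_per_second / 1e6:.1f} MB/s")
    print(f"a 2 GB disk holds ~{disk_bytes / bytes_per_second:.0f} s of video")  # ~100 s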

We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.

Note that a lot of this was bad for tape machines. These cost £40k and upwards, and advancing the tape by one frame to record it, then back again to reposition for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.

Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.

The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.

Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.

It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.

Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.


> and multiple full motion video codecs were being used in games, albeit for cut scenes

Yeah, but we were still using MPEG-2 back then, weren't we?

They would have looked like utter garbage. Bitrates would have had to be so high that I'm not sure we would have actually had enough storage. I guess we could have shipped TWO hard drives.


I was thinking MPEG-1 until around 1996 when MPEG-2 adds interlaced and transport stream support, among other things. But you're right, MPEG bitrates for quality were too high for what I was thinking. More like QuickTime Video or Smacker, which were used in games like The Journeyman Project or Diablo 1.
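As a back-of-envelope sketch of the storage problem (the runtime and bitrates below are assumptions, just to show the order of magnitude):

    # Storage needed for a feature-length film at various constant bitrates.
    runtime_s = 81 * 60                     # roughly Toy Story's runtime
    bitrates_mbps = {
        "MPEG-1 (Video CD-ish)": 1.5,
        "MPEG-2 (DVD-ish)": 6.0,
        "something closer to cinema quality": 40.0,
    }
    for name, mbps in bitrates_mbps.items():
        gb = mbps * 1e6 * runtime_s / 8 / 1e9
        print(f"{name}: ~{gb:.1f} GB")

The low-bitrate options would have fit on a mid-90s drive but looked rough on a big screen; anything approaching projection quality quickly outgrows a single drive of the era.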

Wow. Based on those comparisons they really do feel completely different. Really remarkable how such relatively simple changes in lighting and whatnot can drastically change the mood.

And here I was thinking of re-watching some old Disney/Pixar movies soon :(


hello

I just showed Toy Story to my kids. It looked really bad. Mostly textures and lighting.

I wonder if artificial grain would actually make it look better.

Like when the game Splinter Cell was released, there were two additional ‘views’ simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.


Pixar did re-renders of Toy Story.

https://www.youtube.com/watch?v=6w4bzm6ewRQ


TL;DR: Linking to YouTube trailer scans as comparisons for colour is misleading and not accurate.

---

> see the 35 mm trailer for reference

The article makes heavy use of scans of trailers to show what colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.

When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.

When a film is scanned, it is not lit by a xenon short-arc lamp; various other illumination methods are used depending on the scanner. CRTs and LEDs are common. Commercial scanners are, on the whole, designed to scan negative film. That's where the money is - and so they are set up to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try and capture the different film stocks, but in general, today's workflow involves scanning something in and then colour correcting post-scan, to meet an artist's expectations/desires.

Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.

The YouTube channels referenced are FT Depot, and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is quite a high-end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even its sales brochure says:

> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.

Note that it says less time is spent colour grading - it is still not expected that it will accurately capture exactly what was on the film. It also specifies negative; I don't know whether it has positive stock profiles, as I am not lucky enough to have worked with one - for this, I will assume it does.

The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor, I am not sure what was used to capture the scans linked in the video. Regardless of what was used, in such a scanner that doesn't have the benefit of film stock profiles, etc. there is no way to create a scan that accurately captures what was on the film, without some serious calibration and processing which isn't being done here. At best, you can make a scan, and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour calibrated display with the original projected side-by-side for reference, this is not going to be that close to what it actually looked like.

Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.

All this is to say that statements like the following in the article are pretty misleading - the differences may not be attributable to the direct-digital-release process at all, and could just be down to a camera white balance being set wrong, or post-processing toward what "looked good" coming out different from the original:

> At times, especially in the colors, they’re almost unrecognizable

> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different

I don't disagree with the premise of the article - recording an image to film, and then scanning it in for a release _will_ result in a different look from a direct-digital workflow. That's why major Hollywood films spend money shooting and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side when one is a trailer scan of unknown accuracy.


Damn. I wish we could get a release with the 35mm colors the way they look in these comparisons. The Aladdin one specifically looks so good! It makes me feel like we're missing out on so much from the era these films were released.

> These companies, ultimately, decide how Toy Story looks today.

LOL, what? Anyone with a Blu-Ray rip file and FFmpeg can decide how it looks to them.
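Something along these lines, for example; the filter values are arbitrary, just to show the knobs exist, and input.mkv is a hypothetical rip:

    import subprocess

    # Re-grade a rip to taste: pull saturation down a touch and nudge the
    # green gamma with ffmpeg's eq filter, copying the audio untouched.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-vf", "eq=saturation=0.85:gamma_g=0.95",
        "-c:a", "copy",
        "output.mkv",
    ], check=True)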


And how many people will have that? Eventually they'll just go "eat ze bug" and you'll have to eat the shit they give you.


