I hate posting about Pixar stories because the Steve Jobs omerta still lives strong in me... but what the hell.
I couldn't watch the video because I'm on a train - however, I worked at Pixar from 2002 to 2008, so I'm pretty sure I know exactly what this video is about.
This story was legendary inside the studio - I never thought it would get outside the gates. In some ways, it was a good thing this happened back when the assets for a film might actually fit on a single workstation.
Everyone was more careful. The Systems group was great at backups, recovery, and did an amazing job the whole time I was there, and I'm sure this story always sat in the back of their minds as a reminder of why they spent all the extra time and money.
The one time I accidentally locally deleted a day's work, I had a recovery in my home directory in less than an hour.
Have you read The Pixar Touch? If so, what did you think? The feeling I got from the book was that Steve Jobs was really bad when it came to the vision for Pixar. He had zero faith in their movie-making abilities, but the Pixar team managed to 'wag the dog' long enough to actually produce movies and validate their reason for being to the rest of the world.
The guy wrote million-dollar checks every year for 10 years to keep that company running. If that's not a big leap of faith (losing money for a decade?!), then what do you consider faith?
The vibe I got from the book - which is why I asked for the OP's opinion - was that Steve Jobs did not believe in their movie business; his vision was that Pixar would market computers capable of producing Pixar-level graphics to ordinary consumers. He didn't see Pixar as a movie studio.
But the folks at Pixar wanted to make movies. So they convinced him that by doing work on movies and commercials, they could advertise the capabilities of their tools.
Eventually, they were able to buy enough time to get their work recognized and the Toy Story deal going, which validated their vision and won Steve Jobs over to the movie-studio idea.
I'm curious which version of Toy Story 2 this happened to, as the film was originally slated as a straight-to-DVD release.
When it turned out Disney wanted to do a theatrical release, there was a mad rush: the first iteration of the movie wasn't that great, and Pixar had to pull John Lasseter off his break to "fix the film".
(There's an interesting story about how they basically re-storyboarded the film in a weekend with Sharpies.)
The rest, as they say, is history. The second film was more successful than the first at the box office.
EDIT - thought I'd share a source: The Pixar Story. You can watch the relevant part here.
There's nothing like that sinking feeling after an accidental delete, a disk crash, or otherwise losing data. It's scary even if you're running backups. Then finding out the backups are bad is just about the worst feeling there is. After that, it becomes a treasure hunt through every hard drive in every computer you've ever owned.
Not fun at all. The kind of pain that brings a grown man to tears.
Indeed. It's one thing to have backups and a backup system, but you need enough contingency-planning forethought to test that your backups work on a regular basis - ideally by restoring to a secondary site, so you're covered in the case of a catastrophic failure at your primary site (e.g. fire, earthquake).
Backups not working when you need them is very, very common.
I've heard about it, experienced it (for a client), and read about it so many times. In fact, someone once told me that pretty much all backups are useless unless you test them frequently (not just once!).
There are lots of tools that do backups - not so many that check if the backup actually works. This might be a business opportunity for someone.
It's not enough to check that the files were written successfully - you also have to check that you actually backed up the files you need. And that you didn't miss any.
Something like a full restore with unit tests to make sure you actually have the data you care about.
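As a rough illustration - borrowing the robocopy approach mentioned further down the thread - a restore check can be as simple as restoring to a scratch folder and diffing it against the live source. Everything below (the paths, the restore step) is a hypothetical sketch, not a drop-in tool:

    @echo off
    rem Sketch only: restore last night's backup to a scratch folder with
    rem your backup tool, then use robocopy's list-only mode to report any
    rem files that differ between the live source and the restored copy.
    set SRC=D:\LiveData
    set RESTORED=E:\RestoreTest

    rem /L = list only (copies nothing), /E = recurse, /NJH /NJS /NDL /NP
    rem keep the log down to just the mismatched files.
    robocopy "%SRC%" "%RESTORED%" /E /L /NJH /NJS /NDL /NP /LOG:restore_diff.log

    rem If the log lists anything, the restore is missing files or stale.
    for %%F in (restore_diff.log) do if %%~zF GTR 0 echo WARNING: restore differs from source!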
The best pattern I've seen for guaranteeing backup availability is for upgrades/deployments to occur on the hot failover first: swap the hot failover to primary, move the old primary to hot-failover status, then upgrade what was previously the primary. Combined with your general data backups (database dump to tape, etc.), this gives you much more resilience should something go badly. However, it doesn't take the place of regularly testing your data backups, and it only works if you have a full failover environment.
The best backups are the ones you take home at the end of the day. Our IT team is quite small and we don't have Pixar money, but we can afford a few USB drives. The important stuff needed to keep the company running (tax records, receipts, etc.) gets copied to a folder. It's not exactly terabytes of wireframes and RenderMan files - a few thousand office documents take up 1GB at most.
We tried backup software, but each one stored everything in a proprietary format and we had no way of checking whether the files were valid. And testing a restore was way too much work. USB drives work because I can see the files and open the files, and that's good enough for me. KISS.
If all you have is 1GB-100GB, put it in Dropbox connected to 2-3 computers and forget about it. If you have 1TB or more of data, look into Open-E SANs. They are very competitively priced.
I am vehemently against take-home backups. You may trust your team members, but they can still be robbed, have their cars or homes broken into, or simply lose the drives. While the chance of a smart criminal exploiting your data for competitive advantage is very low, the chance of your private information being sold to others is very high - someone can grab your customer CC#s, employee SSNs, etc. very easily. How often do we hear about an employee losing a laptop or HDD containing 2 million CC#s and SSNs in an Excel or Access file? Please don't set yourself up for that.
With respect to backup software, my favorite way to back up on Windows is robocopy. I've been using it for 4+ years and it has worked flawlessly EVERY SINGLE time for me. I have very simple DOS scripts that mirror the entire source folder to a destination\YYYY-MM folder. Since I only have about 300GB of data and 8TB of storage, I also mirror the source to a destination\weekday folder. This gives me the ability to retrieve files from any day in the last 7 days, or from any past month (depending on available disk space) - and destination\YYYY-MM always has last night's data.
If you have databases, dump the DBs once a night into one of the SrcFolders - now you have pretty good backups of your DB too. HDD space is so cheap; I have 3 separate DestServers that I back up to. After the initial backup (and once a month), each nightly run is pretty fast. If you want remote backups but don't have the bandwidth to copy 20GB a night, Open-E SANs work great for that. Of course, you can also use DeltaCopy (which is just rsync for Windows).
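For anyone curious, the nightly script can be tiny. Here's a rough sketch of the rotation described above - the server names, paths, and DB dump command are all placeholder stand-ins, and the %date% substring offsets depend on your locale:

    @echo off
    rem NOTE: %date% parsing is locale-dependent; adjust the offsets below
    rem (this assumes the US "Day MM/DD/YYYY" format).
    set MONTH=%date:~10,4%-%date:~4,2%
    set DAY=%date:~0,3%

    rem Dump the database into the source tree first so it gets mirrored too,
    rem e.g.: mysqldump -u backup -pSECRET mydb > D:\SrcFolders\db\mydb.sql

    rem /MIR mirrors the whole tree (including deletions); /R and /W keep
    rem robocopy from retrying a locked file forever.
    robocopy D:\SrcFolders \\DestServer1\Backups\%MONTH% /MIR /R:2 /W:5
    robocopy D:\SrcFolders \\DestServer1\Backups\%DAY% /MIR /R:2 /W:5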
If you are keeping customer CC#s in the first place, you are already violating a ton of laws and credit card agreements.
Our clients are big companies; we don't have or use credit card machines - strictly purchase orders and 5-figure checks each month. We're in another country, and SSNs aren't used here the way they are in the US (tied to your bank, credit, retirement, etc.). The most someone could do with an employee ID is buy products under our tax ID (we'll happily take the VAT refund for you) or use your SSN to apply for a job (which can be a headache around tax time, but it's not the end of the world).
We do have 2 NASes (one backs up the other) that do dailies. The USB drives are for office-on-fire or big-theft tragedies. I'm trying to get us onto cloud storage, but pushing data over a DSL connection was frustrating for just 20 gigs - and the connection died over a weekend. It's far easier to just do the USB route. You still can't beat the bandwidth of a local HNer living 10 minutes away with a bag of drives.
I second this one. If you have personal data on people, make sure backups are encrypted. Also, if you are dealing with medical records of any type, make sure you comply with the law.
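Sticking with the simple-script theme from elsewhere in the thread, one low-effort way to do that on Windows is to wrap the take-home copy in an encrypted archive before it touches the USB drive. This sketch assumes 7-Zip is installed; the paths are made up:

    @echo off
    rem -t7z with a password encrypts with AES-256; -mhe=on also hides the
    rem file names inside the archive. Use a real passphrase, obviously.
    7z a -t7z -mhe=on -pREPLACE_WITH_PASSPHRASE E:\takehome.7z D:\SrcFolders\*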
In the 90s, I was working on a government research grant (a social program) at a community college, and we only synced data with DC once a month (think dial-up). We rotated backups across three places: a bank lock box, a safe at the IT person's house, and a safe in my apartment 2 hours away. We were in a 100-year-old building, and we were kind of worried.
The government site visitor, who was running one of the other programs, "wrote us up" for spending too much time and effort on backups. We protested and got ourselves cleared to continue. Eight months later, I was told that one of the sites had lost all their data in a flood. I guess keeping all your data onsite in the basement doesn't really work.
It would be nice to see this in HTML5 HD as advertised. Kind of off-topic, but did anyone else try playing this with HTML5 video in the Chrome dev build for Linux? The controls keep disappearing before I can adjust them...
So... they actually allow personnel to make a copy of a not-yet-released movie for personal use? And then they wonder how these things end up on various download sites?
From the sound of it, this was a pretty high-up person. If everyone were allowed to do this, why would they be so concerned about this one person's copy? Wouldn't there be a lot of copies?
It's not Ebola, and these are all people who worked on Toy Story. John Carmack isn't going to leak Doom 4. I'm pretty sure he can be trusted with that one.
Why would you give away your life's work to some thieving bastard on the internet who you'd never met and didn't care about you, anyway?
If you were a junior programmer at id, would you torrent your work? I doubt it. Leaks nearly always come from outside the company, meaning copies sent to reviewers etc.
If you have a guy working for you who would leak something you're working on, the solution isn't to not allow people to take things home, it's to fire people who would harm your company.
Even if it's un-rendered, we're still talking TBs of meshes, scenes, textures, mattes, etc. That part didn't make sense to me. Maybe it was in the storyboard phase.