The Consumer Electronics Hall Of Fame: Sony Trinitron (2018) (ieee.org)
123 points by proxybop on Dec 4, 2019 | hide | past | favorite | 118 comments


There's a whole community of "CRT enthusiasts" who search these out on Craigslist now, and buy them for retro video gaming setups. A big Trinitron CRT 27 to 36 inches in size, in good condition, is actually appreciating in market value now due to how many of them have been trashed.

Sony PVMs (professional video monitors), such as those used in TV production studios, have also been increasing in market value. Similar tubes, but with more advanced electronics.

To the best of my knowledge there are zero remaining manufacturers of CRTs in the world.


Believe it or not, our only TV set is from 1985 (a Blaupunkt Scout T8 [1]) and we use it daily :-) We didn't have a TV by choice for the last several years and just used a laptop when we wanted to watch a movie, but as I got more and more interested in stuff from the 80s, I started looking for an old TV as a fun project: try to connect HDMI to it, play NES, and so on. Long story short (because I could talk a lot about it), the TV has somehow stayed with us for 2 years now and we don't plan to get rid of it anytime soon. It has a beautiful form factor, the image and sound quality are surprisingly good, and we simply don't feel the need to change it for a modern one.

[1] http://www.hifi-archiv.info/Blaupunkt/1984-85/17.jpg


Visual neuroscientists too!

CRTs update in a nice, predictable order, can have crazy-high refresh rates, and don't do anything clever (or slow) with their inputs. This makes them perfect for experiments that require showing stimuli precisely and promptly.

More recent displays do all kinds of weird trickery that either changes what's actually shown (e.g., motion blur) or presents it at some hard-to-determine time.
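
To make the timing concrete, here's a minimal sketch (my own illustration, with assumed numbers, not anything from a real experiment): on a raster CRT, a pixel on scanline y lights up at a fixed, computable offset after vsync.

    # Minimal sketch of CRT raster timing (assumed, illustrative numbers).
    # A pixel on scanline y is drawn at a fixed offset into every frame,
    # which is what makes stimulus onset times so easy to pin down.
    REFRESH_HZ = 160        # high-end CRTs could refresh well above 100 Hz
    TOTAL_LINES = 625       # visible lines plus vertical blanking (assumed)

    frame_period_ms = 1000.0 / REFRESH_HZ

    def scanline_onset_ms(y):
        """Milliseconds after vsync at which scanline y starts being drawn."""
        return frame_period_ms * y / TOTAL_LINES

    for y in (0, 300, 599):
        print(f"line {y:3d}: drawn ~{scanline_onset_ms(y):.2f} ms after vsync")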


It's probably just a limitation of LCDs. I remember using a laptop in the 90s, and the response time was so bad that if you moved the mouse, the pointer was not visible (the pointer-trails effect in early Windows helped a bit with this; I've also wondered if maybe that was the reason it existed).

Since then, LCD technology has made huge progress, but maybe it still can't compare with CRTs.

OLED is orders of magnitude faster than LCD; I'm wondering how those compare.


Nah, our displays aren't that old.

Many models now include "smart" features that do things like motion blur, contrast adjustment, etc that probably do make the picture look better (i.e., "more cinematic") when used normally. These features just clash with niche uses, like vision research and competitive gaming, that would prefer pixel-accurate, predictable updates.

It's surprisingly hard to figure out which consumer monitors have what features. There are "research" displays that don't have these features, are calibrated, update in a predictable way, etc, but they're eye-wateringly expensive.

In some sense, we were spoiled with CRTs, which were too dumb to do anything but sweep from top to bottom, left to right, over and over. Of course, it was much harder to generate dynamic stimuli back then too, so it's a mixed bag.


Sounds like a DSTN screen, as used on early ThinkPads: https://en.wikipedia.org/wiki/Dual_Scan


I don't remember much, but it might have been a ThinkPad; it had the nub instead of a touchpad or trackpad.


Sounds like the difference between passive and active matrix LCDs. Passive vs. Active screen was a big differentiator in cost and screen quality in early to mid 1990s laptops.

https://en.wikipedia.org/wiki/Passive_matrix_addressing


Seems like there's a reasonable market. I wonder if any companies are still making new CRTs.


VPixx sells a "CRT replacement" monitor that competes in that niche; it also has a bunch of TTL lines and other research-friendly features. It's very nice, but also pricey. I know one of them says "affordable", but that's only by comparison. https://vpixx.com/our-products/crt-replacement/

Otherwise, there are a lot of threads like this where labs characterize stuff they've bought: http://visionscience.com/pipermail/visionlist_visionscience....


For Super Smash Bros. Melee (released 2001), it's an existential issue. Many high-level interactions in the game move so quickly that the lag of playing on modern LCD TVs makes them a non-starter. A 1000-entrant tournament needs enough CRT TVs to support it, so tournament organizers will often keep a cache of CRTs specifically to host Melee tournaments.


I have a friend at work who plays competitively, and I just gave him an old CRT we had lying around as he stocks up for tournaments!


I imagine modern technologies like gsync or freesync have solved that problem for modern games, but there's probably no way to weave that into the pipeline of old arcade hardware.


The problem is the significant input lag most modern TVs have.


The bigger problem is sample-and-hold blur. When you watch a moving object your eyes move smoothly to track it ("smooth pursuit"). If the frame is displayed for more than a brief moment then the image on your retina will be smeared by that eye motion. For sharp looking motion you need either strobing/flicker, or very high frame rate, and many games don't support very high frame rates. Blurbusters has a good overview:

https://www.blurbusters.com/faq/oled-motion-blur/
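
For a rough sense of the numbers, a back-of-the-envelope sketch (mine, with assumed figures): while your eye tracks an object, the retinal smear is roughly the object's speed times the time the frame is held on screen.

    # Back-of-the-envelope sample-and-hold blur (assumed, illustrative numbers).
    # Smear on the retina ~= tracking speed x time the frame is held on screen.
    def blur_px(speed_px_per_s, hold_ms):
        return speed_px_per_s * hold_ms / 1000.0

    speed = 960.0  # a fast pan: half a 1920-pixel-wide screen per second

    for label, hold_ms in [("60 Hz sample-and-hold", 1000 / 60),
                           ("120 Hz sample-and-hold", 1000 / 120),
                           ("~1 ms strobed backlight", 1.0)]:
        print(f"{label:25s}: ~{blur_px(speed, hold_ms):5.1f} px of smear")

At 60 Hz that's about 16 pixels of smear; a ~1 ms strobe cuts it to about one pixel, which is why strobing or very high frame rates are the two ways out.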


I don't get why they don't make strobing LCDs. We have local dimming and LEDs can strobe well past CRT refresh rates, so why not strobe each local dimming region at e.g. 75Hz?


Sony's mid- to high-end LCD TVs have a feature they call "X-Motion Clarity" that pretty much does that -- local dimming zones independently strobing (at 120Hz) depending on picture content/motion. Seems to work well.


VR headsets do this. They're one of the most common applications, both because tracking motion accurately is so important and because they refresh at a higher rate by design; at lower refresh rates the flicker is very noticeable. On an Oculus Go, for example, some games run at only 60 Hz and the backlight flicker becomes quite apparent.


You can strobe at a higher rate than the frame rate; IIRC film projectors would strobe at 72 Hz with a frame rate of 24 fps, flashing each frame three times.


If you do that and the eye is tracking (as in the context, here) then you get ghost images rather than blur, which is still quite distracting. That's the reason that a lot of films avoid mid-speed panning shots.


Many of the gaming LCDs have an option to do that (some incorrectly label it as black frame insertion). It is painful to look at when active so no one uses it.


Search for ULMB or LightBoost.


That is becoming less of a problem than it used to be. There are upscalers that can, with zero lag, take an RGB signal and output something over HDMI that a modern display can use. And there are certain displays (generally computer monitors) that can deliver sub-frame input lag when used with a good upscaler. It's not QUITE the same as playing on a CRT, but it's good enough for speedrunners of many retro games.

Such a setup isn't cheap, but it's far more reliable and space-efficient than a giant CRT.


I recently purchased a RetroTINK-2X, which is an inexpensive FPGA-based line doubler. It works on a line-by-line basis (as opposed to frame-by-frame like most upscalers), so lag is virtually nonexistent. I don't have any experience with the more expensive setups like the OSSC or XRGB, but it works great for my old consoles.
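
To put numbers on the line-by-line vs. frame-by-frame difference, here's a sketch with assumed NTSC-ish timings (not the RetroTINK's actual internals):

    # Added delay from buffering one scanline vs one whole frame
    # (assumed NTSC-ish timings, purely illustrative).
    LINE_RATE_HZ = 15734      # NTSC horizontal scan rate
    FIELD_RATE_HZ = 59.94     # NTSC field rate

    line_time_us = 1e6 / LINE_RATE_HZ
    frame_time_ms = 1e3 / FIELD_RATE_HZ

    print(f"one scanline buffered : ~{line_time_us:.1f} us")   # ~63.6 us
    print(f"one full frame buffered: ~{frame_time_ms:.1f} ms") # ~16.7 ms

Roughly 64 microseconds vs. 17 milliseconds of added delay, which is why line-based devices feel lag-free.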

That said, I still generally prefer to use my PVM CRT monitor for gaming for reasons already discussed in this thread. It really can't be beaten.


Not on modern TVs and monitors. They have less than 2ms latency now.


The vast majority don't, you can get one that does but you have to do your research. The "response time" listed in monitor/tv specs is not the same as the input lag.

TVs are especially bad, in the default mode some of them have ~200ms input lag. "Game Mode" usually brings it down to more like 20ms though.

Edit: did a google, don't know how accurate these are: https://www.rtings.com/tv/tests/inputs/input-lag


https://www.rtings.com/tv/tests/inputs/input-lag

It seems most get under 20ms lag when put into game mode, which I'd think is what matters to most people. At that point, as long as it isn't stacked with other sources of lag (a wireless controller's delay will make a difference), it's imperceptible to me. I am not a professional gamer though, and from what I understand the higher refresh rates that pro gamers use (such as a 240 Hz monitor) measurably improve their performance.


Haha looks like we googled the same thing at the same time.

This seems to just be TVs too - monitors get much better numbers. I find higher input lag to be much less of an issue when using analog sticks vs a mouse.

For me at least the difference between ~2ms and ~10ms input lag is very noticeable when aiming with a mouse, but basically imperceptible with say a wired 360 controller (I'm assuming the 360 controller isn't adding significant input lag itself though).


2ms sounds like maybe OLED.


Also, latency claims are often misleading. For example, the 2ms response time is limited to grey-to-grey transitions.

A 20ms response time still means the signal is dropping every fifth frame in a 60 fps game.


Response time has little to do with latency; it's purely a property of the panel. At 60Hz refresh, an LCD will have a minimum of 16.7ms latency because it buffers an entire frame before displaying. 20ms of latency would mean there is an additional 3.3ms of lag, not that it takes 20ms for each frame (it's already buffering the next frame in those 3.3ms, so the next frame will be displayed 16.7ms later).

Many TVs buffer several frames for video processing in their default mode, which is why they can have 100s of ms of latency when not in game-mode; the response time is identical regardless of which mode it is in though, as the response time is purely a property of the panel.


Are you sure all LCDs buffer frames for a full frame duration before displaying? I haven't heard that before. I was under the impression that they basically start displaying them as soon as the GPU sends the frame - any buffering being done on the GPU.


Thanks for asking. I'm pretty sure I was wrong. At 60Hz refresh rates it takes 16.7ms to send a full frame to the display, so the minimum latency is 16.7ms even with a CRT (Which has no buffering).
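
For anyone else confused by the terms flying around in this thread, a toy breakdown (assumed figures, just to keep "scanout", "processing lag", and "response time" separate):

    # Toy decomposition of display latency (assumed figures, illustrative only).
    scanout_ms = 1000.0 / 60   # time to transmit one 60 Hz frame over the link
    processing_ms = 3.3        # extra buffering/processing inside the display
    response_ms = 2.0          # panel pixel transition (the "2 ms GTG" spec)

    # This is the lag to the LAST line of the frame; earlier lines arrive sooner.
    print(f"scanout {scanout_ms:.1f} + processing {processing_ms:.1f} "
          f"+ response {response_ms:.1f} = "
          f"~{scanout_ms + processing_ms + response_ms:.1f} ms")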


> A 20ms response time still means the signal is dropping every fifth frame in a 60 fps game.

Could you reword this?


That's simply completely false. You are talking about transition times. To my knowledge there are zero monitors with less than one frame input lag, which is more than 4 ms even for the rare 240 Hz screens and around 7 ms for the average 144 Hz gaming screen.

An excellent gaming screen like a PG258Q might have GTG of 1 ms but at the full 240 Hz it still has about 5 ms of input lag.


How about plasma TVs as an alternative?


They are also somewhat popular with enthusiasts (especially the rare 4:3 models) but they frequently have software processing and scaling that adds the lag back.


OLED!

Lag might still be an issue depending on the electronics, but the screens have a roughly 10us response time.


OLEDs have an insanely fast response time, yes, but as a result they are the worst sample-and-hold artifact offenders. An OLED running at 1000 Hz is the holy grail; 120 Hz would be amazing, and 240 Hz would be 99% of the way there. When the source content runs below the display refresh rate, you could add hardware that averages previous frames at the cost of one frame of latency. That might be acceptable if it makes motion appear smoother.

Of course these are things we won’t see for decades.


You can buy OLED TVs with 120 Hz refresh rates running 1080p, and 60 Hz for 4k.

https://www.lg.com/us/experience-tvs/oled-tv/features

https://www.rtings.com/tv/reviews/sony/a9g-oled

Those specs have been available to consumers (at a premium) for a few years now. I suspect 240 Hz is only a few years away, but I admit that is a shot in the dark.


$1000 for a display is still prohibitively expensive for most people. I broke my TV recently and I picked up a $300 43" 4K LCD TV. I consider myself a display buff but I couldn't find $1000 in my budget for a TV.

The panels will get cheaper, but the display links will start to become seriously cost-prohibitive. If you want 10 bpc 4K at 240 Hz you're looking at 60 Gbps. I'm curious when the consumer optical renaissance will hit. 2030?
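
The 60 Gbps figure checks out as a straight uncompressed calculation (ignoring blanking intervals and link overhead):

    # Uncompressed bandwidth for 10-bit 4K at 240 Hz (ignores blanking/overhead).
    width, height = 3840, 2160
    bits_per_pixel = 10 * 3     # 10 bits per channel, RGB
    refresh_hz = 240

    gbps = width * height * bits_per_pixel * refresh_hz / 1e9
    print(f"~{gbps:.1f} Gbps")  # ~59.7 Gbps, so "60 Gbps" is about right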


120Hz 4K is already here, basically.

Not much can output that, but it's here.


Ugh my grandparents gave me a modern 36in Trinitron when I was in college. I ended up leaving it with one of my old roommates when I moved out, mostly because I didn't want to move it again. It felt like it was made out of solid glass, it had to be near 200 pounds (probably exaggerating, but it was heavy and extremely awkward to hold while moving with two people). I'm kinda wishing I still had it now.


Not sure about which model, but 200 pounds is actually a bit light for a 36" CRT.

https://www.cnet.com/products/sony-kv-36fs12-wega-36-crt-tv/

(It's 222 pounds for those who don't want to click the link).

I used to lug around a 27" CRT by myself every time I moved, and it was a giant pain in the ass. It was worth it though for the picture quality.


→ 101kg (I suspect the figure in pounds, 222.66lb, has been converted from this.)

And for that matter, 36" ≅ 91cm, 27" ≅ 69cm, although strangely TVs are still advertised in inches in many otherwise-metric countries.


At first I wasn't sure why you were making this comment, then I realized many HN readers might not be familiar with US-centric measurements.

I guess I'm Americentric.

https://en.wikipedia.org/wiki/Americentrism


If you think that's heavy I've seen people selling 40" CRTs, which I think weigh around 315 lbs


> it had to be near 200 pounds (probably exaggerating, but it was heavy and extremely awkward to hold while moving with two people)

Only as anecdata: I still have, and use daily, a "huge" Sony 32" Trinitron as a TV:

https://news.ycombinator.com/item?id=16060031

That one is 75 kg/165 lbs, so 100 kg or more for a 36" seems just right (as said in the other thread, I have a special table/support for it).

And BTW, the only reason I replaced my good ol' (also Trinitron) 20" computer monitor with a "flat" 22" screen is that, as I age, I see the screen better at a slightly increased distance (i.e., thanks to the reduced thickness I can have the screen some 30-40 cm farther away on my desktop).


> it had to be near 200 pounds (probably exaggerating

You are probably pretty close based on the details here: https://en.wikipedia.org/wiki/FD_Trinitron/WEGA


We have a Sony 36" 4:3 HD Trinitron in the basement for the kids to play old Nintendo and Sega games on. It weighs 238.5 lbs without the stand. I can barely pick it up.

When we moved from a 3rd floor walk-up to our house, one mover used several straps to strap it to his front and carried it down the stairs by himself. He was a beast, as broad as he was tall.


As someone who spent a few years working in home entertainment retail I can confirm that Trinitrons were by far the heaviest CRTs of any given size. They were truly excellent, though.


I wonder if it's possible to wind a new yoke and make a vector monitor out of a Trinitron. That would be the ultimate gaming experience.


The Vectrex was a video game console that came with its own vector monitor built in. It was black and white only and relied on plastic screen overlays to add color to different parts of the screen.

https://en.wikipedia.org/wiki/Vectrex


>retro gaming setups

Do consider the OSSC (Open Source Scan Converter). It works line by line, so it adds virtually no latency to the chain. It is OSHW and FPGA based, getting regular updates.

It doesn't do s-video or composite, but otherwise it is imho much better than the closed-hardware Framemeister, which buffers entire frames and is less flexible.


Hook it up to a RetroTINK-2X in passthrough mode and you'll get lag-free composite and s-video too!


While the RetroTINK is likely the better option for composite and s-video, it's unfortunately not OSHW.

There's hope they'll add support for composite/s-video/RGBI to the OSSC at some point, be it a new revision or a daughterboard. I'll survive until then... so far I haven't needed those inputs, fortunately. If I can mod the source to output RGB, I'll always do that.


> To the best of my knowledge there are zero remaining manufacturers of CRTs in the world.

I have seen plenty of stores in Asia still selling CRTs. They are usually quite small, and I bet they are quickly disappearing, but they are still around.


Also, Duck Hunt only works on CRTs.


The gun relied on the phosphor fade to detect where it was aimed. Since the electron gun was scanning, the timing of the peak light detection would tell you what point was in the gun's focus.


The gun used a photodiode which waited for the inserted frame signal after a shot to see if a valid target was what you were aiming at. (You can see this by emulating Duck Hunt and slowing down the game framerate: there's a bright flash of light over the ducks which the light gun, if aimed properly, will detect, sending the signal back to the console and basically telling it where in the in-progress frame it hit.) You could easily defeat this by pointing the gun at a regular incandescent light bulb; the game would always register a hit at the first valid target in order of how the screen is drawn. If you got both ducks at once, whichever one was closest to the upper-left bounds was the first to go down, and the remaining one would be the second. It's the only way I ever got to the kill levels in Duck Hunt when I was a child.

https://youtu.be/DzIPGpKo3Ag - video of an actual modification of the Zapper for LCD compatibility, with a demo of how the Zapper actually works.
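
A toy sketch of the flash-detection idea described above (hypothetical names and structure, not actual NES code): after a trigger pull, the game flashes a white box over each candidate target for one frame, in screen-draw order, and checks whether the photodiode saw bright light during that frame.

    # Toy sketch of Zapper-style hit detection (hypothetical, not NES code).
    def flash_and_read_photodiode(target):
        """Stand-in for one frame of white flash + a photodiode read."""
        return target.get("lit", False)

    def detect_hit(targets):
        # The first flash the photodiode confirms wins, which is why, with
        # two ducks, the one drawn first (upper-left-most) goes down first.
        for target in targets:
            if flash_and_read_photodiode(target):
                return target
        return None

    ducks = [{"name": "duck A", "lit": True}, {"name": "duck B", "lit": True}]
    print(detect_hit(ducks)["name"])  # -> duck A, first in draw order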


A casual search turns up https://www.thomasnet.com/products/cathode-ray-tubes-crt-885..., with 23 CRT manufacturers as of now.

Are the products they sell different in some important way?


Those look like military/aerospace suppliers, i.e. astronomical prices and very specialised designs. Also, picture tubes are CRTs, but not all CRTs are picture tubes.


Some are specialized, but e.g. https://dotronix.com/ looks like they offer CRT monitors to businesses and would presumably not be insanely expensive.


I don't own a Trinitron, but I do own a huge and heavy CRT for my SNES gaming needs. It was free at a garage sale.


I've been picking up Sony WEGAs that I see on the curb and can usually sell them for $100.

My back is going to make me pay, though, as they are about 160-180 lbs for the 32" models.


> A big Trinitron CRT 27 to 36 inches in size, in good condition, is actually appreciating in market value now due to how many of them have been trashed.

Yeah, not for very much. And certainly not for an amount worth dealing with the "You should take money off that because it doesn't have HDMI. Hurr-durr." crowd on fleabay.

When I see my Sony GDM-FW900 boat anchor actually selling for more than $1000, then I'll believe you.


There's a huge amount of variance in CRTs and the last Trinitrons made are the holy grail in terms of features and display quality.

Specifically the BVM-D24E1WU & BVM-D32E1WU. They usually go for a few thousand dollars un-calibrated, with tube issues, and without the required add-on cards (broadcast Trinitrons take add-on cards for different features, much like PCs) and the master broadcast remote control receiver. The RGB module for these specific models is practically unobtainium, with fewer than 150 produced globally, and everyone looking for these monitors now needs one. They sold for over $20k when they were new twenty years ago.

If you're looking to get one fully loaded and in proper shape, expect to drop $6-8k. And there's only one guy still doing routine service on these who has adequate spare parts & knowledge; he's in Southern California, and he won't ship the best models because of the huge risk of damage.


There are companies which specialize in shipping extremely valuable and fragile stuff. Such as $300,000 core routers for ISPs, in special plywood crates permanently strapped to a pallet. You certainly can ship them, I imagine that there must be some subset of high income crt enthusiast who will pay $2000 for delivery...


Yep, but I've also dealt with some of those companies breaking that super fragile stuff too. Most of the things they ship are insurable and replaceable, but these items can't be insured for their full value and are NOT replaceable. They haven't been manufactured in 17+ years.

It's more a matter of the guy doing the service personally not being comfortable shipping these rare items.

I'm sure you could do it and indemnify him against potential loss/damage, but he goes out of his way to discourage you from using shipping.


I have literally seen that specific Sony model sell for more than $800 on the enthusiast market. Search for sold items on eBay.


Yes, it's up to $999 now. I bought mine in 2013 or 2014 for $900, in excellent condition, and the seller's next one got listed for $1K, so there's not much of an increase.


That's... Insane. I've got 3 Samsung CRTs and a Benq collecting dust for more than a decade. No one gives a flying fuck about them here, could probably get a dozen more for like 10 dollars each...


Oh yeah, almost anything besides that one is way cheaper, even a big 1600x1200 model. For what it's worth, doing the Pantone Color IQ test on that screen was a triviality compared to the same on the Dell U2410, which is pretty easy compared to performing the task on a cheap temporal dithering office LCD in 2014.


No mention in this article of the famous support wires found in Trinitron PC monitors.

These wires were needed in all Trinitron products, but the more exacting demands of high resolution (1024 lines!) meant the support wires were quite visible horizontally, especially on uniform backgrounds like a gray Windows desktop. They divided the screen into three equal areas.

Despite this apparent flaw, Trinitron still had huge market share amongst enthusiast PC owners. Maybe it was brand loyalty, or maybe they really were much better quality than any other CRT out there?

At least as I remember it, if you wanted to take (or be seen to be taking) color reproduction seriously, then you had to have a 21” Trinitron. I guess the black horizontal support wires became just as much a part of brand signaling as the RGB lozenges in the logo. Lots of people I knew had amazing Trinitron CRTs and all they did was edit Word documents!

Kind of like a Leica red dot, or the white spot on Dunhill pipes.

https://cdn.hswstatic.com/gif/q406.jpg


You could tap the monitor and all the strings would move in this wave-like manner. The two horizontal wires were so cool to have. They were the sharpest CRT monitors.


When I worked at Best Buy circa 2000 we got lots of people who tried returning those because of the "lines" on the screen.


There's an interesting video about the history of Trinitron.

https://www.youtube.com/watch?v=0aFhzGEBQlk


I love this guy. He's just wacky enough to be fun to watch but not so much that it's annoying. And his videos have the right amount of technical minutiae: detailed enough that you understand the important bits without having to pause the video for a while to chew on the explanation.


His content is most refreshing because he gives an engineer’s level of understanding. Most “tech personalities” don’t seem to dig very deep into the whys and hows of things.


Yes! I was waiting for a reference to Alec's channel. So good!


Odd that they spend so much time talking about alignment, because all of my Trinitron-based CRT monitors over 20" needed to be adjusted seemingly on a daily basis to converge correctly across the whole screen. This was something I rarely remember doing with the cheap shadow-mask monitors (although frankly those were generally smaller screens running at lower resolutions).

The Trinitron monitors I had were running 1600x1200 or better, and with a small font the convergence drift would give white text a purple shadow/etc. Very 1980s Apple ][, which a lot of people seemed to be OK with, or maybe that's just because they ran much lower resolution TV signals or much larger fonts.


Are they flat screens? Flat as in the glass is not curved of course, not flat as in thin. The flat ones are well known for developing a whole host of convergence/geometry issues that the curved ones either don’t, or take much longer to manifest.


"vertically flat"

Yah, by the mid/late 1990's when I had a pile of them at work/etc they were mostly "flat" screens because that was sort of the default for a midrange+ monitor in that timeframe.


Drifting convergence is usually caused by failing (if old) or poor-quality (if new) capacitors.


My mother-in-law inherited a mid-90s Trinitron TV when her father died in 2005. It's been the TV in their guest room since.

Their flat panel TV died a couple weeks ago and they put that TV in their living room as a stopgap.

The picture tube on that TV is still so good after all this time that they now don't want to bother buying a new TV.


Do they only watch broadcast TV? Even DVDs look far better on my LCD flatscreen (though the TV or player may be upscaling to 720 or 1080) than on the old CRT in my parents' house (admittedly not a Trinitron).


I don't know what kind of nostalgia glasses people wear, but CRT TV quality degraded over time and became noticeably blurry. We had Panasonics, Sonys, JVCs; they all looked like shit eventually. Monitors with degaussing were fixable, but only for a short time.


Most modern enthusiasts replace dried-up capacitors and retune the bias/contrast and focus voltages, which brings them back to stock condition as long as the tube isn't too heavily worn (a rejuvenation will sometimes fix that as well).


Often it's just a case of tweaking the focus knob on the flyback.


Damn... I can't believe I didn't consider that. A thing to remember, thank you!


Every CRT TV I've come across still looks fine. They still use them for Smash Bros tournaments.


Here’s a comment from down the rabbit hole:

>Back in the day I spent an afternoon configuring my old 19" CRT like that. I ended up with settings like 800x600x167Hz, 1024x768x133Hz, 1600x1200x89Hz and 1920x1440x73Hz. Many refresh rates were much higher than the stated documentation, and I ran it for years like that.

I'm amazed that those resolutions and refresh rates were achieved so long ago. LCDs were so thin as to be unstoppable, but they definitely came with trade-offs.


Had a 21" Hitachi Superscan Supreme and needed to do the same. Got it from a gamer who couldn't use it because most games at the time programmed the hardware directly, and thus anything you'd set up with tools like SciTech Display Doctor/UniVBE/UniVesa got overwritten. Whereas I couldn't care less about that, happily enjoying two pages of DIN A4 at original size next to each other, crisp and clear on that high-end monster :)


I used to configure my own non VESA resolutions in the X windows config file.

If the monitor had enough bandwidth you could push it past the standard VESA modes.

I used to run bespoke resolutions like 1600x1200 in the mid 90s on a big Trinitron monitor I bought at a surplus store.

IIRC the only limitation was that both dimensions of the resolution had to be divisible by eight.
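
The divisibility rule aside, the real ceiling was the dot clock. A rough sketch of the pixel-clock math (blanking overhead assumed at 25%, which varies between modes in practice):

    # Rough pixel-clock estimate for custom modelines (blanking overhead
    # assumed at 25%; real modes vary).
    def pixel_clock_mhz(w, h, hz, blanking=1.25):
        return w * h * hz * blanking / 1e6

    for w, h, hz in [(1600, 1200, 85), (1920, 1440, 73)]:
        print(f"{w}x{h}@{hz}Hz needs a ~{pixel_clock_mhz(w, h, hz):.0f} MHz "
              f"pixel clock")

Both the monitor's video bandwidth and the card's RAMDAC rating had to accommodate that clock.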


Honestly, the RAMDAC was more often the limiting factor.


Matrox could do it, as could the later all-in-one Voodoos from 3dfx, and anything with the shiny transparent blue chips from IBM where you could see the die [1]; many better cards with S3 chips had those. No blurriness at all.

[1] https://upload.wikimedia.org/wikipedia/commons/c/cf/IBM_37RG...

edit: link


Growing up in the 80’s and 90’s with Trinitron TVs and Walkmans, seeing the SONY logo even to this date evokes a deep sense of trust, brand loyalty and pleasure in using their products. I think there is tons to learn from companies like SONY, Braun, Apple, IBM (old IBM that is). They innovated relentlessly, put excellence in design as part of the ethos, understood UX/UI, and deeply cared about the customer experience. Trinitron TVs were part of my childhood, a window into the world of imagination...in color :)


>80s/90s

>Deeply cared about the customer experience.

>Cared

And then they started installing rootkits on their customers computers.


I wonder if their change of attitude coincided with them becoming more of an American company rather than Japanese.


The change of attitude may date to when they bought movie companies, around the time the PlayStation came to market. This put them in the middle of the conflict of interest between media producers and media consumers.


Eizo Nanao Trinitron monitors were the absolute best. Example: the EIZO FlexScan T965 with FD Trinitron https://www.cnet.com/products/eizo-flexscan-t965-crt-monitor...

Eizo monitors are still very good and I use them. They are very ergonomic, and their settings, especially the presets, just work.


> and deeply cared about the customer experience

I'm guessing you missed the whole Betamax thing?

That's tongue-in-cheek though. I think the Betamax experience is what motivated them to listen to consumers.


Until their phone-home rootkit CDs[0]. Or is that also “listening to your customers?” /s

[0] https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootk...


Also, the VAIO line of laptops, while stylish, came to me to represent a total lack of support or interest in the customer aside from selling them a cool-looking machine.

Maybe it's related to the root-kit DRM, etc -- on looking back, maybe this reflected a mismatch of a company culture that was stuck in the hardware-focused customer experience (and where careful thought ended with the hardware), while the software was some add-on nuisance that would work itself out.


Some of those VAIO laptops really pushed the envelope, in terms of packing a lot of computer in to a small package. They were great when everything worked, but could be a real pain to repair.


Sony had a golden age in the 80s when they were making quality hardware for reasonable prices. It was unfortunately brief.

By the late 90s, Sony was such a huge company with so many competing interests that I'm surprised they didn't accidentally sue themselves during the digital music fiasco.


I think they were genuinely listening to their customers with Betamax, it's just a situation where the customers didn't use a product the way they said they would.

Of course we customers think image quality is important! Well, turns out, people are pretty cheap and were totally happy with sacrificing image quality of TV recordings in order to maximize the amount they could record on their, like, $20-in-1970s-money tape.


VAIO used to be a symbol of quality and status, as well. Really great laptops, how did Sony let them get so bad... They sold the brand, too.


This only touches on the CRT portion of the TV. I had a 19" Sony from '82. While other manufacturers were trying to save costs with fewer and less heavy-duty parts, Sony could charge a premium because the picture quality was better, and did not skimp on the rest. It's evident when you look at the tuner/RF, power supply, and chassis what a quality product it was. Coming from Soviet TVs that would catch fire when they worked, there was no comparison.


I finally got rid of my 1970's Trinitron TV because LCDs were so much lighter and sharper and didn't warm up the room. I used it for something like 25 years.


Someone paid £1400 for a 32" widescreen Sony CRT with a digital tuner in 2006. By 2009 they eBay'd it to me for £32. A pound an inch. Shows how fast the market collapsed when flatscreens became common. (Good TV; I only stopped using it last year when the lack of HDMI got to be too much.)

Incidentally, why did Europe get a last generation of widescreen CRTs while the USA basically went from humongous 4:3 CRTs straight to wide flatscreens?


There were a few Stateside too. I owned a ~30" widescreen flat CRT (secondhand) for a while before I got a cheap LCD.


And Sony Walkman/Discman! To me Sony in the 80s/90s was like today's Apple. In 2000s I bought a small Sony box that could stream Comcast cable TV from my house to my PSP when I was out and about so that I could watch my own TV while playing poker at my friend's house.


Jobs loved Sony products. I'm sure they were an inspiration to create good products.


Some people in the 1970's converted them into inexpensive color computer graphics monitors.

I don't mean hooking up to the RF or composite video input, but directly connecting to the guts of it to get a crisp bitmapped image.


It's a thing people still do to this day, especially to use with game consoles modified to output RGB directly. https://shmups.system11.org/viewtopic.php?t=56155


Atari's early arcade cabinets used gutted consumer TVs around the same time.


My computer CRT uses the trinitron style of aperture grille. VX920, 19", 1600x1200@75. Still looks great today alongside my newer 1440p144 flatscreen.



