
My favorite quote from Ioannidis:

“Science is a noble endeavor, but it’s also a low-yield endeavor,” he says. “I’m not sure that more than a very small percentage of medical research is ever likely to lead to major improvements in clinical outcomes and quality of life. We should be very comfortable with that fact.”

https://www.theatlantic.com/magazine/archive/2010/11/lies-da...


A competent medicinal chemist can spend a lifetime in drug discovery and never get a candidate molecule out of clinical trials. I learned this from the excellent Derek Lowe, whose blog at http://blogs.sciencemag.org/pipeline/ is on my daily list.

Quote from an interview[1]: "I’ve been doing this for 27 years, and I have never once put a drug into a pharmacy. I tell people: “If you want to know why your prescriptions cost so much, it’s me.” I’ve done nothing but spend money the entire time."

[1] https://www.statnews.com/2016/03/05/derek-lowe-chemist-blogg...


> “If you want to know why your prescriptions cost so much, it’s me.” I’ve done nothing but spend money the entire time.

Sounds nice, but the best evidence we have disagrees: https://news.ohsu.edu/2017/09/11/how-much-does-it-cost-to-br...


That's a widely criticized study because it ignores the cost of failure. It only looks at the R&D investments of companies that received FDA approval for a drug. It's like doing a study of the odds of winning the lottery and only analyzing people who won the lottery.

Considering that 90% of Phase 1 drugs never get approved, that study seems quite biased. It conveniently assumes that all those medicinal chemists, who've worked for decades and never seen a drug approved, simply don't exist.
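
A back-of-the-envelope sketch of that survivorship bias (all numbers assumed, purely to show the shape of the error):

  # Toy numbers, purely illustrative: what ignoring failed candidates does to
  # the apparent cost per approved drug.
  cost_per_candidate = 50e6  # assumed average R&D spend per Phase 1 candidate ($)
  approval_rate = 0.10       # ~90% of Phase 1 drugs never get approved

  winners_only = cost_per_candidate                   # counting lottery winners only
  with_failures = cost_per_candidate / approval_rate  # failures amortized over successes

  print(f"Per approval, winners only:      ${winners_only / 1e6:.0f}M")   # $50M
  print(f"Per approval, counting failures: ${with_failures / 1e6:.0f}M")  # $500M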


Interestingly, this contributes to why drug companies spend so much money on advertising: it has a predictable, measurable return, whereas R&D is virtually a lottery.


What's really crazy is how much money it takes to not make much progress. Modern labs are washing cash down the drain.


Is there a shortcut? I'm sure some improvements can be made but I am skeptical of the idea that science is just doing it wrong.


> Is there a shortcut?

Controversially, I think we need to do less research. The publish-or-perish culture has created a perverse incentive to crank out junk papers. Most working scientists will privately admit that most research isn't actually advancing our understanding of nature, it's just a desperate effort to dredge up something sufficiently novel to publish. Conversely, there's a substantial amount of research that is potentially useful to clinicians, but is languishing unread in some third-tier journal. Most research is never published at all because it supports the null hypothesis, but we can't do good meta-analyses based on a cherry-picked set of studies. We're glutted with data, but we have a remarkable paucity of actionable information getting to the people who need it.

The problem isn't that scientists are doing a bad job, but that the funding mechanisms of science incentivise the wrong kind of work. We should be focusing a far larger proportion of our funding - perhaps even a majority - on replication, meta-analysis and dissemination. Primary research is only one small part of the information architecture of science, but it dominates our spending.

In the case of nutrition, we're spending huge amounts of money on figuring out whether coffee increases or decreases your life expectancy by a fraction of a percent, but almost nothing on behavioural research to figure out how to stop people from gorging themselves to death on food they know to be terrible for them. There's a morbidly obese elephant in the room, but we're preoccupied by the micronutrient-rich mice scuttling around the periphery.


This is the only good reply I've seen so far.

I think a lot of information economies suffer from oversupply. I've heard it said that there are too many books, too much music, too many different open source projects trying to solve the same problems, and so on. It causes information overload on the demand side and perhaps paradoxically increases the odds of something genuinely important being neglected.


> but almost nothing on behavioural research to figure out how to stop people from gorging themselves to death on food they know to be terrible for them.

I'd also say the converse is true. Corporations spend a lot on figuring out how to get you to eat their low-cost, high-profit food that is not good for you. Imagine if we treated junk food the way many countries treat cigarettes: generic white packaging and no advertising.


Junk or not, if someone's daily caloric intake is sky-high, they can overeat on even the least junky of foods: say, a lot of vegetable seeds, super-healthy all-organic bread, some premium straight-from-the-cow dairy, and some fruit, just to get more sugar.


I don't think there's a shortcut. There would have to be a massive change in approach. In my field of expertise, molecular biology, it's common to just run experiments over and over until you get a positive result, throwing away thousands of dollars in gels and other things along the way. This isn't even intellectually honest (it's fishing for significance), but it's what nearly everybody does to get publishable results.
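
To put rough numbers on the "re-run until it works" pattern, here's a purely illustrative sketch (alpha assumed at the conventional 0.05): even a modest number of repeats makes a spurious "significant" result quite likely.

  # Illustrative only: repeated runs of a null experiment inflate false positives.
  # Assumes each run independently yields p < 0.05 with probability alpha.
  alpha = 0.05  # per-run false positive rate

  for runs in (1, 5, 10, 20):
      p_spurious = 1 - (1 - alpha) ** runs
      print(f"{runs:2d} runs -> P(at least one 'significant' result) = {p_spurious:.2f}")
  # 1 -> 0.05, 5 -> 0.23, 10 -> 0.40, 20 -> 0.64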

As a response, I work at a company that is automating biology, with a goal of making much more reliable clean data for machine learning, but even then, it's very expensive. A decent robot arm costs well over $50K (I could build an equivalent for $1-2K, but at a cost of $100K of my time) and all the other equipment is often in the $100K+ range. Just to automate what you could hire a human to do for $75-100K a year!

So yeah, some scientists are doing it wrong, but even the folks doing it right are still wasteful. I think it's endemic to the enterprise, but we could still do better.


How can we find out more about your company?



There's not really a shortcut to the fact that science is difficult.

But more quality science and less garbage science would help. Doing 10 studies, each with a small likelihood of a true result, is less helpful than one high-quality study whose result is almost certainly true.
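
To make that concrete with Bayes' rule (a sketch with assumed numbers, in the spirit of Ioannidis's "Why Most Published Research Findings Are False"): the chance that a "significant" result is actually true depends heavily on study quality.

  # Sketch, assumed numbers: P(hypothesis true | significant result) via Bayes' rule.
  prior = 0.1   # assumed fraction of tested hypotheses that are actually true
  alpha = 0.05  # false positive rate

  def ppv(power):
      true_pos = prior * power          # true hypotheses correctly detected
      false_pos = (1 - prior) * alpha   # false hypotheses passing by chance
      return true_pos / (true_pos + false_pos)

  print(f"Low-powered study (power=0.2):   PPV = {ppv(0.2):.2f}")  # ~0.31
  print(f"High-quality study (power=0.9):  PPV = {ppv(0.9):.2f}")  # ~0.67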


> Is there a shortcut?

A much, much deeper understanding of how humans work.


Facepalm! How could we have forgotten to pick that up before heading down this road of frustratingly hit-or-miss basic research?!


Maybe the real research was the friends...


And how do we acquire that?


When you're looking for something, you always find it in the last place you look.


Summarized by Nietzsche: "He who has a why to live for can bear almost any how."


I think this gets to the heart of it. Some places to start:

Meaning in Work TED talk by a researcher in the field: https://youtu.be/RLFVoEF2RI0?t=177

Happiness vs Meaning: https://dx.doi.org/10.1080/17439760.2013.830764

My favorite survey of the field: https://dx.doi.org/10.1080/17439760.2015.1137623


Actual quote from Linus, as opposed to the clickbait title:

> I still wish we were better at having a standardized desktop that goes across all the distributions. There's been some progress there. I mean, this is not a kernel issue, so this is just more of a personal annoyance how the fragmentation of the different vendors have, I think, held the desktop back a bit. But there has been some progress on that front too with Flatpak and things like that, so I'm still optimistic, but it's been 25 years. It's going to be another few years at least.


Considering what KDE has achieved with Plasma 5, I am really happy. I use Plasma 5 on all my machines that are capable enough to handle it; Gnome 3, consequently, has never found a place. On old machines, I use Xfce only. Xfce is in no way for the masses. Plasma 5 could easily be the desktop environment that everyone loves, if they continue to improve.

I have helped people migrate from Windows to Plasma 5. They loved it. However, getting/installing software and general system management is still not straightforward enough. We also need solid system management tools. No terminal gymnastics at all for the common user on distros targeted at the masses.

Flatpak is all cool, but we need core user-space applications to be standardized. LibreOffice, for example, needs to become the industry-standard office suite. How can we achieve that? People I know just do not want to move away from MS Office. Even government agencies invest heavily in MS Office. This might just be the most important impediment.

I think the Linux ecosystem shows us what a real "free market" looks like: lots of small players and no monopoly. People love choices, and here they have them. You want a minimal window manager? You have many to choose from. You want a full-blown desktop environment? You are spoilt for choice. Not a bad thing in an ideal world, but...


Why does every Linux-related topic need to end in a KDE vs. Gnome discussion? The problem is not DE fragmentation but packaging.

Every distribution spends more time on packaging and integration testing than on developing useful functionality. When developing for Linux, you need to package your application in a few different formats. We also have different init systems to make it more difficult. Every distro has its own store and repositories. For GUIs there are different frameworks and display servers.

If the Linux world standardizes around snap/flatpak, systemd, and Wayland, then we stand a chance.


First impression is the...

The DE being the first thing a common user interfaces with, it holds a very important position, in my opinion. KDE and Gnome are very different w.r.t. design principles, hence the polarization and debate. KDE and Xfce are not so different. I like Gnome, but have no use for it.

I agree with all else.

Also, I think a real "free market" would resemble the Linux ecosystem: no monopoly, small players, and lots of choices. However, if we wish to compete in the "real world", we need to do something similar to what you suggest.


Linux has hundreds of DEs to suit all tastes and styles, yet it has always languished in a distant third place.

Consider the possibility that your idea of what "the common user" wants is something you made up.


Most non-technical users aren't going to remain interested long enough to try even a few desktop environments. The first one they use will leave a strong impression.

I would argue that having a choice between hundreds of DEs is a bad thing for the more casual user.


Please stop this. "Casual user", "average user", "normal user", it's all just so much bullshit people tell themselves to feel superior. Stop creating this strawman and using it as an excuse for why things have to keep sucking!


ChromeOS and Android adoption prove otherwise.

The large majority of their users don't even know what kernel is running the runtimes of their apps.

One of the reasons for the netbooks' failure, besides MS's move to offer cheap Windows XP licenses, was that each netbook's Linux was its own little silo.


If we standardize, we also have a central point of failure and a lack of variety/choice for users. At that point, in what way would using a GNU/Linux system be different from using an OSX or Windows system?


Because discussions are good? :-P

The big players tend to be compared, isn't that reasonable?


I bet Linux for the desktop will actually take off when Microsoft ports Chromium-based Edge and the Office apps to Linux for specialty users, primarily developers. I mean, if you want developer mindshare, and improving or replacing cmd.exe apparently will never happen... it's not like these proprietary apps couldn't be shipped for Linux to run at least as well as they do on an Android phone...


Those proprietary apps on Android have two things working for them that GNU/Linux lacks.

In spite of piracy, a large share of the population actually pays for them, instead of bashing the authors for daring to charge money for their work while suggesting free alternatives.

A full-stack framework that, in spite of OEM customization, is relatively standard across all devices with an Android label on them.


> there has been some progress on that front too with Flatpak

Don’t worry Linus, with Snaps and AppImages we’re hard at work fragmenting that front too!


Hahaha YMMD! :-D


Isn't Flatpak like npm, where it just flatly stores all the libraries? So it's a huge waste of space?


There's been a continual oscillation between bring-your-dependencies (ranging from statically compiled executables, to composed things like docker containers, snaps, npm / ruby / php composer apps, etc.) and use-the-systemwide-stuff (like most linux apps).

The tradeoff is pretty obvious: on the one hand you get full control over what code you bring in, but you also get full responsibility of keeping it up to date and it does indeed take up more space. On the other hand you get systemwide updates, but also unpredictability which can break things.

In the former, you have a lot of work to do yourself to maintain your ecosystem between releases. In the latter, a greater amount of effort and responsibility goes onto those who maintain the packages at a distribution level to ensure mutually compatible package selections. This can be done (viz. most of the top tier distributions successfully doing this with occasional problems for decades) but we keep trying new all-in-one mechanisms every few years.

I'm not convinced that there's ever going to be a final answer to this, but it is clear that storage is cheap as chips...


I don't know, I think the answer is pretty obvious: use system-wide shared libraries for things that are very common, like widget toolkits, network libs, cryptography, and other system components. Otherwise it is part of the application, not the system, and should be with the application.

The reason you don't see this on Linux is because there is no such thing as a separation between 'system' and 'application' in its culture. Consequently there has never been a "base system" to target or keep compatibility with so applications have to either target a specific version of a specific distro (waste of time) or include everything above the stable kernel ABI in their product (waste of space).

It is actually a very simple problem to solve, it just isn't one that the Linux community is interested in solving simply, so instead they invent ridiculously complicated tooling like package managers and Flatpak, introducing a bunch of unnecessary limitations and yet more parts to break and ruin your day.


Was cheap as chips. The age of flash has dawned.


Still cheap as chips. 1TB name brand flash drives can be had for under $200.


AFAIK, Flatpak applications can share libraries via runtimes. If you instead mean the host system's libraries, then yes, I think they import everything from libc on up.

Edit: clarification.


I tried to use Flatpak for everything, and because various Flatpak apps want their own version of Gnome, I ended up with several versions of the Gnome runtime installed on my system, each consuming several hundred megs of disk space. At that point I decided to screw it and use Flatpak only if the native repos didn't have an app I needed.


He is right about that. There are a few desktop environments that are quite good: KDE (Plasma), Gnome (with enough option tuning & customizing), and maybe Pantheon (see elementary OS).

We'd need something more smoove and better over-all integration because there is still too many shortcomings and tiny bugs that could annoy a day-to-day user that doesn't want to know anything about config files or extended menus with special settings. It should be a great thing out of the box with sane defaults. That includes the icon set as well as a decent file browser and terminal emulator experience (iTerm2 on mac is THE REFERENCE here, I wouldn't want anything less).

Next come workflows like office/graphics/audio stuff. I like how OSX handles PDFs etc. and would love to see that on Linux as well.

But I'm afraid this scattering is one of the biggest enemies of the adoption and maturity of open source software. There are so many OSes and tools that you often have to research for hours before you start off with a shitty tool that gets the job half done, and then you realize you have to start from scratch because it doesn't work as you expected.

Instead, many open source projects could live up to their potential if they combined efforts to merge the best of what they did and create ONE super awesome tool.

The reality is that very often projects are just abandoned because adoption/donations etc. are too low, or the 2 maintainers are tired after years of working on a project only 100 ppl on the globe are using.

Unite! At least converge to 2 big streams: Pro-Users (the ppl making jokes about shellscripts) and People (your mom)

Just focus on delivering for these 2 groups, and I think most ppl would be glad about the end product. I myself will try to do my best in putting effort into projects I hope will have the biggest adoption.


> Unite! At least converge to 2 big streams: Pro-Users (the ppl making jokes about shellscripts) and People (your mom)

I couldn't disagree more. As a community, developers, and Linux developers in particular, need to stop believing in the divide between the Glorious Developer Master Race and Ignorant Users What Think Their CD Tray Is A Cup Holder.

Linux Desktop has been pandering to this imagined notion of an "average user" for decades and it still sucks.

Here's what needs to be done: make the system simple (not easy) enough that it can be reasoned about, something that someone can look at, comprehend, and build a model of how it functions, so they can bend it to their needs without reading C source code, 40 different text-based file formats, and out-of-date man pages. Stop being a Rube Goldberg machine of interconnected disparate components.


I've had a MacBook Pro at work since last year; OS X is slow as hell, and iTerm2 (my main app) is very sluggish when you are used to Konsole (the KDE terminal). But yes, the laptop is pretty.

I can't wait to be able to trade my laptop in for an X1 Carbon and a speedy KDE.


That is strange, but I've heard about Apple products and software getting worse every day (users seem to have become beta-testers).

I have an old MBP (the one that has all the ports you'd want, including standard HDMI; don't know what year) and never upgraded beyond OS X Mavericks. It still runs fine today.

A friend told me they use new MBPs at work, and they seem to break much too often. I also tried the keyboard on his work machine to find out how bad it is nowadays! I prefer the old-school ThinkPad keyboard, but was pleasantly surprised by the keyboard on the old MBP. The new ones, on the other hand, are just a tool for torturing users who type a lot.

iTerm2 was my favourite because of the features (like intuitive tmux usage without learning its CLI-foo). I also like how it is integrated with the rest of the desktop (searching, copy & paste, URL-clicking, etc.), but I can't really compare it to the default terminal emulator, since I haven't used that much after trying iTerm2 when the whole world recommended it.


Have you considered just using the OSX Terminal?

It's been very good since at least OS X 10.5 (when they added tabs) or so.

I always wonder why people use other terminals (but don't seem to use any of their special features).

I'd be curious to see if you think it performs better.


> There are a few desktop environments that are quite good ... Unite!

This is happening. The web is becoming the de-facto desktop environment.


Oh god no. It's one step forward, ten steps back.

Take a trivial example of desktop application integration that has been with us since Windows 3.11: the common file and print dialogs. There is nothing standard like that on the web. Most web applications won't even let you export your data in an editable format.

UI: horrible. Desktops have had standard menus and shortcuts since the 1980s. In fact, many SPAs are trying to invent their own desktop.

I could go on... the fragmentation of web apps is so much larger than the fragmentation of desktop apps.


Funny story.

So a few weeks ago, I'm sitting in the waiting room at my doctor's offices. And they seem to be running late. And the waiting room is filling up. So I go to the receptionist's window, and ask what's up.

Turns out that their ISP was having problems. And they were totally dead in the water. They had no local backup. And knew nothing. No patient data. No schedule. Nothing.

So hey, I just left, and rescheduled. But damn.


Thing is, the Web as the de-facto desktop environment is one less reason to even care about Linux's existence.

A browser could run on bare metal for all I care.


How exactly do you think you'll get your drivers working?

What do you do when the internet is down or your app is down?

I like the data-sync & backup part of web apps: you don't have to save, etc., and your whole machine can fall into a river and you still have your work. But I'd rather do this with a separate solution and keep stuff on my PC so I can access it without a network.


The same way that webOS, ChromeOS, Firefox OS, and a couple of other attempts do.

By providing a minimal kernel for juggling browser instances, local file system and browser specific "native" APIs.

Welcome back to the timesharing days, just with prettier UIs.


Um...well...can't say I'm overjoyed about that - the Good Old Times, whenever invoked, were rarely good if you happened to actually live in them^$#%$#NO CARRIER


I should have added a sarcasm disclaimer.

It is, however, the way things are going; even most mobile apps are modern versions of those 2-tier apps from the 90s.


I don't know... Besides big, tech-savvy companies like the big G, no one seems to be able to create a decent UI with the consistent UX you'd expect from a good desktop environment.

I prefer using apps with ncurses interfaces to many web apps. Also, the standards are not good enough for my demands. I've worked in web development, and I know how tedious it is to reinvent the wheel for each and every user (at least it feels that way), because you have to ensure everything renders and works on 10 different browsers (in different versions, of course, because who wants to update?). And then there are the devices (I don't even want to think about that), etc...

It is nearly 2k19, and centering stuff with CSS still seems to be a huge deal. That's just not very reassuring to me.


Yes, he essentially says we're on our way there but have a few more years to go. The headline is hyperbolic clickbait.


It's a short(ish) and somewhat thought-provoking read, although it has the questionable value of most analogies, and is light on details; however, I enjoyed this line:

> One of my biggest personal fears is working in the wrong field to achieve the goal I care about.


I know nothing about Citadel, but it may be unfair to generalize from one experience. For example, that particular group may be bad, but other groups may be fine and not know about the bad group.

If you're feeling particularly bold, the ideal is to report that bad group to Citadel. Then, after sufficient time for an internal review, follow up and ask if anything has been done, and if not, then conclude your generalization.

With that said, I wouldn't be surprised if our Bayesian priors about the morality of financial institutions are correct, but it helps to empirically validate this and publish findings so that, over time, these institutions are validly, publicly shamed.



I hear ya. I've been a programmer for over a decade and even though I'm making great money and it's intellectually stimulating, I'm essentially giving it all up and diving into biotech (studying now for an M.S. in Bio).

The headwinds are persistent: credentialism, and a glut of PhDs who are way more qualified than me. I'm constantly meeting bio people going the other way, into tech.

Despite that, I'm stubbornly going to keep going because like you say, the potential of Biotech to reduce human suffering seems so much greater.

I worry about becoming pigeonholed by my background, or falling into the many pitfalls you point out like screwed up incentives or narrow funding paths. More than anything, I want to work on something important and meaningful, like a disease.

Roughly speaking, my current idea is to learn the basics which I'm in the process of doing now [1], get basic credentials, and then find data on causes of suffering, take the first derivative, sort descending, and go. I know it's naive, but it seems like the right thing to do.

[1] https://freeradical13.github.io/
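
For what it's worth, here's that "first derivative, sort descending" plan as code, with hypothetical burden numbers just to make the idea concrete:

  # Hypothetical data: disease burden (millions of DALYs) in two successive years.
  burden = {
      "heart disease": (160, 165),
      "diabetes":      (60, 70),
      "malaria":       (55, 50),
  }
  # First derivative = year-over-year change; sort descending; go.
  ranked = sorted(burden, key=lambda d: burden[d][1] - burden[d][0], reverse=True)
  print(ranked)  # ['diabetes', 'heart disease', 'malaria'] -> start with diabetes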


WINAMP, WINamp, winamp... it really...

Still works. Winamp is still my main music player (with classic skin, of course). I run it through Wine on Fedora 28. Some things cause crashes, but the only things I really care about are MilkDrop, the media library, and playlists and they all work fine. I've never been able to find something that comes close to MilkDrop.

Here are my installation instructions (after installing Wine) although I haven't tried them fresh in a few years (Fedora and Wine upgrades haven't screwed anything up):

  $ wget https://raw.githubusercontent.com/Winetricks/winetricks/master/src/winetricks
  $ chmod +x winetricks
  $ ./winetricks -q directmusic directplay directx9 gdiplus ie8 mfc42 wmp10 windowmanagerdecorated=n
  $ ./winetricks winamp
  # Launch Winamp.
  # To fix some weird font issues: Right click Winamp > Options > Preferences... > General Preferences > Playlist > select "Use font: MS Sans Serif"
  # To fix https://bugs.winehq.org/show_bug.cgi?id=12060: Winamp > Options > Preferences... > Plug-ins > Visualization > MilkDrop v2.25c > Configure > WINDOWED settings > uncheck "Integrate with winamp skin"


You can get MilkDrop 2 on Kodi, which turned out to be a just fine replacement for winamp once you strip all the other nonsense away.


I used to do this too, but it eats up a lot of CPU for just a music player. For music on Linux, I've moved on to cmus.


If we suppose that his conclusion is using boolean logic, then what you're saying is a strawman because of his last claim; namely, protobufs are bad if "[...] && !Google":

> They're clearly written by amateurs, unbelievably ad-hoc, mired in gotchas, tricky to compile, and solve a problem that nobody but Google really has.

This dovetails with other arguments I've seen recently, which are becoming more frequent:

Have we entered a new world where the lessons of companies working at massive scales are not only generally superfluous for smaller scales, but are actively harmful?


> Have we entered a new world where the lessons of companies working at massive scales are not only generally superfluous for smaller scales, but are actively harmful?

I think so, yeah.

Microservices turn out to have a lot of negative consequences, and their positives work best when you have dozens or hundreds of developers. If you've got a handful of developers... not so great.


Your argument really depends on the use case. If you have discrete, well-factored operations, microservices can make even small systems easier to deploy and manage. For example, you do the front-end API in Java (easier to build secure, debuggable systems with good RDBMS access) and the backend analytic services in Python (easier to scrape data out of XML/JSON). Splitting them into two or more microservices can simplify development, CI/CD, and deployment.

Whenever I see large numbers of microservices anywhere, my null hypothesis is that some organizational dysfunction is leading teams to factor applications into unnecessarily small pieces.


No. But you have to examine use cases carefully, including the assumptions. I think this has always been the case, but people (in my experience at least) get a little dazzled by the massive scale of companies like Facebook and try to apply their solutions to problems for which they are simply not applicable.

