> I hate and I love. Why I should do this, perhaps you may ask... I know not, but I feel it done to me, and I am wracked.
Our multidimensional beings are being assailed by at least three or four other intelligent agents that are able to pose as our own thoughts and feelings. That is the Sufi explanation for the war that is incessantly going on in our headspace and heartspace.
The spiritual path of love is a process of literally "enlightening" our soul's heart so that we are no longer susceptible to their influence via those signals/messages.
We can think our own thoughts, but to do so we must fend off the foreign invaders, and they are relentless; that's part of what makes concentration so difficult. Their success at convincing people to be selfish instead of compassionate is obvious to anyone seeing the trajectory of all the Earth's peoples, cultures, societies, and religions.
Our ignorance of the fact that this is our situation is why we are fighting each other instead of fighting that coordinated onslaught within ourselves.
"Shaytan (the devil's Djinns) see us from where we see it not." --Qur'an
This is all part of the nature of our 3-space having multiple overlapping vibrational dimensions. (That is the import of the needlessly belligerent conversation people had with me yesterday on a physics article -- https://news.ycombinator.com/item?id=43603256)
"The greatest trick the devil ever pulled was convincing the world that it didn't exist." --The Usual Suspects
Most people will violently deny that this is our baseline human situation, because they believe that internal denying and lying force that sits at the heart of this most insidious and pervasive truth in the history of mankind.
Compassion is its enemy; it lives to create strife and misery for human beings, because it hates us and is the source of hate, hatred being its stock in trade. It seeks to destroy, and to create lies, brutality, and the selfishness of our ego.
"There is nothing more important than compassion, and the truth is its only equal."
> Our multidimensional beings are being assailed by at least three or four other intelligent agents that are able to pose as our own thoughts and feelings.
Contrary opinion - no they're not, get a hold of yourself.
In the confession of The Golden State Killer, he said that he would feel a force enter his being and do the raping and murdering. He also said that when he got older he was strong enough to resist it.
A lot of people lob ad homs at me and call me names and deny what I say here, but not a single one of you can explain our tragic human situation.
We can engineer fantastic buildings, create astounding works of art, perform the most incredible feats on the soccer pitch, and yet racism, poverty, cruelty, child porn and sexual abuse, oppression, and hatred remain rampant.
From my perspective of compassion, without asking anything from anyone here, I explain our situation to boos and unhelpful naysayers.
I am quite ahold of myself, my family loves me and I am at peace and happy. Yesterday my antics on the soccer pitch made my family laugh until they ached. We are poor but have our sustenance and live within our Creator's love.
As Eugene Parker said, "Well, we'll see who falls flat." The Parker Solar Probe is now orbiting the sun, doing its science, a marvel of engineering. And an evil, hateful bastard put a bullet in the servant of love Dr. Martin Luther King, Jr.'s head 57 years ago, simply because he claimed that Black folks were human beings.
The truth is undefeatable, though we can be killed by the hateful fools of this world. I stand with compassion and truth.
> In the confession of The Golden State Killer, he said that he would feel a force enter his being and do the raping and murdering.
In a modern mental health context, we generally call this mental illness.
Does describing it as some "other intelligent agent" have any meaningful explanatory power? People similarly talk about devils and angels, but such talk hasn't led to effective ways of dealing with such issues.
> not a single one of you can explain our tragic human situation.
We're evolved animals, and far from perfectly rational. Do you see some mystery needing explanation? The existence of good and bad impulses and behavior is hardly some sort of mystery.
Indeed. And congrats on your daughter's craftiness and how it intersects with math.
Our daughter is not so much into the pure math side but loves to do amigurumi, which is really applied 3D modelling. A craft show she wants to do later in the year doesn't allow the use of other people's models, so she is having to design her own. It's so very impressive, and she gets so much joy from seeing kids really, really want her work, as they do. It's math, modelling, color matching design, and understanding the kinds of threads all rolled up into one, so to speak :-)
All his ideas are fantastic, and are obviously the result of long experience in a seasoned and highly successful project. He is sharing techniques that simply work for large, complex codebases. Ignore them at your peril!
Specifically, though, these sections are related, in my experience:
> Avoid "bad" functions
> Buffer functions
> Parsing functions
> Monitor memory function use
These related aspects are why I tend to wrap many of the library functions I use (in any language environment) in my own wrapper functions, even if only to localize their use to one single entry point. That gives me one way of using the function, and it gives my code one place to encode all the best practices for its use -- and one single place to update those best practices for the entire codebase. It is especially helpful if I want to simply rewrite the code to, for example, never use scanf, which I determined was a necessary strategy many, many moons ago.
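As a minimal sketch of that kind of wrapper (the name `read_line` and its contract are hypothetical, my own invention rather than anything standard): one cornerstone entry point for line input, so no call site ever touches scanf directly.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical cornerstone wrapper: the one place in the codebase
   that reads a line of input. Call sites never use scanf/gets;
   the best practices (bounded read, newline stripping) live here. */
static int read_line(FILE *in, char *buf, size_t cap)
{
    if (cap == 0 || fgets(buf, (int)cap, in) == NULL)
        return -1;                      /* EOF, error, or no room */
    buf[strcspn(buf, "\n")] = '\0';     /* strip trailing newline */
    return (int)strlen(buf);            /* length actually stored */
}
```

Changing the policy later (switching to a different read primitive, or logging truncated input) then touches exactly one function instead of every call site.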
Now, when a single function needs to accommodate different use cases, and handling those separate kinds of logic in one function would incur too much logical or runtime cost, a separate wrapper can be added; if the additional wrappers can build on the cornerstone wrapper, that is best, when feasible. Of course, all these wrappers should live in the same chunk of code.
For C, especially, wrapper functions also allow me to have my own naming convention on top of the standard library's terse names (without using macros, because they're to be avoided). That makes it easier for me to remember their names, further reducing cognitive load.
The reality is that we spend FAR more time reading code than writing it. That is why readability is far more important than clever, line saving constructs.
The key to further minimizing the mental load of reacquainting yourself with older existing code is to decide on a set of code patterns and then be fastidious in using them.
And then, if you ever want to be able to easily write a parser for your own code (without every detail in the spec), it's even more important.
And now that I have read TFA, I see he wrote:
> We have tooling that verify basic code style compliance.
His experience and diligence have led him to the mountaintop: we must make ourselves mere cogs in a larger machine, limiting ourselves for the greater good of our future workload and production quality.
> The reality is that we spend FAR more time reading code than writing it. That is why readability is far more important than clever, line saving constructs.
In JS I sometimes chain two or three inline arrow functions specifically for readability. When you read code, you are often searching for the needle of "the real thing" in a haystack of data formatting, API-response prepping, localization, exception handling, etc.
Sometimes those shorthand constructs help me to skip the not-so-relevant parts instead of mentally climbing down and up every sort and rename function.
That being said, I would not want this sentiment formalized in code guidelines :) And JS is not C, even if both have curly braces.
> That being said, I would not want this sentiment formalized in code guidelines :)
Surely. I'm all for code formatting standards as long as they're MY code formatting standards :-)
Ideally, I'd like the IDE to format the code to the user/programmer's style on open, but save the series of tokens to the code database in a formatting-agnostic fashion.
Then we could each have our own style but still have a consistent codebase.
And I should add that my formatting conventions have gotten more extreme and persnickety over the years; I now put spaces on both sides of my commas, because a comma is a separate token, not part of the expression on either side of it. I did this purely for readability, but I have NEVER seen anyone else do it in all my decades of reading code on the internet and working on large codebases. I really like how the spacing separates the expression information from the structural information.
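For illustration, a hypothetical snippet in that spaced-comma style (the function itself is invented purely to show the convention):

```c
/* A hypothetical function written in the spaced-comma style
   described above: commas get space on both sides, visually
   separating structural tokens from the expressions around them. */
static int clamp(int v , int lo , int hi)
{
    if (v < lo) return lo;
    if (v > hi) return hi;
    return v;
}
```

The behavior is ordinary; only the spacing differs from the usual style.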
It also helps me deal with my jettisoning of code color formatting: as useful as I've found syntax coloring in the past, I don't want to deal with importing and setting all that environmental stuff in new environments. So I just use bland vi with no intelligence, pushing those UI bells and whistles out of the editor and into my code formatting.
And, I fully endorse whatever it takes for you to deal with JS, as I have loathed it since it appeared on the scene, but that's just me being an old-school C guy.
Could you give an example of "clever" (bad) vs "simple" (good)?
In my experience C has a simple grammar, a commonly held simple (but wrong) execution model, and a lot more complexity lurking underneath, where it can't be so easily seen.
Abstraction is necessary to handle scale. If you have painstakingly arrived at a working solution for a complex problem like, say, locking, you want to be able to package it up and use it throughout your codebase. C lacks mechanisms to do this apart from its incredibly brittle macro facility.
Ada has built-in constructs for concurrency, with contracts, and there is formal verification in a subset of Ada named SPARK, so Ada / SPARK is pretty good.
> C lacks mechanisms to do this apart from using its incredibly brittle macro facility.
We programmers are the ultimate abstraction mechanism, and refining our techniques in pattern design and implementation in a codebase is our highest form of art. The list of patterns in the Gang of Four's "Design Patterns" is not as interesting as its first 50 pages, which are seminal.
From the organization of files in a project, to organization of projects, to class structure and use, to function design, to debug output, to variable naming as per scope, to commandline argument specification, to parsing, it's nothing but patterns upon patterns.
You're either doing patterns or you're doing one-offs, and one-offs are more brittle than C macros, are hard to comprehend later, and when you fix a bug in one, you've only fixed one bug, not an entire class of bugs.
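For instance, even without C++ classes, a locking pattern can be packaged once as plain C functions. The counter type and names below are hypothetical, my own illustration: the point is that the mutex discipline lives in one place, so fixing a locking bug there fixes the whole class of bugs.

```c
#include <pthread.h>

/* Hypothetical example of packaging a pattern once in plain C:
   the mutex discipline lives in this file alone, and every call
   site goes through counter_add()/counter_get(). */
typedef struct {
    pthread_mutex_t mu;
    long value;
} counter_t;

static void counter_init(counter_t *c)
{
    pthread_mutex_init(&c->mu, NULL);
    c->value = 0;
}

static void counter_add(counter_t *c, long delta)
{
    pthread_mutex_lock(&c->mu);     /* one place to get this right */
    c->value += delta;
    pthread_mutex_unlock(&c->mu);
}

static long counter_get(counter_t *c)
{
    pthread_mutex_lock(&c->mu);
    long v = c->value;
    pthread_mutex_unlock(&c->mu);
    return v;
}
```

Callers never touch the mutex, so they cannot forget to unlock it or lock it in the wrong order.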
Abstraction is the essence of programming, and abstraction is just pattern design and implementation in a codebase, the design of a functional block and how it's consumed over time.
The layering of abstractions is the most fundamental perspective on a codebase. Abstractions not only handle scale; they make or break correctness, ease of malleability, bug triage, performance, and comprehensibility -- I'm sure I could find more.
The design of the layering of abstractions is the everything of a codebase.
The success of C's ability to let programmers create layers of abstractions is why C is the foundational language of the OS I'm using, as well as of the browser I'm typing this message in. I'm guessing yours are, too, and while I could be wrong, it's not likely. And not a segfault in sight. The scale of Unix is unmatched.
> The success of C's ability to let programmers create layers of abstractions is why C is the foundational language of the OS I'm using, as well as the browser I'm typing this message in.
What browser are you using that has any appreciable amount of C in it? They all went C++ ages ago because it has much better abstraction and organization capabilities.
That's a fair point that I hadn't considered. I was developing C+objects as C++ was first being released in the mid-90s, and then using Borland's C++ compiler in the early 2000s, but never really thought about it as anything more than what its name implies: "C with some more abstractions on top of it".
Thank you for the correction, but I consider C++ to be just a set of abstractions built upon C; if you think about it, none of those structures are separate from C, but are merely overlaid upon it. I mean, it is still just ints, floats, and pointers grouped using fancier abstractions. Yes, they're often nicer and much easier to use than what I had to do to write a GUI on top of extended DOS, but it's all just wrappers around C, IMO.
C++ is very definitely not just wrappers around C and it's pretty ridiculous to frame it like that. Or if you want to insist on that, then C doesn't exist, either, as it's just a few small abstractions over assembly.
> The success of C's ability to let programmers create layers of abstractions
You wrote several entirely valid paragraphs about how important abstractions are and then put this at the end, when C has been eclipsed by 40+ years of better abstractions.
Because programmers are creating the abstractions, not the programming language.
And there is no OS I'm aware of that will threaten Unix's dominance any time soon.
I'm not against it, but C's being so close to what microprocessors actually do seems to be the story of its success, now that I think about it.
I personally haven't written C in more than half a decade, preferring Python, but everything I do in Python could be done in C, with enough scaffolding. In fact, Python is written in C, which makes sense, because C++ would introduce too many byproducts for the tightness required of it.
I was programming C using my own object structuring abstractions as C++ was being developed and released. It can be done, and done well (as evidenced by curl), but it just requires more care, which comes down to the abstractions we choose.
So I would say "eclipsed" is a bit strong a sentiment, especially given that our new favorite programming languages are running on OSes written in C.
If I had my druthers, I'd like everything to be F# with native compilation (i.e. not running using the .NET JIT), or OCaml with a more C-ish style of variable instantiation and no GC. But the impedance mismatch likely makes F# a poor choice for producing the kinds of precise abstractions needed for an OS, but that's just my opinion. Regardless, the code that runs runs via the microprocessor so the question really is, "What kinds of programming abstractions produce code that runs well on a microprocessor."
I've never thought of this before, thanks for the great question.
> And there is no OS I'm aware of that will threaten Unix's dominance any time soon.
Depends on the point of view, and what computing models we are talking about.
While iDevices and Android have a UNIX-like bottom layer, the userspace has nothing to do with UNIX, being developed in a mix of Objective-C, Swift, Java, Kotlin, and C++.
There is no UNIX per se on game consoles, and even on Orbis OS, there is little of it left.
The famous Arduino sketches are written in C++ not C.
Windows, dominant in the games industry (to the point that Valve failed to attract developers to write GNU/Linux games and had to come up with Proton instead), is not UNIX. The old-style Win32 C code has been practically frozen since Windows XP, with very few additions; since Windows Vista, Windows has become heavily based on C++ and .NET code.
And while macOS is UNIX certified, the userspace that Apple cares about (as NeXT did before the acquisition) has very little to do with UNIX and C; it is built in Objective-C, C++, and Swift.
On the cloud native space, with managed runtimes on application containers or serverless, the exact nature of the underlying kernel or type 1 hypervisor is mostly irrelevant for application developers.
> I'd like everything to be F# with native compilation
This already works today (even with GUI applications) - just define reflection-free replacements for printfn (2 LOC) and you're good to go: dotnet publish /p:PublishAot=true
To be clear, in .NET, both JIT runtime and ILC (IL AOT Compiler) drive the same back-end. The compiler itself is called RyuJIT but it really serves all kinds of scenarios today.
> makes F# a poor choice for producing the kinds of precise abstractions needed for an OS
You can do this in F# since it has access to all the same attributes for fine-grained memory layout and marshalling control C# does, but the experience of using C# for this is better (it is also, in general, better than using C). There are a couple areas where F# is less convenient to use than C# - it lacks C#'s lifetime analysis for refs and ref structs and its pattern matching does not work on spans and, again, is problematic with ref structs.
> there is no OS I'm aware of that will threaten Unix's dominance any time soon
True, but irrelevant?
> What kinds of programming abstractions produce code that runs well on a microprocessor
...securely. Yes, this can be done in C-with-proofs (seL4), but the cost is rather high.
To a certain extent microprocessors have co-evolved with C because of the need to run the same code that already exists. And existing systems force new work to be done with C linkage. But the ongoing CVE pressure is never going to go away.
I'm not at all against a new model providing a more solid foundation for a new OS, but it's not going to be garbage collected, so the most popular of the newer languages make the pickings slim indeed.
> But the ongoing CVE pressure is never going to go away.
I think there are other ways to deflect or defeat that pressure, but I have no proof or work in that direction, so I really have nothing but admittedly wild ideas.
However, one potentially promising possibility in that direction is the dawn of immutable kernels. Once again, that's just an intuition on my part, and they can likely be defeated eventually, if only through weaknesses in the underlying hardware architecture -- though newer techniques such as timing attacks should be easier to detect, because they rely on massive brute force.
The question, to me, is: "Can whittling away at the inherent weaknesses reduce the vulns to a level of practical invulnerability?" I'm not hopeful that it can, but seeing the amount of work a complete reimplementation would require, it may simply be the best approach from a cost-benefit perspective: far fewer bugs and vulns is more feasible than guaranteed perfection. And, once again, such perfection would require the hardware architecture to be co-developed with the OS and its language to really create a bulletproof system, IMO.
"Look, forget the myths the media's created about the White House--the truth is, these are not very bright guys, and things got out of hand." --from All the President's Men
That was over 50 years ago, and now no one over a 90 IQ thinks these guys are bright.
Good point. On the Daily podcast (NYT), Goldberg was asked what was in the chat after the strikes. He said plenty of different emojis. Goldberg's reaction was also interesting: he thought at the moment that every workplace is the same. Personally, I'd expect a serious mood and attitude when there is so much at stake, including people's lives, but humans are humans.
Groucho's quote "I'd never be a member of any club that would have me as a member."
And one of our favorite Columbo episodes (all free on Tubi) that features a murder in a very Mensa-style club, titled "The Bye-Bye Sky High I.Q. Murder Case".
Our daughter counted AK as her favorite book for years, though now I suspect she'd choose Little Women or any of the last five William Gibson books as her fav.
She's read many of these books multiple times. Of course, she's never had a reading list to follow, so I'm sure it helps that she chose them on her own. She also read Langston Hughes's self-collected "best of" treasury, which he released towards the end of his life. And she has just bailed on other books, like effing Wuthering Heights, ick!