When I was at an unnamed major financial institution, we were ordered to stop using WhatsApp, but it had nothing to do with security and everything to do with avoiding even the possibility of the appearance of backroom dealing or production avoidance in the event of subpoena. Maybe the truth has more to do with that, or maybe not. What do I know, who are all you people anyway, and why am I posting here?
The UK Conservative government ran a lot of meetings on WhatsApp because they believed it was secure and unarchived, i.e. that it could escape the normal retention requirements. Of course, what happened is that once the chat got large enough and the government fractious enough, people started leaking messages by screenshot.
When trying to avoid subpoenas of data on the device itself, it's important to frequently "lose" the phone with the messages on it.
Yeah, it's the de facto method for personal communication. I have never worked at a company where people use WhatsApp to communicate; it's always Slack or Teams or Mattermost.
Heh. I have a friend here in the US. His father passed away in his home country. No will. The whole family needed to show up in court for probate, but he could not travel at that time.
The court: "No problem, just join the session on video using WhatsApp"
It sounds like the court they are referring to is in the "home country". The friend whose father passed is in the US but the "home country" is where the father passed.
Right? If you used Snapchat for meetings, emojis, "dude", "bro", "like, then he said, and I was like..." etc. would be the communication denominator. It'd be fun, silly, and stupid.
I feel the same way about so many government departments switching to X as a primary public communications platform instead of... you know, the open web (with distribution to downstream closed platforms), as they always have. It just reeks of unseriousness.
> nothing to do with security and everything to do with avoiding even the possibility of the appearance of backroom dealing or production avoidance in the event of subpoena
But that is a concern of information security.
Compliance is often part of this calculus, and many on this forum get wrapped around the axle thinking it's always about cryptography or something. Encryption is only a small part of the broader practice of information security.
Makes sense; there are lots of requirements for communication retention at financial institutions. If I recall correctly, phone lines on trading desks are permanently recorded for regulators, so if anything does happen they have all the info... it's why socializing in person is such a big part of being a trader.
Not while I was there, anyway. The corporate image was so locked down that only named binaries would run, and Internet access was heavily filtered and MITM'd for inspection/retention. We didn't even have a shitposting channel. All the juicy stuff happened over the phone, because most people weren't recorded apart from traders and those adjacent to them (and you'd know if they were recorded because of the IVR announcement preceding their join).
Sure, but DC relative to what? Can't be ground since that connection isn't there, so that leaves neutral.
I've got half an alligator clip on my workbench; the other half disappeared when I connected it to the floating ground of a 5VDC system. It just so happens the ground was floating on top of 120VAC. The ATmega didn't care, it only ever saw 5V between its Vcc and Vee, but once I made the mistake of connecting its "ground" to an actual ground, sparks flew.
The DC side is fully isolated except through a capacitor that is there to reduce EMI and is specifically built to "fail safe", except in the cheapest no-name imported power supplies. (https://www.pcbaaa.com/y1-capacitors-function-application-an...)
That's not what the Y capacitor is doing here. All Y means is "high voltage withstand, will fail open, and we have receipts". Here, it's not line-to-ground, but primary-to-secondary.
You'd use X for L to L/N; Y is for N to Ground. Those isolated power supplies are definitely grounded, definitely not what Apple is using, and will probably shut themselves off if they lose their ground connection (or at least signal a fault to the operator).
You're also confusing the Y designation, which relates to the capacitor's properties, with the application. Not all Y capacitors are used for line-to-ground applications. However, any application where a failure to a closed state would result in a shock hazard must use Y-rated capacitors.
Neither is the output of an isolated power supply. That's the whole point. There is an enormous amount of resistance between the output side and the household wiring on the input side, just like in the battery scenario. The difference between gigaohms (like you might see between the windings of the SMPS transformer) and the exaohms or whatever of the air gap between the battery and the nearest outlet is negligible for any practical purpose.
You may see AC leakage due to the capacitance between the windings and also that of the Y capacitor (this leakage is the source of the sensation described in the article), but that's also tiny and it won't be blowing up any crocodile clips. If that did happen to you, your power supply somehow wasn't isolated.
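For a rough sense of scale, here's a back-of-the-envelope leakage calculation (the 1 nF Y capacitance and the mains values are illustrative assumptions, not measurements from any particular supply):

    import math

    # Assumed values: 230 V RMS / 50 Hz mains and ~1 nF of
    # primary-to-secondary Y capacitance (a common ballpark).
    V_line = 230.0   # volts RMS
    f = 50.0         # hertz
    C_y = 1e-9       # farads

    X_c = 1 / (2 * math.pi * f * C_y)   # capacitive reactance, ~3.2 Mohm
    I_leak = V_line / X_c               # leakage current, ~72 microamps

    print(f"reactance ~ {X_c / 1e6:.1f} Mohm, leakage ~ {I_leak * 1e6:.0f} uA")

Tens of microamps is enough to feel as a faint tingle on skin, which lines up with the sensation described in the article, but it's nowhere near enough to vaporize a crocodile clip.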
> "A sturdy, thoughtful, cute design that just can't compete in its price range."
People will pay untold thousands for a Mac, but God forbid when a PC manufacturer charges more than $599 for a laptop. If you're whining about the price, Framework isn't made for you. Go buy that Acer that you really want. The Framework is Sam Vimes' expensive boots that are made to last[1], and I've happily paid in full to get a pair.
I really don't understand this argument about price. It seems extremely competitive to me. Am I crazy, or am I really seeing 48 GB and 2 TB for $1500? For $1500 you get a MacBook Air with 16 GB of RAM and a 512 GB SSD.
This is a key part of our product value prop. Our memory and storage upgrade pricing is much lower than most other laptop makers, and you can find your own on the open market for even less. Other laptop makers can preserve their overall margin by overcharging on those upgrades, which lets them price their base SKUs more aggressively. We accepted the tradeoff of not gouging on upgrades.
I got my wife an entire-ass Framework 13 7840U /and/ put 32GB RAM and a 2TB SSD in it for less than the cost of the uplift to go from base RAM to 32GB and base SSD to 2TB from Apple at time of preorder. That was the day I stopped being an Apple customer. Maybe for the $300 Walmart laptop folks it's too expensive, but hardly for Mac refugees.
MacBook Air and MacBook Pro actually have very competitive pricing, even if you take into account the expensive upgrades. I'd buy the Windows/Linux equivalent at the same price in a heartbeat.
> People will pay untold thousands for a Mac, but God forbid when a PC manufacturer charges more than $599 for a laptop.
The article compares the FL12 to laptops in the same price range, including other Framework laptops, and notes that it falls short.
The FL12 has worse performance and battery life than an M1 Air, while costing more than an M4.
The point of the article is that the 12 should either be a lot less expensive or it should be a lot better. It's not whatever nonsense you're dreaming of.
The core philosophy of Framework is repairability and modularity. Yes, you are paying extra for those things, so people who do not value them should probably not buy a Framework. These comments are full of the old cliché of judging a fish in a tree-climbing contest.
Repairability and modularity come with tradeoffs. Not everyone is going to value those tradeoffs, and those people shouldn't buy a laptop where those are the priority. But some people do value those things, and telling them to "get a MacBook" is just silly.
You can repair a Mac by handing it (and possibly your wallet) to Apple and letting them replace entire large subsystems to remedy the issue and pair the new parts. A few years back (pre-Apple Silicon) I got a new top case, keyboard, battery, and trackpad because the button in the trackpad had failed. Pretty good deal on a laptop that was nearly 3 years old, in fairness.
To repair (or upgrade) a Framework, you buy the part and install it. That's worth something to me!
Incidentally, I also have a last-gen ThinkPad P14s Gen 5 AMD and it's a flimsy POS. Already needed a new motherboard and battery and spent three weeks sitting at the service center while they rounded up the parts. Wish I'd bought another Framework 13.
My guess, based on what's been found about somewhat better cognitive outcomes in aging among people who make an effort to remain fit and stimulated[1], is that we could see slightly worse cognitive outcomes in people who spent their lives steering an LLM to do the "cognitive cardio" rather than putting in the miles themselves.
On the other hand, maybe abacuses and written language won't be the downfall of humanity, destroying our ability to hold numbers and memorize long passages of narrative, after all. Who's to know? The future is hard to see.
> On the other hand, maybe abacuses and written language won't be the downfall of humanity, destroying our ability to hold numbers and memorize long passages of narrative, after all
The abacus, the calculator and the book don't randomly get stuff wrong in 15% of cases though. We rely on calculators because they eclipse us in _any_ calculation, and we rely on books because they store the stories permanently. But if I use ChatGPT to write all my easy SQL, I will still have to write the hard SQL by hand, because it cannot do that properly (and if I rely on ChatGPT too much, I won't be able to do that either, because of atrophy in my brain).
We'll definitely need people who can do the hard stuff still!
If we're lucky, the tendency toward random hallucinations will force an upswing in functional skepticism and lots of mental effort spent verifying outputs! If not, then we're probably cooked.
Maybe a ray of light, even coming from a serious skeptic of generative AI: I've been impressed at what someone with little ability to write code or inclination to learn can accomplish with something like Cursor to crank out little tools and widgets to improve their daily life, similar to how we still need skilled machinists even while 3D printing has enabled greater democratization of object production. LLMs: a 3D printer for software. It may not be great, but if it works, whatever.
> The abacus, the calculator and the book don't randomly get stuff wrong in 15% of cases though.
Yeah, you'd think that a profession that talks about stuff like "NP-hard" and "unit tests" would be more sensitive to the distinction between (A) the work of producing a result and (B) the work necessary to verify it.
I distrust that rationale, because even if generation >= verification in cost, it still depends on the error rate and the impact of errors. Wiring up a condemned building with demolition charges might take longer than a casual independent review...
Truly perfect code verification can easily cost more than writing it, especially when it's not just the new lines themselves, but the change's effect on a big existing system.
That's not what I meant, though. The point about books is that they store information reliably. If I write something down, in most reasonable settings it will still be the same text when I read it back. That means that if I write something down instead of remembering it, the writing will outperform me at storing that information. Same with the calculator: it will always perform at least as well as me at arithmetic. There is no calculation on which the calculator can randomly fail, leading me to do it by hand, so I don't need to retain the skill of doing it by hand. The same cannot be said about LLMs, and that is the issue.
Sure, but also that's not what (generative) AI is for.
If you want a reliable list of facts, use (or tell the AI to use) a search engine and a file system… but then you need whatever system you use to be able to tell whether your search for "Jesus" was in the Christian-missionary sense, or the "ICE arrested Jesús Cruz" sense, or whether you wanted the poem in the Whitehouse v Lemon case, or were just swearing.
If you can't tell which you wanted, the books being constant doesn't help.
> There is no calculation on which the calculator can randomly fail, leading me to do it by hand, so I don't need to retain the skill of doing it by hand.
I've seen it happen, e.g. on my phone the other week. Apple's note-based calculator strips unrecognised symbols, which means that when you copy-paste from a place where "." is the decimal separator while your system settings say you use "," as the decimal separator, it gives an answer off by some power of ten… and I've also just today discovered that doing this the other way around on macOS (system set to "." as the separator), it strips the stuff before the decimal.
Just in case my writing is unclear, here's a specific example, *with the exact same note* (as in, it's auto-shared via iCloud and each device recomputes the answer locally), on macOS (where "." is my separator):
123,45 / 2 = 22.5
123.45 / 2 = 61.725
and iOS (","):
123,45 / 2 = 61,725
123.45 / 2 = 6.172,5
And that's without data entry failure. I've had to explain to a cashier that if I have three items that are each less than £1, the total cannot possibly be more than £3.
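If it helps to see the failure mode in miniature, here's a tiny hypothetical parser (not Apple's actual logic; the "strip the thousands separator" rule is purely an assumption) that reproduces the power-of-ten error from the example above:

    # Hypothetical sketch: a naive locale-aware parser that silently drops
    # whatever it treats as the thousands separator.
    def parse_number(text: str, decimal_sep: str) -> float:
        thousands_sep = "," if decimal_sep == "." else "."
        cleaned = text.replace(thousands_sep, "")       # silently dropped
        return float(cleaned.replace(decimal_sep, "."))

    # System set to "," as the decimal separator (the iOS case above):
    print(parse_number("123,45", ",") / 2)   # 61.725 -- as intended
    print(parse_number("123.45", ",") / 2)   # 6172.5 -- "." dropped, off by 100x

The macOS behaviour above (dropping everything before the comma) is a different quirk again, but the underlying problem is the same: "strip what you don't recognise" silently changes the magnitude instead of failing loudly.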
I'm sure they will continue to allow disabling transparency in accessibility settings, given that the current OS version has transparency throughout which can already be so disabled.
And here I thought the wave of the future was generative AI, which damn near requires Nvidia to even function. Sure can't wait for Red Hat to deprecate and nuke and blacklist and hellban all Nvidia capability!
Indeed. I've seen tons of things that specifically require Nvidia and support nothing else - more and more in the last couple of years. Some proprietary games don't support anything but the Nvidia proprietary drivers on Linux.
The flexibility afforded by sideloading, which means an Android phone is still, for the most part, a pocket-sized computer that can operate in ways not intended by its creators (as opposed to a restricted consumption appliance like the iPhone), is what has kept me on the platform for 16 years and counting. If they take that away, then I really don't see a compelling difference between the two platforms.
Android has been getting markedly more flaky for me ON MULTIPLE GOOGLE PIXEL DEVICES since 2018. My current Pixel 8a on Android 15 has, since day one, regularly had the underlying UI controls (separate from the launcher) crash and force me to restart if I want to use the app overview switcher. I've also had no app overview button in the stock Android calculator since Android 14, the OS it shipped with, so if I want to switch between a calculation and another app I must first return to the home screen. It wasn't like this in previous releases! Furthermore, the day/date is routinely cut off in the status bar and its pulldown. This product passed multiple reviews and two major OS releases with these (and many other) obvious and irritating bugs and shows no signs of improvement. If they left these holes in the surface, I can only imagine what's underneath. It's ridiculous, but I guess we're cranking out complexity at a rate that exceeds our ability to manage it (or our ability to manufacture new fucks at a rate exceeding their consumption).
If Purism is shopping for new users, all they would eventually need to do is avoid getting worse as fast as Android does, or more expensive as fast as iOS devices do. Based on what I've seen from them so far... they're not at that point yet: meager specifications, high prices. I will continue to cling to my Android device, but I'll cheer them on from the sidelines.
I didn't know Apple supported rooted iPhones (I do not keep up with Apple). Does Apple at least provide a warning about invalidating warranties or whatever?
Been there, done that. It breaks too many things, prevents OTA updates, and some apps just won't run if they detect you've screwed with iOS. I ended up putting my iPad Pro and iPhone (which I have but only use when I need something they offer exclusively) back on official firmware.
I know a lot about this! I've been on Twitter for 18 years, post frequently, and have about 5-10 people who read my posts based on the metrics. (Significantly more followers, but they're all fake/bots/dormant). You're either posting for the love of posting/keeping a journal/getting ideas out of your head, or you're posting like it's going to the gym because you want to be an influencer. Having a searchable 18-year database of my thoughts has been helpful to me on many occasions. I also used the dataset to fine-tune an LLM to shitpost like me. Recreational narcissism!
A human would only make up an answer like the ones in the article if they were a compulsive liar. A human would ideally say "I don't know" or, at worst, employ the "I'll confirm that and circle back" corpspeak evasion.