I get more done by writing the stupid code and fixing it than by junking the old code... but every now and then I can clearly see a structure for a rewrite, and then I rewrite, but it's rare.
There are also people telling you the earth is flat and that 30 years of experience can be compressed into a 4-minute YouTube video. Even if a chisel could spy on me, it becomes dull with use, whereas AI may become sharper with use, yet it still cannot distinguish which idiot is operating it. AI is just for people to learn prompting, which is an art, like Google searching. It still cannot fathom "taste," or a large host of other nuances that, again, only come with experience and enculturation.
You are the master of understatement. I just spent 5+ hours getting an emulator to just work. The back and forth with the AI required me to be cognizant of the direction I was going, very cognizant. After it finally worked... the cleanup was huge: at least 15 broken images and hundreds of scratch files.
"An sNaN (signaling Not-a-Number) is a special floating-point value that is designed to trigger a hardware trap or exception when it is used in an arithmetic operation. This differs from a qNaN (quiet Not-a-Number), which propagates through calculations without causing an immediate exception. Handling sNaNs requires a more deliberate approach that involves enabling floating-point traps and writing a custom trap handler."
The trick is to get your ALUs to do some of the math for you. Oh I miss the days of the 68020 fast barrel shifter and the 68030 byte smears. Tricky stuff lost to the silicon/sands of time.
Compromises. We had BCD for finance, binary for games, and floating point for math. I wrote a sample 'make change' using floating point, BCD, and integer (normalizing by multiplying by 100). The integer version ripped through it, but surprisingly BCD kept up with FP, and with compiler optimizations it was significantly faster in certain edge cases and unit tests.
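A rough sketch of the integer-versus-floating-point half of that comparison, reconstructed in C; the denominations and the exact structure are my own illustration (the original also had a BCD variant), but it shows the 'normalize to cents' trick:

```c
#include <math.h>
#include <stdio.h>

static const int cents[] = { 2000, 1000, 500, 100, 25, 10, 5, 1 };
#define NDENOM ((int)(sizeof cents / sizeof cents[0]))

/* Integer version: everything in whole cents, exact. */
static void change_int(int due_cents, int counts[])
{
    for (int i = 0; i < NDENOM; i++) {
        counts[i] = due_cents / cents[i];
        due_cents %= cents[i];
    }
}

/* Floating-point version: dollar amounts, which quietly carry
 * representation error (0.10 is not exact in binary). */
static void change_fp(double due, int counts[])
{
    for (int i = 0; i < NDENOM; i++) {
        double denom = cents[i] / 100.0;
        counts[i] = (int)floor(due / denom + 1e-9);  /* fudge factor needed */
        due -= counts[i] * denom;
    }
}

int main(void)
{
    int ci[NDENOM], cf[NDENOM];
    change_int(1234 - 567, ci);          /* $12.34 tendered, $5.67 owed */
    change_fp(12.34 - 5.67, cf);
    for (int i = 0; i < NDENOM; i++)
        printf("%4d cents: int=%d fp=%d\n", cents[i], ci[i], cf[i]);
    return 0;
}
```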
You get surprising things with commonplace problems.
So as the sum grows with the square, the error grows in the same magnitude. A very interesting exercise. I'll get an AI to choke and puke on that.
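I'm guessing at the exercise, but one way to make that concrete in C is to sum 1..n naively in single precision, where the true sum n(n+1)/2 grows with the square, and watch the absolute error grow along with it:

```c
#include <stdio.h>

int main(void)
{
    /* Naive single-precision summation of 1..n versus the exact n(n+1)/2.
     * As the running sum grows, each added term loses low-order bits, so
     * the absolute error grows along with the sum. */
    float sum = 0.0f;
    for (long n = 1; n <= 50000000L; n++) {
        sum += (float)n;
        if (n % 10000000L == 0) {
            double exact = (double)n * (n + 1.0) / 2.0;
            printf("n=%9ld  float sum=%.9g  exact=%.9g  abs err=%.3g\n",
                   n, (double)sum, exact, exact - (double)sum);
        }
    }
    return 0;
}
```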
I seriously do NOT want back the two-plus hours I spent diddling (1-bit NOT) all those bits. I first learned it in binary on an HP, then had to learn it on an IBM mainframe, and then got an 8087, and it was so incredibly fast, but error crept in for Fourier transforms. I started using extended double to keep the errors at bay. Hilbert spaces, here we came.
The killer app was not Lotus 1-2-3 v2, but Turbo Pascal w/ 8087 support. It screamed through tensors and 3D spaces, places we only saw on plotters.
It was not until I got a G3 and used Graphing Calculator that I could explore sombrero functions of increasing frequency.
Maybe, but I don't know. Although I would like to channel as many snarky remarks as I could, to be more constructive I would use the IDK model, as I have with programming questions, and use the psychotic one for questions like "Are we in a simulation?" and "Yes, I would like fries with that and a large orange drink."