

It’s pretty funny how differently C’s type system can be described depending on the speaker’s experience. The parable of the Blub language comes to mind.
Parsing is a way of “validating early”. You either get a successful parse and the program continues working on known-good data with that knowledge encoded in the type system, or you handle incorrect data as soon as it’s encountered.
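As a minimal sketch of the idea in Python (UserId and parse_user_id are invented for illustration): the parse happens once at the boundary and returns a richer type, so downstream code can’t forget that the check happened.

from dataclasses import dataclass

@dataclass(frozen=True)
class UserId:
    value: int

def parse_user_id(raw: str) -> UserId:
    # Parse at the boundary: bad input fails here, immediately.
    n = int(raw)  # raises ValueError on non-numeric input
    if n <= 0:
        raise ValueError(f"user id must be positive, got {n}")
    # From here on, "this is a valid id" is carried by the type.
    return UserId(n)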
I feel I gotta point out it’s a pretty funny example—email comes up so frequently as a thing that you’re recommended to neither parse nor validate, just try to send an email to the address and see if it works. If you need to know that it was received successfully, a link to click is the general method.
But “parse, don’t validate” is still a generally good idea, no matter the example used. :)
Yeah, you generally just want the same auto-stuff done as would be enforced in CI anyway.
… all the other stuff you could fix but wind up just ignoring because your team ignores it will just glare at you until you sneak it in somehow
It’s just a monoid object in a category of endofunctors, no biggie
Afaik they’re hoping to land it on nightly in 2025H1.
Between that, work on the next-generation trait solver, and promoting the parallel frontend, there’s some stuff to look forward to in the compiler this year.
Do also note that saying some % of crates use unsafe doesn’t imply that 100% of the code in those crates is marked unsafe. It could be as little as one line; it could be a whole lot; it could be well-documented and tested; it might not be. (This is part of what the talk is about.)
It’s also rather to be expected that there’s more unsafe in embedded. As Steve Klabnik gets into in How to Do Embedded Development with Rust (GOTO 2023), it’s used when you e.g. want to set a certain memory address to a certain value, which in a lot of contexts is nonsense, but in some contexts makes an LED light up.
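For a feel of what that kind of code looks like, here’s a sketch of the same idea in Python on a hypothetical Linux board, poking a made-up register through /dev/mem (both addresses are invented for illustration; in Rust, the equivalent raw pointer write is exactly the part that needs an unsafe block):

import mmap
import os

# Both addresses are made up for illustration; real values come from the
# board's datasheet. Assumes a Linux system where /dev/mem is accessible
# (typically requires root).
GPIO_BASE = 0x3F200000  # hypothetical page-aligned register block
LED_REG = 0x1C          # hypothetical offset of the LED register

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
try:
    regs = mmap.mmap(fd, mmap.PAGESIZE, offset=GPIO_BASE)
    # Writing a value to a specific physical address: nonsense on a desktop,
    # but on the right board this is what makes the LED light up.
    regs[LED_REG:LED_REG + 4] = (1).to_bytes(4, "little")
    regs.close()
finally:
    os.close(fd)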
Yeah, the article comes off as needing so much context that the article itself is sus. Like:
Hejlsberg stated the obvious when saying that TS isn’t the fastest language. Although it can laughably run Doom at 0.0000009645 fps.
… which is referencing an implementation of Doom in the TS type system. It’s a funny idea, but an arbitrary reader who doesn’t know about that and doesn’t bother clicking through will get a very wrong impression.
The reimplementation (which they’ve partially automated; Go apparently lets them do a fairly direct translation, while Rust or C# would require more work to fit) should be a boon for TS devs, but not noticeable for those who just run stuff that happens to be written in TS.
Would be kinda interesting to see the effect if stuff targeted deno rather than node, though.
This is the first I’ve heard something like that about Iceland; but I do know a little bit about Icelandic personal ID numbers.
TIOBE literally ranks languages by search results. It’s at best a measure of SEO. It is, generally, a trash metric that shouldn’t be used for anything.
I’ve moved on from vim to neovim, and I think I’ll continue using something in that family in the future. It’s a pretty stable experience overall, and the inclusion of LSP support and tree-sitter has been a good improvement too.
Ultimately editors are tools, similar to keyboards, OSes, screens, chairs, shoes and so on. There are some objective quality differences between a well-constructed tool and some slapdash nonsense, and there are a huge number of subjective quality differences. What suits me may not suit you, and vice versa.
It’s generally good to try out some new (to you) stuff and see if you like it. If you do, great; if you don’t, well, now you know. I think my worst experience was with Acme (or Wily? can’t remember), during a phase where I experimented with Plan 9 stuff. Ultimately very not my cup of tea, but apparently Rob Pike (who made it) and some other gophers still enjoy it? Which is good for them, just like it’s good for me that I can choose not to use it. It’s just personal tastes, and I still think it’s good that I gave it a go.
The debate over holding down modifier keys vs modes is also a part of the Emacs vs vi debate from many decades ago. There might be some statistics for what works best for the most people now, but again, use what suits you. And try some new stuff when you get curious, it’s generally good for you.
Where’s the Orphan Crushing Machine community here anyway
Smells a bit Scandinavian to me. In Norwegian we also use “ur” that way, including “urspråk” (Ursprache, ur-language). We have a different word for origin (opphav), so ur remains a prefix that’s difficult for us to translate.
Going by Wikipedia, however, the English translation for Norwegian urspråk and German Ursprache is proto-language.
Yeah, same. Post-metal or thereabouts towards jazz can work too IME. Stuff like Russian Circles, Earthless, Elephant9. But stuff like Waveshaper and Amynedd are often safer bets.
Smartphones – and to a lesser degree, tablets – kind of are not a phenomenal programming platform.
[…]
But not everyone in 1990 had a personal computer, and I would venture to say that the group that did probably was not a representative sample of the population. I’d give decent odds that a lower proportion of the population as a whole could program in 1990 than today.
Yeah, and these things influence each other. Today we have a networked computer in our pockets, and depending on where you live, it may or may not be required, or the standard way, to do tasks like getting a bus ticket, logging in to government websites so you can do your taxes and whatnot, transferring money, and a bunch of other tasks that are to a degree really sensitive.
So as we have a bunch of barely computer-literate people functionally dependent on these devices, we also need them to be locked down and secure. MS had some grand thoughts about “code everywhere”, which it turns out is pretty awful security-wise, especially with gullible networked users.

The users in this community have very different capabilities and needs than the users who might not even want a computer, but feel forced to get one because the government stopped using paper and the banks and post offices no longer exist. (This is, essentially, what it’s like in modern Norway. We might be ending home delivery of snail mail soon; mail delivery every other weekday seems to be an unnecessary expense.)

Beyond the lack of a keyboard, the platform has a bunch of constraints that don’t make for fun computing, but they absolutely need to be there. Unfortunately we also wind up with a split between the common restricted platforms and the casual, customizable platforms, and not everybody gets to be exposed to the latter.
There are probably, in absolute numbers, a whole lot more people who know JS or Python today than people who knew BASIC in the 80s. In addition there are people who are pretty good at spreadsheet programming and other tasks that are essentially coding, even if their tools aren’t listed as regular programming languages.
Quotes are OK, shellcheck is happy, but, according to gtfobins, you can abuse tar, so running the script like this: ./test.sh /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/sh ends up spawning an interactive shell…
This runs into a part of the unix philosophy about doing one thing and doing it well: Extending programs to have more (absolutely useful) functionality winds up becoming a security risk. The shell is generally geared towards being a collection of shortcuts rather than a normal, predictable but tedious API.
For a script like that you’d generally want to validate that the input is actually what you expect if it needs to handle hostile users, though. It’ll likely help the sleepy users too.
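As a sketch of that kind of up-front validation, moving the script to Python for illustration (the specific checks and the archive name are just examples): reject anything option-shaped before it reaches tar, and use -- so even odd filenames can’t be parsed as flags.

import subprocess
import sys
from pathlib import Path

def main(args: list[str]) -> None:
    paths = []
    for raw in args:
        # Validate up front: refuse anything that looks like an option,
        # and only accept existing regular files.
        if raw.startswith("-"):
            sys.exit(f"refusing option-like argument: {raw!r}")
        p = Path(raw)
        if not p.is_file():
            sys.exit(f"not a regular file: {raw!r}")
        paths.append(p)
    # "--" ends option parsing, so nothing after it can be read as a flag.
    subprocess.run(["tar", "-czf", "backup.tar.gz", "--", *map(str, paths)], check=True)

if __name__ == "__main__":
    main(sys.argv[1:])

The -- separator alone defuses the gtfobins trick above; the explicit checks are there so a hostile or sleepy user gets a clear error instead of a surprise.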
Yeah, agreed on the 100 lines, or some other heuristic in the direction of “this script will likely continue to grow in complexity and I should switch to a language that’s better suited to handle that complexity”.
This just looks like the average Norwegian
I think I mentioned it, but inverse: The only data types I’m comfortable with in bash are simple string scalars, plus some simple integer handling I suppose. Once I have to think about stuff like "${foo[@]}" and the like, I feel like I should’ve switched languages already.
Plus I rarely actually want arrays, it’s way more likely I want something in the shape of
from dataclasses import dataclass

@dataclass(frozen=True)
class Foo:
    # …

foos: set[Foo] = …
Isn’t that sort of just the cost of doing business in C? It’s a sparse language, so it falls to the programmer to cobble together more.
I do also think the concrete example of emails should be taken as a stand-in. An error like swapping parameters in an email application is likely not very harmful and gets detected early, given the volume of email that exists. But in other, less fault-tolerant applications, catching it in the type system becomes a lot more valuable.
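As a sketch of what encoding that looks like (using Python’s NewType here rather than C, to match the earlier snippets; the names are made up): distinct wrapper types turn a swapped sender/recipient into a type-checker error instead of a silently misdirected mail.

from typing import NewType

# Both are plain strings underneath; the static type checker keeps them apart.
Sender = NewType("Sender", str)
Recipient = NewType("Recipient", str)

def send_mail(sender: Sender, recipient: Recipient, body: str) -> None:
    ...  # hand off to the mail system

send_mail(Sender("alice@example.com"), Recipient("bob@example.com"), "hi")  # fine
# send_mail(Recipient("bob@example.com"), Sender("alice@example.com"), "hi")
# ^ mypy/pyright reject the swap; at runtime both are just strings.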