

I don’t really trust IBM to know what they’re doing, but it’s still a nice sign.


I can’t really control which extensions my users will face once they are supported. Therefore, supporting too many extensions seems bad to me.
INI can be nicer for non-techies due to its flat structure. TOML, however, seems to sit in an awkward spot: either I want something flat and approachable (then I’ll pick INI) or I don’t (then I’ll pick JSONC). Why would I want a mix?


The lack of intelligence is inherent for LLMs: https://www.forbes.com/sites/corneliawalther/2025/06/09/intelligence-illusion-what-apples-ai-study-reveals-about-reasoning/
This is likely why Apple is the only big tech company that hasn’t entered the AI race by taking on tons of debt and building tons of data centers. They’re likely seeing the writing on the wall.
While there could be a new technique arriving to solve this some day, there also may never be one.


It’ll backfire for any non-trivial code base at some point. LLM-plagiarized code inherently lacks any sense of the big picture; gen AI doesn’t have the necessary intelligence. I keep linking it, but it keeps being relevant: https://www.forbes.com/sites/corneliawalther/2025/06/09/intelligence-illusion-what-apples-ai-study-reveals-about-reasoning/


I doubt it. https://www.forbes.com/sites/corneliawalther/2025/06/09/intelligence-illusion-what-apples-ai-study-reveals-about-reasoning/ Gen AIs are so incapable of even basic logical thought that I think this is merely hype.
To anybody still being scared, watch this: https://www.youtube.com/watch?v=3400S4qMH6o
There are solutions to this like having a doc comment right next to the function which is picked up by some API generator. Then it’s easier to keep in sync. That can work well even in languages without explicit parameter types.
Of course it won’t help LLMs as much, but I personally don’t mind that.
Seems like the solution would be to 1. update the docs and 2. not use LLMs to code.
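As a minimal sketch of the doc-comment-next-to-the-function approach in Python (the `connect` function and its parameters are invented for illustration): the docstring sits directly on the function, and generators such as pydoc or Sphinx autodoc read it from there.

```python
def connect(host, port):
    """Open a connection to *host* on *port*.

    Because this docstring lives right next to the implementation,
    doc generators like pydoc or Sphinx autodoc pick it up
    automatically, which makes it easier to keep in sync with the
    code -- even without explicit parameter types.
    """
    raise NotImplementedError  # illustration only

# The docstring travels with the function object itself:
print(connect.__doc__)
```

Changing the function and its doc comment in the same edit is what keeps the generated API docs from drifting out of date.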
Just to explain my reservations with TOML: configs are often shared with non-technical users. For my project, I used INI instead.