Leap seconds still make time go forwards, not backwards. NTP clients would also resolve small time discrepancies while still advancing forwards prior to the next time sync.
I didn’t say Unix time, I said UTC. And no it won’t report negative time, not unless somehow the system clock was modified while it was running…
UTC always goes forward regardless of the timezone and local time. That is why you should use it. To take my EPG situation above, I stored program start / end times in UTC so they would render properly whether or not DST kicked in during the middle of a program.
Yes as long as the rules are known, but it’s really just better to do things sanely and leave no margin of doubt.
True but so do most computers. Computers have a database of timezones and time offsets around the world. Depending on the UTC date and time and your current timezone, it will look up what offset to apply to show the local time. The database is very gnarly since the rules change over time, e.g. maybe in the 70s some countries had longer DST to counteract oil shortages.
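A small sketch of that lookup, assuming the chrono and chrono-tz crates (chrono-tz bundles the tz database): the same conversion call applies a different offset depending on the date, because the rules (DST included) live in the database rather than in your code.

```rust
use chrono::TimeZone;
use chrono_tz::Pacific::Auckland;

fn main() {
    // the same UTC wall-clock time, six months apart
    let june = chrono::Utc.with_ymd_and_hms(2022, 6, 15, 0, 0, 0).unwrap();
    let january = chrono::Utc.with_ymd_and_hms(2022, 1, 15, 0, 0, 0).unwrap();

    // June is NZST (UTC+12), January is NZDT (UTC+13) - same code, different offset
    println!("{}", june.with_timezone(&Auckland));
    println!("{}", january.with_timezone(&Auckland));
}
```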
I once developed an electronic program guide for a cable TV company in New Zealand and I’d lose my mind if I had to use timezones. The basic rule of thumb was:
a) Internally you use UTC religiously. UTC is the same everywhere on Earth, time always goes forward, most languages have classes that represent instants, durations etc. In addition you make damned sure your server time is correct and UTC.
b) You only deal with timezones when presenting something to a user or taking input from a user
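Roughly what that looks like in code, using the chrono and chrono-tz crates (a minimal sketch, not the actual EPG code): instants are stored and compared in UTC, and the viewer's timezone only appears when rendering.

```rust
use chrono::{DateTime, TimeZone, Utc};
use chrono_tz::Tz;

struct Programme {
    title: String,
    start: DateTime<Utc>, // stored in UTC - unambiguous everywhere on Earth
    end: DateTime<Utc>,
}

fn render(p: &Programme, viewer_tz: Tz) -> String {
    // the timezone only appears at the presentation boundary
    format!(
        "{}: {} - {}",
        p.title,
        p.start.with_timezone(&viewer_tz).format("%H:%M"),
        p.end.with_timezone(&viewer_tz).format("%H:%M")
    )
}

fn main() {
    let p = Programme {
        title: "Late Film".into(),
        start: Utc.with_ymd_and_hms(2022, 9, 24, 13, 30, 0).unwrap(),
        end: Utc.with_ymd_and_hms(2022, 9, 24, 15, 30, 0).unwrap(),
    };
    // if a DST spring-forward happens to fall inside this window, the wall-clock
    // span differs from the 2-hour UTC duration, but nothing stored had to change
    println!("{}", render(&p, chrono_tz::Pacific::Auckland));
}
```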
Prior to that I had worked for a US trading company that set all their servers to EST and received trades through the system with the time & date expressed ambiguously. Everything just had to assume that EST was the default, but it was just dumb programming and I bet to this day every piece of code they develop has time bugs.
The EFF has some info about the practice - https://www.eff.org/pages/list-printers-which-do-or-do-not-display-tracking-dots.
I imagine there are ways and means of obfuscating / anonymizing the dots such as blocking the printer from emitting them (e.g. an empty yellow cartridge that the printer perceives as full), modifying the firmware, using a burner printer, or using a mono laser jet.
As a side issue, most modern bank notes have a bunch of yellow circles integrated into the design on each side. They look random but they’re arranged in a recognisable pattern called the EURion constellation that enables devices like copiers / scanners to recognize when people are trying to copy money or other financial instruments like checks.
Rust isn’t really OOP like C#, Java or C++ - it has structs with functions that you could consider an “object” but there is no inheritance. Instead Rust uses traits which are a little bit like interfaces in some languages.
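A tiny sketch of what that looks like (made-up Area / Circle / Square types, purely for illustration): shared behaviour comes from implementing a trait, not from inheriting a base class.

```rust
trait Area {
    fn area(&self) -> f64;
}

struct Circle { radius: f64 }
struct Square { side: f64 }

impl Area for Circle {
    fn area(&self) -> f64 { std::f64::consts::PI * self.radius * self.radius }
}

impl Area for Square {
    fn area(&self) -> f64 { self.side * self.side }
}

// works with "anything that implements Area", much like coding to an interface
fn print_area(shape: &dyn Area) {
    println!("{}", shape.area());
}

fn main() {
    print_area(&Circle { radius: 1.0 });
    print_area(&Square { side: 2.0 });
}
```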
The way the kernel is using Rust at the moment is to produce safe bindings for modules to be written in Rust, i.e. you can create a module in Rust source which will be correctly loaded up, the code is safe by default, and it will have access to kernel services via bindings. I expect over time that more of the kernel will become Rust, but the biggest impediment right now is that Rust relies on LLVM, and LLVM only supports a subset of the targets that a kernel could potentially support with another compiler like gcc.
Predominantly C. But even the kernel is beginning to use Rust as a way of avoiding entire classes of programming error.
The only reason people use JS is because it’s the de facto language of browsers. As a language it’s dogshit, filled with all kinds of unpleasant traps.
Here is a fun one I discovered the other day:
new Date('2022-10-9').toUTCString() === 'Sat, 08 Oct 2022 23:00:00 GMT'
new Date('2022-10-09').toUTCString() === 'Sun, 09 Oct 2022 00:00:00 GMT'
So padding a day of the month with a 0 or not changes the result by 1 hour - the padded form is treated as an ISO 8601 date and parsed as UTC, while the unpadded form falls back to implementation-defined parsing as local time. Every browser does the same so I assume this is a legacy thing. It’s supposed to be padded, but any sane language would throw an exception if it was malformed. Not JavaScript.
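For contrast, a sketch of the parse-or-fail approach (Rust’s chrono crate here, purely as an illustration): a malformed timestamp is an error you have to handle, not a silently different instant.

```rust
use chrono::DateTime;

fn main() {
    // well-formed RFC 3339 input parses...
    assert!(DateTime::parse_from_rfc3339("2022-10-09T00:00:00Z").is_ok());
    // ...and a malformed string is a Result::Err you are forced to deal with
    assert!(DateTime::parse_from_rfc3339("2022-10-9").is_err());
}
```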
Lemmy is written in Rust. There might be bits of C at the periphery behind bindings.
Haven’t been to Thailand since my 20s (with girlfriend) but there were plenty of bars where the hostesses seemed like they would perform other services if the clientele had money for them. Not to mention actual brothels and sex shows. I went looking around Pattaya on Google Maps to see if it was still like that and was disappointed to see that many of the streets had been knocked down and big resort hotels built there. Still a few dodgy areas though.
The problem is that most languages have no native support for anything other than 32 or 64-bit floats, and some representations on the wire don’t either. And most underlying processors don’t have arbitrary-precision support either.
So either you choose speed and sacrifice precision, or you choose precision and sacrifice speed. The architecture might not support arbitrary precision, but most languages have a bignum / bigdecimal library that will do it more slowly. It might be necessary to marshal or store those values in databases or over the wire in whatever hacky way is necessary (e.g. encapsulating the values in a string).
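A small sketch of that trade-off, assuming the rust_decimal crate (any bignum / bigdecimal library would do the same job):

```rust
use rust_decimal::Decimal;
use std::str::FromStr;

fn main() {
    // the fast path: hardware floats, which accumulate binary rounding error
    println!("{}", 0.1_f64 + 0.2_f64); // 0.30000000000000004

    // the slow path: a software decimal type that keeps exact base-10 precision
    let a = Decimal::from_str("0.1").unwrap();
    let b = Decimal::from_str("0.2").unwrap();
    let total = a + b;
    println!("{}", total); // 0.3

    // marshalled into a DB column or onto the wire as a string to avoid lossy floats
    let wire = total.to_string();
    assert_eq!(wire, "0.3");
}
```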
We had tens of thousands of lines in our rake files to build a bunch of targets, none of which were even Ruby. If I needed to build another complex build system that was a directed acyclic graph, I think I’d use Gradle, for several reasons - we had some Java targets so we’d save on an additional developer runtime, it would run faster, and Gradle is more mainstream and easier to get various plugins & documentation for.
It probably wasn’t a big deal when it was a niche project until Twitter imploded. Then all the public instances got overloaded with new users and the limits became obvious.
A better design is Lemmy, which is written in Rust so it scales far better. It’s compiled, and because it’s tokio / actix based it can also do a lot more stuff asynchronously, so it’s not spawning thousands of threads to cope with concurrent requests.
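Not Lemmy’s actual code, but a minimal actix-web sketch (made-up /post/{id} route) of the async model it builds on:

```rust
use actix_web::{get, web, App, HttpServer, Responder};

#[get("/post/{id}")]
async fn get_post(path: web::Path<u64>) -> impl Responder {
    let id = path.into_inner();
    // an await on a database or network call here hands the thread back to the
    // tokio runtime instead of blocking it, so a small thread pool can serve
    // thousands of concurrent requests
    format!("post {id}")
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().service(get_post))
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
}
```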
There is a lot of magic in Java. Try Spring Boot for example, and things magically connect together with annotations, or somehow methods get injected onto interfaces on the fly, or an HTTP endpoint maps onto a function with parameters because the runtime is doing it. This is most evident when you set a breakpoint in some class and there might be 4 or 5 mystery functions it passed through between it and where you thought it was calling from. SLF4J, Lombok and Hibernate are doing the same kind of thing.
I wrote extensively in Ruby, but only for Rake - using Ruby as a build system. Can’t say I liked the language, although it was okay for how we used it. We had 20 sub-projects with some very complex build targets and dependency scanning going on, and the Rake syntax was okay. Personally I think its biggest shortcoming was that the documentation was very poor, and stuff like gems felt primitive compared to other package management systems. One thing I liked from the language was that blocks could evaluate to a value, which I really use a lot in Rust too.
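For anyone unfamiliar, a quick illustration of the block-as-a-value idea on the Rust side:

```rust
fn main() {
    // a block is an expression: the value of its last expression becomes the
    // value of the block, so a scoped computation can be assigned directly
    let parity = {
        let n = 42;
        if n % 2 == 0 { "even" } else { "odd" }
    };
    println!("{parity}"); // even
}
```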
I think if I were doing an acyclic dependency build system these days I’d probably use Gradle.
As for Rails, I expect it failed to catch on because even compared to Python, Ruby is a slow language. And Python isn’t fast by any stretch. Projects that started with Rails hit the performance brick wall and moved to something else.
Cash is off the books so there is an incentive for certain kinds of businesses like tradesmen to take cash because it still works out cheaper since they don’t have to declare it to the taxman.
Not necessarily. It might be privacy but it could also be a combination of other reasons too - a cultural aversion to paperless transactions, a lack of regulation for electronic payments, lack of a decent indigenous payment system, lack of financial safeguards, prevalence of fraud / skimming devices etc.
Some European countries were more into electronic transactions than others but with stuff like SEPA, chip & PIN, contactless payments I think most people are just fine using electronic payment unless they have reason to control the transaction in some way. For example I usually pay pretty much everything electronically but I still pay taxis and most restaurants with cash. Also tradesmen if they’ll give me a discount for cash.
It doesn’t work like that. UTC always goes forward. Leap seconds are scheduled and known in advance. NTP time services will just smear time advancement a little to account for the additional second. Time never has to go backwards. This is how Google does it.
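A back-of-the-envelope sketch of how a linear 24-hour smear behaves (not Google’s actual implementation): each served second is stretched slightly so the extra second gets absorbed gradually, and the clock only ever moves forwards.

```rust
// the fraction of the extra second applied after `elapsed` seconds of a
// 24-hour smear window; it only ever grows, so time never steps backwards
fn smear_offset_secs(elapsed: f64) -> f64 {
    const WINDOW: f64 = 86_400.0; // 24 hours
    (elapsed / WINDOW).clamp(0.0, 1.0)
}

fn main() {
    // halfway through the window roughly half the leap second has been absorbed
    println!("{}", smear_offset_secs(43_200.0)); // 0.5
    // by the end of the window the full extra second has been served
    println!("{}", smear_offset_secs(86_400.0)); // 1
}
```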