Not necessarily. Think you're safe because you're on a 64-bit system? What if the binary you're running was compiled for a 32-bit target? What if it's part of a cloud service you're not even aware of?
As a real example: my parents' house uses a simple computer to automate watering different parts of their garden. I looked the chip up, and it's a 32-bit system. So that will definitely stop working in January 2038.
This is just the tip of the iceberg. 2038 is going to be a big issue.
It's not as simple as that. I had a C library using 64-bit time_t on 32-bit CPUs back in the early 1990s. 32-bit CPUs are not incapable of doing 64-bit arithmetic. You cannot infer the bitness of time_t from the bitness of the CPU.
It doesn't run 64-bit code, but that has nothing to do with handling 64-bit numbers. The compiler just generates more instructions (and execution is slower). Example for multiplication:
https://godbolt.org/g/m1t4NC
Nuclear power plants run critical systems mostly on analog and sometimes PLD devices (much easier to formally verify). They also generally don't care about date and time, and they have rigorous testing and verification. Nuclear plants are not something to worry about for a 2038 bug. It's the industries that don't employ traditional engineering practices that are worrisome.
...which control at least one plant in Canada (though recently it was replaced with an emulation).
And apparently they solved the 16-bit data problem, but maybe not the 32-bit date problem.
"A current generation DCC system resides at L3 MAPPS' main Montreal, Canada facility and is used to provide support to all participating COG members until 2035."
It really varied, and depended a lot on the culture of the company itself. Some shops do anything their devs want, while others simply ignore them and do whatever some exec decided.
The thing that interested me most was finding out the reason why. Sometimes it was ridiculously silly.
Aren't you afraid of radiation at all? I've used this app for a few months, but actually stopped because I didn't like having the device so close to my head for such a long period of time.
Been doing software for 20 years now, and I think the best advice would be to keep things simple.
Keep your software design simple, your code simple, your tools simple. Sure, it's nice to use new and shiny tech for your side projects, but real software should be done simply and efficiently.
Simple trumps everything else, but it takes a real master to know how to pull it off, especially in the context of complicated software requirements.
During my career I've met very few programmers who are capable of doing this well. It's the true art of the profession IMO.