LoL, an insane amount of things. TCP connections are an illusion of safety; for the purpose of database commits, use UDP packets as a model instead, it'll be much closer to reality.
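To make the "illusion of safety" concrete, here's a minimal sketch (my own example, not from the post): a TCP send() can report success even though the peer is already gone, because the kernel merely buffers the bytes and returns. A commit is only safe when the application gets an explicit acknowledgement back.

```python
import socket

# Set up a loopback TCP connection.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)

cli = socket.socket()
cli.connect(srv.getsockname())
conn, _ = srv.accept()
conn.close()  # server side goes away; the client has no idea yet

# send() copies the bytes into the kernel buffer and returns 6 ("success"),
# even though nobody will ever read them.
sent = cli.send(b"COMMIT")
print(sent)  # 6

cli.close()
srv.close()
```

The failure, if it surfaces at all, shows up on a *later* send as ECONNRESET or EPIPE, which is exactly why "send returned OK" is not a commit acknowledgement.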
I have been using Gentoo for 10+ years. You can save time using other distros, but you'll waste an insane amount of time as soon as you want to properly configure a YubiKey, set up a modern compilation stack, update specific drivers, or fix the kernel.
Compiling everything seems like overkill, but remember it's unattended time: just start the upgrade before dinner twice a week and you'll be good to go.
(Failures to compile in Gentoo are extremely rare, less than once a year.)
The upside is deciding exactly which packages you want and which versions, and whenever you want to customize (or fix) a specific thing, it's trivial.
I have been using KDE for 15+ years; except for 4.0, which was painful, everything has been a mostly smooth experience.
> However, KDE considered my TV the primary desktop and put the task bar only in that monitor, and even disabling the TV didn't add the task bar to my monitor.
You can order the screens however you want; the first one will be considered primary.
At least in the version currently in Debian, systemsettings has a "Primary" radio button on the screen configuration panel that lets you set it to whatever monitor you want, in whatever order you want.
Yes, but I assumed that disabling the TV would set the monitor as the primary desktop and add the taskbar to it, and it didn't. I may have done something wrong, but I was just reporting my experience.
It remembers the screens to try to keep your settings if you disconnect and reconnect external screens, but in this case that was not very helpful.
I always want the taskbar on every screen, personally. I think that'd be a friendlier default, but since it's KDE it's at least not too hard to change, and everything is configurable down to fine details.
If unplugging the display cable works, though, it's most likely the TV pretending to still be on.
I have an LG C1 TV that behaves like that, while my computer monitors do not have this issue.
The TV even has a dual personality: it doesn't appear to report the same EDID information when powered off vs powered on.
I also have an MS Windows 10 machine connected to this same TV, and if I make the mistake of powering up or waking Windows from sleep before turning on the TV, the NVIDIA GPU sets up some broken resolution, and only a reboot fixes it.
So my guess is that the TV presents a different EDID when off vs powered on, and also somehow presents itself as active on the HDMI line whether off or on. Changing the TV's input also doesn't tell KDE that the display was turned off.
I haven't debugged any of it. These are just my observations.
Author here: I didn't unplug the display; I went to the settings and disabled the TV. I'm not saying I didn't do anything wrong, but I expected that disabling the TV would make the monitor the primary display and move the taskbar to it.
Nice! On paper it looks really promising!
There is an actual need for embedded databases, as SQLite is painful for highly concurrent programs (I actually hit this issue in Rust).
Thanks! yeah, SQLite's write lock is painful for concurrent apps.
I'm comfortable with kernel development, so I brought some of those patterns here - RCU-style lock-free reads, per-CPU inspired sharded buffers, and io_uring for kernel-bypass I/O.
Would love to hear your thoughts if you get a chance to give it a spin :)
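For anyone who hasn't hit it: SQLite's single-writer lock is easy to reproduce. A minimal Python sketch (file names and setup are mine, not from the project) showing a second writer failing while the first holds the write lock:

```python
import os
import sqlite3
import tempfile

# Use autocommit mode so we can manage transactions with explicit BEGINs.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path, isolation_level=None)
writer.execute("CREATE TABLE kv (k TEXT, v TEXT)")
writer.execute("BEGIN IMMEDIATE")  # take the write lock and hold it

# Second connection: give up after 100 ms instead of the 5 s default.
other = sqlite3.connect(path, timeout=0.1, isolation_level=None)
try:
    other.execute("BEGIN IMMEDIATE")  # blocked by the first writer
    locked = False
except sqlite3.OperationalError as e:
    locked = "locked" in str(e)

print(locked)  # True: "database is locked"
```

Readers can still proceed (especially in WAL mode), but only one write transaction can be open at a time, which is exactly what hurts in highly concurrent apps.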
They recently released a huge bump of their AI package; they probably wanted to differentiate between the older and newer versions. Still, they could have renamed the plugin, or at the very least marked older comments as "for an older version".
I actually pushed a fix for the sound card upstream back in kernel 5.13, and it has been working great. And yeah, don't expect a Wacom-level pen; it doesn't come with tilt. I never trusted/tried sleep on Linux (it could corrupt the FS), but everything is working great on my side.