Hacker News | mort96's comments

That would be fine if I could put it in an M.2 slot. But all my computers already have RAM in their RAM slots, and even if I had a spare RAM slot, I don't know that I'd trust the software stack to treat one RAM slot as a drive...

And their whole deal was making RAM persistent anyway, which isn't exactly what I want.


Optane M.2-format hardware exists.

Interesting, all I ever saw advertised was that weird persistent kinda slow RAM stick. Does the M.2 version just show up as a normal block device or is that too trying to be persistent RAM?

Just normal (and fast) block storage.

IIRC it wasn't great because higher power == more heat, though.

That could be addressed with a small NVMe heatsink. They're readily available, and their use is already advised for NAND PCIe 4.0 and 5.0 drives; they would fit the Optane use just as well.

I never understood what they're meant to do. Intel seemed to picture some future where RAM is persistent; but they were never close to fast enough to replace RAM, and the option to reboot in order to fix some weird state your system has gotten itself into is a feature of computers, not a problem to work around.

When the persistent-memory DIMMs were used with an appropriate file system + kernel, it was pretty cool. NTFS + DAX + kernel support yielded a file system where mmap’ing didn’t page fault. No page faults, because the file content is already there, instantly.

So if you had mmap heavy read/write workloads… you could do some pretty cool stuff.


iOS is great! I especially like how they tied the "alarm through speakers" volume to the "notifications through earbuds" volume so that you can choose between "loud enough to wake you up in the morning and also burst your ear drums when you receive a notification with earbuds in" and "quiet enough to make notifications comfortable and also not wake you up in the morning". Truly genius parody of hostile UX.

Their audio router (if any) needs a rewrite

I particularly love how they will periodically choose to use the selected Bluetooth device only for audio output, and will switch the input back to the builtin microphone. The builtin microphone may be in my pocket or across the room, so the only indication I get is when the person on the other end of the line says that I’ve dropped off.

Nothing changes in the UI to indicate this, nor could I find any setting to change it. Sometimes swapping the audio away from the headset and back helps, but it is at best a temporary fix.


I can't see where the article defines how it measures "productivity". Is it just words produced per hour?

Journalism is, I imagine, much like programming: a lot of the words are "boilerplate" and cheap to produce, but those aren't the important parts of a story. Some of the words require a lot of work. Getting a direct quote from a relevant person. Doing the deep research to expose a claim as false instead of blindly parroting it. Getting multiple sources to voice contrasting views on a topic. Fact checking an article before publication.

I worry that whatever their definition of "productivity" is, it ignores these important yet time consuming aspects, and as such, what looks like "increased productivity" in their metrics is really just a decrease in quality.


LOC equivalent of the news!

Also the whole manufactured consent thing

I have absolutely no interest in RGB anything in my computer. Yet I've occasionally ended up with all these RGB parts -- RGB LED on my mouse, RGB RAM sticks, RGB GPU -- just because they were the best alternative right then and there. It's wild. It's at the point where you sometimes have to go with a worse price/performance option or otherwise suboptimal choice just to avoid the stupid useless little RGB LEDs.

Yeah, if people were still doing LAN parties, I’d want to bring the equivalent of a sleeper car. Maybe empty out that beige AT case with the turbo button on it.

That, and reusing old G4 or G5 Mac cases, is a very common project.

That's when black PVC tape is your friend. Just used that earlier today because some cretin decided that putting a 10,000lm blue LED strip on a bedside phone charger is a good idea.

(It's actually a very nice charger, except for that --ing LED strip).


Backlit keyboards are an OK idea, though..

The Starlink terminal can't know based on only its position which side it's being used by. Equipment is often used in enemy territory.

That is a tiny minority of the use. The vast majority of Russian use has been on Russian controlled land.

Sure. But if you geoblock all use on Russian controlled land, you're also blocking Ukrainian use on Russian controlled land. I have no idea if that would cause issues or not, but it's not that far fetched to imagine it might.

No, you can't argue that the best medium is dirt. Just like you can't argue that the best medium is vinyl.

But you could maybe argue that there are advantages to dirt (at least a hypothetical dirt which can be used as a musical medium somehow) which you lose by going to CD or vinyl. If this hypothetical dirt managed to be constraining in such a way that it produces kinds of musical works which would not have been produced for CD, is that not an advantage?


Yes, but in most well functioning large organizations, that happens very rarely.

I would have more faith in Raspberry Pi's own patched build of Chromium to do hardware acceleration properly on the Pi than I would have in Google's generic Chrome build.

But as far as I know, there has never been working HW-accelerated video in their build of Chromium. But yeah, I guess you have a point.

If enough people ask Broadcom to do VA-API on their platforms, getting that enabled in Chromium shouldn't be too hard. The caveat is that codec support is limited (no VP8/VP9/AV1).

A fully statically compiled Linux ARM64 binary which only interacts with the kernel through syscalls should run no problem on ARM64 Android. From the kernel's perspective, there is no difference between a "Linux binary" and an "Android binary" because the kernel in Android is Linux.

Most programs want to interact with various system libraries and system services though. Android and your typical desktop Linux system share pretty much nothing aside from the kernel.


Why is it easier to run a Linux ARM64 binary on Android than to run an Android ARM64 binary on Linux?

My guess is that it's for the same reason there aren't official, updated Android containers.


I don't know what you mean by an "Android ARM64 binary". If you make an ELF file containing ARM64 machine code, it doesn't matter to Linux whether you meant for it to run on Linux in an Android system, on Linux in a desktop GNU system, or on Linux in some environment without much of a userspace at all (such as a stripped-down initramfs environment).

If you mean something like an Android app, the answer is that there's a ton of system stuff that the app depends on, it interacts with more than just the kernel.



  CC=clang
  CXX=clang++
  $CC hello.c -o hello_android_c
  $CXX hello.cpp -o hello_android_cpp -static-libstdc++
  $CXX hello_asm.cpp -o hello_android_cpp_asm_syscalls_only -ffreestanding -nostdlib -fuse-ld=lld
  find . -name 'hello_android*' -exec readelf -l {} \;
But Go binaries don't require (Bionic) libc unless you compile with CGO_ENABLED=1.

Only if they restrict themselves to the officially supported syscalls; otherwise Android's seccomp filter will kill the application.
