
It obviously does, with reference counting. Otherwise programs would just balloon in memory.

Just yesterday, my colleague's Mac's Time Machine couldn't recover a backup and they had to reinstall everything.

But I think this predates Tahoe.


Silent corruption has been a feature of Time Machine for the last 19 years. But have you seen the new glass effects? Aren't they cool?

What about real workloads? Because as context gets larger, these local LLMs approach the useless end of the spectrum with regard to t/s.

I strongly agree. People see local "GPT-4 level" responses and get excited, which I totally get. But how quick is the fall-off as the context size grows? Because if it can't hold and reference a single source-code file in its context, the usefulness will absolutely crater.

That's actually the biggest growth area in LLMs: it's no longer about smarts, it's about context windows (usable ones, not spec-sheet hypotheticals). Smart enough is mostly solved; tackling larger problems is slowly improving with every major release (but there's no ceiling in sight).


The thing about context/KV cache is that you can swap it out efficiently, which you can't with the activations because they're rewritten for every token. It will slow down as context grows (decode is often compute-limited when context is large) but it will run.
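To put rough numbers on why swapping the KV cache matters, here is a back-of-the-envelope sketch. The model dimensions (32 layers, 8 KV heads, head dim 128, fp16) are hypothetical 7B-class numbers chosen for illustration, not taken from any specific model:

```go
package main

import "fmt"

func main() {
	// Hypothetical 7B-class model dimensions (assumptions for illustration).
	const (
		layers   = 32
		kvHeads  = 8 // grouped-query attention
		headDim  = 128
		fp16Size = 2 // bytes per cached value
	)
	// Both K and V are cached for every layer, hence the factor of 2.
	perToken := 2 * layers * kvHeads * headDim * fp16Size
	for _, ctx := range []int{4096, 32768, 131072} {
		fmt.Printf("context %6d tokens -> KV cache ~%5d MiB\n",
			ctx, ctx*perToken/(1<<20))
	}
}
```

On these assumptions a 128k-token context needs roughly 16 GiB of KV cache on its own, which is why being able to page it in and out, unlike the per-token activations, is the thing that keeps long contexts runnable at all.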

That should be covered by the harness rather than the LLM itself, no? Compaction and summarization should be able to allow the LLM to still run smoothly even on large contexts.

Sometimes it really needs a lot of data to work.

It did change. They bumped $200 on the entire line. So even the 16GB version is more expensive.

I'd love to have customers like Apple. Bumps $200: "it didn't change!!!"

And no power adapter included.


You mean bumped $100. M4 MacBook Pro and M5 MacBook Pro started at $1599 with 512GB SSD.

Now it starts at $1699, a $100 bump, but comes with a 1TB SSD. Previously it would have cost $1799 with the 1TB SSD, so it's a $100 bump on the base price, but you're also getting the 1TB SSD for $100 less than before.


To me, this is kind of like telecom providers giving you bandwidth headroom that realistically should have been there for a long time, but removing the option of a cheaper plan, whether you'd have paid for the upgrade or not.

Like for my last upgrade, I bit the bullet and upgraded to 1TB for the first time ever instead of base storage at Apple's absurd prices, so it's good. But if I hadn't been willing to spend money on that at all, they've still lifted the floor.

My cell phone plan has been increasing by small amounts every year, but my usage pattern hasn't changed. Meanwhile they've restricted HD streaming using deep packet inspection or whatever, so I theoretically have a 100GB full-speed cap but can't practically use more than 20GB anyway. They're pricing the bandwidth into the contract, but I can't save money by getting a lower ceiling.


> I'd love to have customers like Apple. Bumps $200: "it didn't change!!!"

Try making a good product that people love?


The base storage increased as well, and the upgrade prices for RAM are the same, which is where the real issue was.

> And no power adapter included.

To be fair, ever since the advent of high-power USB-C PD, that really, really is not needed any more; way too many power bricks are effectively e-waste.

People already have USB-C power bricks and docks everywhere and unlike pre-USB-C generations, you can use them not just across different generations of hardware, but across vendors as well.


I doubt that many people have high-power USB-C bricks unless they're upgrading from another USB-C laptop.

> unless they are upgrading from another USB-C laptop.

Which MacBooks have been for almost a decade - the 2016 MBP with Touch Bar was the first that went fully USB-C PD. Anyone who has had a MacBook in that time frame will have had at least one high power USB-C PD wall wart.

The Windows world, as usual, has been different, but even there, I'm not aware of any mainstream model being sold in the last two years without even a single PD capable port.


> It did change. They bumped $200 on the entire line.

I wonder if that would happen regardless of RAM, e.g. for tariffs etc.


The EU forbids them from including power adapters. They're still included everywhere else.

The EU doesn't forbid including one. The new law requires there to be an option without the adapter; if the manufacturer chooses, they can offer options both with and without.

I can buy a laptop right now close to home and it comes with power adapter.

Except that it's literally not true, and people keep repeating it for some reason; I assume you just never actually looked it up. Laptops are specifically excluded from that regulation, and in fact Apple does bundle a power adapter with their laptops, just not on the cheapest models.

> in fact Apple does bundle a power adapter with their laptops, just not on the cheapest models.

Here in the UK, they no longer include the power adapter even with the top models. I just specced out a fully-loaded M5 Max MacBook Pro, 128GB RAM, 8TB storage on the Apple Store, and it doesn't include a power adapter by default.

The 140W power adapter can be added as an option to the MacBook Pro for an additional £99 + VAT, or purchased separately. If you purchase separately you can of course choose a lower-power adapter for a lower price.

Now that a power adapter isn't included and you have to pay for it separately, it might make more sense to get one of the good brands of GaN power adapters instead, because they are smaller than the Apple ones for the same power, and have more ports.


>>Here in the UK, they no longer include the power adapter even with the top models

That's incredibly stupid (of Apple). I'm in the UK and literally got my M4 Max MacBook Pro delivered on Friday; it came with a power adapter.


Are you going to return it for an M5?

No, it's provided by my employer, so I don't really have that choice. And it's the 16-core M4 Max with 64GB RAM and 4TB storage; it's not really lacking in any way, it's a beast of a machine.

(But yes if I bought this with my own money I would have swapped lol)


Recycling when it could run Linux after Apple stops supporting it? That would be much better for the environment.

Which is slow and heavy in Rust. All languages have that, but faster (and simpler, due to no lifetimes).

cargo check is fast. It's only slow when the build goes through (barring extreme use of compile-time proc macros, which is rare and crate-specific).

I mean, as a first-order approximation, context (the key resource that seems to affect quality) doesn't depend on real compilation speed; presumably the agent is suspended and not burning context while waiting for compilation.

Which it obviously can't be, because it has an anemic standard library and depends on crates for basic things like error handling and async.

Not to mention it has one of the slowest compile times of recent languages, if not the slowest (Kotlin might be worse).


But there is no language that is best in all of these dimensions (including ones described above).

Everything is a trade-off.


Java has decade(s) of cruft and breaking changes which LLMs were trained on. It's hard to compare. Plus Go compilation speed/test running provides quick iteration for LLMs.

All you need is a JAVA_CODING_GUIDELINES.md with some hints about what kind of Java code you like the agent to write.

All you need is [to encode all of the lessons you’ve learned about style and best practices over your entire career into a single markdown file].

Sorry, couldn’t help myself.
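For what it's worth, such a file doesn't have to encode a whole career. A short hypothetical sketch (every rule here is invented purely for illustration) might look like:

```markdown
# JAVA_CODING_GUIDELINES.md (hypothetical example)

- Target Java 21; prefer records for immutable data carriers.
- Use constructor injection; never field injection.
- Throw specific exceptions; never swallow InterruptedException.
- Tests: JUnit 5, one behavior per test method, no Thread.sleep.
```

Even a handful of concrete rules like these tends to steer an agent away from the oldest patterns in its training data.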


> Plus Go compilation speed/test running provides quick iteration for LLMs

How is Java slow in any of these? Plenty of tricks e.g. AppCDS before the tests.


breaking changes? hardly.

Yes, breaking changes. And many ways to do the same thing because the language kept evolving (thankfully).

There is a decently long list of breaking changes now. Removing JavaEE modules from the JDK, and restricting sun.misc.Unsafe, are the ones people usually run into.

These are relatively small-scoped library changes only though.

Meanwhile Go already had a language change, while being less than half its age (loop variable capture).


A long enough list of small changes eventually equates to a big change. People generally can't update applications from Java 8 or below to a new one without code updates.

Mostly an automatic updater call (for javax->jakarta) and a few dependency bumps away in the majority of cases.

Plain vanilla java code is really backwards compatible, on both a syntax, and on a binary level.

You can often find decades old jars on some random university site, working with JDK 25 with no issue.


If Hadoop did it, so can you. I'm talking about a project that stretched Java 8 to, and arguably beyond, its intended operational boundaries. It's unlikely that you're leaning on those boundaries. It's the Spring Boot upgrades that will give you trouble.

They clearly did have Java version issues, as the different Hadoop versions list ranges of JDKs they're compatible with.

At the very least it is one less TSMC 3nm chip in the hands of competition.

So even if they break even, which I highly doubt, they would rather use it in a kids' tablet than let the competition use it to power a flagship phone.


C# + https://monogame.net

- Desktop: Windows, MacOS, Linux

- Mobile: Android, iOS, iPadOS

- Console: Playstation 4, Playstation 5, Xbox One, Nintendo Switch

It used to be XNA, but then Microsoft discontinued it and the community created the API-compatible MonoGame.

Notable games: Terraria (when it was XNA), Stardew Valley, Celeste, and Fez.


Note that many games have used MonoGame on console but FNA on PC instead.
