Den_VR's comments | Hacker News

Since 22C3 I've really enjoyed watching online and chatting about it with a small IRC community. I had this notion that if I ever lived in Europe I'd go myself. Well, for the last three years it seems I haven't gone - the ticket situation was a shock at first, but it makes sense. The number of unrecorded talks does feel like it's gone up though, which has been regrettable.


If you're associated with any hacker group, you should ask them whether they're included in the pre-sale round. You can get a ticket that way if they don't run out.


Which has the more negative impact on AI development, the government that wants to make sure AI doesn’t give the “wrong answer”… or the government that wants to make sure AI doesn’t violate intellectual property rights?


Three things, not all of which any specific employee does:

1. Fix security issues

2. Create "features" the world was better off without, in order to seem useful

3. Rest on the laurels of Gmail from 15 years ago


As I recall, Gartner made the outrageous claim that upwards of 70% of all computing would be "AI" within some number of years - nearly the end of CPU workloads.


I'd say over 70% of all computing has already been non-CPU for years. If you look at your typical phone or laptop SoC, the CPU is only a small part. The GPU takes the majority of the area, with other accelerators also taking significant space. Manufacturers would not spend that money on silicon if it were not already being used.


> I'd say over 70% of all computing has already been non-CPU for years.

> If you look at your typical phone or laptop SoC, the CPU is only a small part.

Keep in mind that die area doesn't always correspond to the throughput (average rate) of the computations done on it. That area may be allocated for higher computational bandwidth (peak rate) and lower latency. In other words, the goal is to get the results of a large number of computations faster, even if it means the circuits idle for the rest of the cycles. I don't know the situation on mobile SoCs with regard to those quantities.
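
As a back-of-envelope illustration (the peak rates and duty cycles below are made-up numbers, not measurements), a wide accelerator with a much higher peak rate can still do less sustained work than a smaller, busier CPU:

    # Illustrative sketch with assumed numbers, not measurements.
    # Sustained throughput = peak rate * fraction of cycles the unit is busy.
    def sustained_gflops(peak_gflops: float, duty_cycle: float) -> float:
        return peak_gflops * duty_cycle

    cpu = sustained_gflops(peak_gflops=100.0, duty_cycle=0.60)   # modest peak, busy most of the time
    npu = sustained_gflops(peak_gflops=2000.0, duty_cycle=0.02)  # huge peak, idle most of the time

    print(f"CPU sustained: {cpu:.0f} GFLOPS")  # 60 GFLOPS
    print(f"NPU sustained: {npu:.0f} GFLOPS")  # 40 GFLOPS

The accelerator still wins on latency for its bursts, which is the point of spending the die area - but area alone doesn't tell you where most operations happen.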


This is true, and my example was a very rough metric. But the computation density per area is actually way, way higher on GPUs compared to CPUs. CPUs only spend a tiny fraction of their area doing actual computation.


> If you look at your typical phone or laptop SoC, the CPU is only a small part

In mobile SoCs a good chunk of this is power efficiency. On a battery-powered device, there's always going to be a tradeoff between spending die area making something like 4K video playback more power efficient and spending it on general-purpose compute.

Desktop-focussed SKUs are more liable to spend a metric ton of die area on bigger caches close to your compute.


Going by raw operations, that's probably true for desktops and laptops if the given workload uses 3D rendering for the UI. Watching a YouTube video is essentially the CPU pushing data between the network, the GPU's video decoder, and the GPU-accelerated UI.


Looking at home computers, most of "computing" when counted as FLOPS is done by GPUs anyway, just to show more and more frames. Processors are only used to organise all that data to be crunched by the GPUs. The rest is browsing webpages and running Word or Excel several times a month.
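
A rough sketch of that share (the peak figures and utilizations are assumed round numbers for a mid-range desktop, not benchmarks):

    # Assumed round numbers, not benchmarks: a mid-range discrete GPU peaks
    # around ~10 TFLOPS FP32, while a desktop CPU with wide SIMD peaks well
    # under ~1 TFLOPS.
    gpu_peak_tflops, cpu_peak_tflops = 10.0, 0.8
    gpu_util, cpu_util = 0.5, 0.3  # assumed average utilization while gaming

    gpu_work = gpu_peak_tflops * gpu_util
    cpu_work = cpu_peak_tflops * cpu_util
    print(f"GPU share of raw FLOPS: {gpu_work / (gpu_work + cpu_work):.0%}")  # ~95%

Even with the GPU only half busy, it accounts for the overwhelming majority of raw floating-point work.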


There are a lot of old feds who can afford the area because of how it was priced 25-40 years ago. That's how stable some federal jobs and careers were.

Now, talk to a barista in DC and the solution is 4-5 roommates. Not unfamiliar to those in the Bay Area, but with less upside.


Less upside… unless your goal is to mix it up with politics rather than the tech machine. Not everyone is you, or us.


I don’t think so, no.


The talent that would do such a thing gets acquihired while the competition meets an early end.


If there’s more work than resources, then is that low value work or is there a reason the business is unable to increase resources? AI as a race to the bottom may be productive but not sure it will be societally good.


Not low-value, or it just wouldn't be on the board. Lower value? Maybe, but there are many, many reasons things get pushed down the backlog - as many reasons as there are kinds of companies. Most people don't work at one of the big tech companies where work priorities and business value are so stratified. There are businesses that experience seasonality, so many R&D activities get put on the backburner until the busy season is over. There are businesses with high correctness standards, where bigger changes require more scrutiny, are harder to fit into a sprint, and end up getting passed over for smaller tasks. And some businesses just require a lot of contextual knowledge. I wouldn't trust an AI to do a payroll calculation or tabulate votes, for instance, any more than I would trust a brand-new employee to dive into the deep end on those tasks.


Watson X is still a product line sold today to qualified customers :)


It’s computer fraud by the operator and wire fraud by the client…


It's ultimately spammers and scammers fighting against each other, so you could say the trash is taking itself out. Every dollar they spend fighting is one less dollar spent spamming actual users, so it's a win.

