Very neat! I'm wondering whether it could also recognize multiple fingers and thus sort of emulate multitouch gestures, although my guess is that latency is already not great with one finger, so this might not work well.
Also interested to see how it works with different lighting, and how dirty the screen can get before recognition fails because the reflection is too "muddled" ;)
That is video RAM for the GPU, though, not the main RAM for the system. The minimum amount of RAM you can buy the iMac Pro with is 32 GB; 64 GB and 128 GB configs are also available.
Also, 18 GB was certainly a typo; if you look now, they are saying 16 GB.
I guess it could very well be that the mentality of Apple and Google employees leans somewhat into those philosophies.
On the other hand, the Apple culture you describe sounds to me a heck of a lot more sustainable in the long run (I mean, Apple has been around for nearly 40 years now), especially since the "become a bazillionaire" part at Google really only works for founders and early employees with enough stock, and only at a select few unicorn companies.
Maybe another reason why Apple has spawned comparatively few startups is that some of Apple's (and its employees') areas of expertise are harder to pursue in startup-sized companies; namely hardware engineering and mass-manufacturing innovation.
Apple does a lot of work in hardware and hardware manufacturing, both areas that are pretty capital-intensive and might not lend themselves as easily to the startup world.
Say you are an Apple engineer working on the Ax chips for the next iPhone and you have an idea for something great in CPU design: you cannot exactly rent a scalable 14 nm chip fab from AWS to build it on your own and sell it on the market.
Now "an idea for something great in CPU design" is unlikely to result in a marketable chip these days, but chip startups exist, and some are very small. Adapteva, for instance, is a five-person company.
I was pondering the viability of almost exactly this. Could one invest in a hardware pipeline that efficiently combines e.g. 3D printing with FPGAs to create the consumer-electronics equivalent of Lulu.com's just-in-time, small-batch book printing and drop-shipping?
Or, in other words: you can do pretty much anything with the sensors in a modern smartphone. All most manufacturers probably need is smartphone boards (with maybe some options of extra sensors) in custom cases with fancy buttons and displays, maybe custom remotes, and a cute little 4-colour double-walled box. That describes everything from Nest to Roku to some drone controllers.
You're probably better off using something like Samsung's Artik (https://www.artik.io/) that is basically cellphone-grade integration using existing high-density components.
There's a risk/flaw/gap in that plan. Modern consumer electronics depend on humongous volume discounts (10x or even 100x) on parts, plus tremendous economies of scale in design & assembly.
JIT small-batch assembly seems roughly analogous to other low-volume electronics, which are obscenely expensive. For example, specialty bench equipment like spectrometers can cost as much as a house. There are many factors at play there, but you can bet one of them is very low volume.
This is why, at least so far, the kind of play you describe is accomplished with large batches of a generalizable platform that can be sold into many different devices.
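To make the volume-discount point concrete, here's a toy back-of-the-envelope model. All the numbers and the discount curve are made up purely for illustration; the shape of the result is what matters:

```python
import math

def unit_cost(base_bom: float, fixed_nre: float, volume: int,
              discount_per_decade: float = 0.35) -> float:
    """Unit cost = discounted BOM + amortized NRE.

    BOM: bill-of-materials (parts) cost per unit.
    NRE: non-recurring engineering (tooling, design, line setup),
    spread across every unit produced.

    As a crude stand-in for the 10x-100x part discounts mentioned
    above, each 10x increase in volume cuts the BOM by
    `discount_per_decade` (35% here, an invented figure).
    """
    decades = math.log10(max(volume, 1))
    bom = base_bom * (1 - discount_per_decade) ** decades
    return bom + fixed_nre / volume

# Same hypothetical product, two production volumes:
small_batch = unit_cost(base_bom=40.0, fixed_nre=50_000, volume=100)
mass_market = unit_cost(base_bom=40.0, fixed_nre=50_000, volume=1_000_000)

# Small batches pay hundreds per unit (NRE dominates);
# mass production pays a few dollars per unit.
print(f"100 units: ~${small_batch:,.2f} each")
print(f"1M units:  ~${mass_market:,.2f} each")
```

Even with identical engineering effort, the small-batch unit costs two orders of magnitude more, which is roughly the spectrometer-vs-smartphone gap described above.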
At Plethora, we're building the mechanical pieces of this, starting with full-auto, on-demand CNC milling as fast as Lulu or similar, then adding more capabilities.
Happy to make some free parts for anyone's ideas on here!
Ha - yeah, we're not the first machine shop, but we're doing something really new with regard to instant pricing/feedback (inside your CAD), speed, and much more to come.
> ...harder to do in startup-sized companies; namely hardware engineering and mass manufacturing innovation.
Don't forget that all the hardware companies you know today were once startups. Yes, there's a current fashion to call companies that are basically small businesses "startups", but the term encompasses a far more profound class of enterprise.
And Apple itself was founded by an engineer (Woz) who first got his start at a big company, HP, which is celebrated as a hardware startup that grew.
I think the article and your comment do get to an important point, which is that Apple is big enough to have solid (and at the moment very successful) processes and infrastructure so that you only get experience in that matrix, and don't have to spend any time learning stuff outside your own area. You can, of course, and plenty of companies have come from Apple alums.
Clearly hardware startups aren't impossible, but we are talking about relative difficulty. The article is only comparing them to software companies (Google, Yahoo, PayPal).
I also don't think it's right to compare now with the 1980s, when F500 companies didn't know how to compete in the new tech hardware space. Today, companies like Samsung rush into new product spaces very early in the adoption cycle.
> Don't forget all the hardware companies you know today were once startups.
Unfortunately the winds of VC have changed a lot since then. Originally, VCs were willing and eager to invest millions in a promising innovation. Now they expect you to demonstrate traction before any large investments—which is infinitely harder with hardware startups.
Not only that, but software startups didn't used to be so cheap either. 15 years ago you needed to raise millions just to buy servers. Granted, that wasn't the case in the early days of software (Microsoft didn't need servers when they were a startup), but it's an interesting dynamic in the internet age.
You are wrong. At that time, even rich universities had only a couple of mainframes for the whole institution; poorer ones had one or none.
All the students who were entitled to mainframe access typically worked on a single mainframe, which was also used for the university's accounting, etc.
Mainframes were, in more modern terms, effectively the big Servers with a capital S.
A hobbyist keyboard alone (without the TV) cost, in 1975, the equivalent of nearly 1,000 of today's US dollars.
Don't lecture me; you are not giving me any new information. Yes, having access to a mainframe was incredibly expensive and difficult. Yes, Microsoft leveraged that access.
But were they selling Altair BASIC running on said mainframe? No, they were not; hence they did not have servers. They used a mainframe as a dev box.
You can't redefine my terminology just to suit your own viewpoint.
Really? I get the strong impression (what with people throwing the word bubble around and making regular comparisons to the dotcom era) that VCs are far less picky today than they were 10 years ago.
Hardware startups are still being funded. Software startups are still being funded. Life science startups are still being funded. And it's a lot of the same old-line VCs who are doing these deals. I'm thinking of $15M-$20M first rounds, for companies that need significant funding but can get quite large.
Then there are a ton of small companies, many of which are really small businesses (just look at the list of companies at https://news.ycombinator.com/item?id=9666013), that are getting small amounts of funding ("spray and pray" funds). Are their funders really "VCs" in the classic sense when the amount they invest is <= $100K and they can't participate in future rounds? Although they get most of the press for "startups" and "VCs", and they are the volume in an absolute sense, most of these seem designed to run for a little while and then be acquihired away. So there's no need to build most of the infrastructure for a sustainable business. In that role, the "VCs" are really more like agents getting a commission on the acquihire.
It's the latter that aren't particularly picky. The traditional VCs seem mostly to still look for the same things.
While it's true that hardware companies were also once startups, they were startups at a different time. Please correct me if I'm wrong, but weren't they startups when there were no existing hardware companies operating at the scale the current big hardware companies do?
Taiwan? I mean, Motorola and even AMD have sold off their fabs to companies who then provide capacity to the highest bidder. They do trail in process, though, so 14 nm might not be available for a couple of years.
I am obviously not Jeff Bezos but I asked myself the same question.
Could such an extreme outsourcing service (not necessarily only fabricating chips) paired with the accessibility of AWS be viable? Does such a company exist?
To some extent, yes, and a few of those companies already exist, with names like Xilinx and Altera. Their FPGA products are available at 22 nm now. But they're not of much use when it comes to selling cheap commodity hardware in volume, of course.
True, and it's telling that maybe the biggest break-out startup from Apple was Nest. For that very reason, I'm surprised that there aren't more creatively stifled Apple hardware engineers pursuing IoT startups.
> Apple does a lot of work in hardware and hardware manufacturing, both areas that are pretty capital-intensive and might not lend themselves as easily to the startup world.
There is some truth to that but it's not the custom chip manufacturing that drives it. It's not even what I guess I would call 'excellence in material design' that drives it. Really it's a combination of the choice to not cut corners on hardware combined with the scale and market position to support the cost.
As an example, a lot has been made in the past about Apple's chargers, both laptop and iDevice: they are small, sleek, and powerful for their size. There isn't anything special going on there from an electronics perspective; they didn't invent new types of power conversion. They just didn't cut every possible corner to reduce cost, and they had the budget to allow custom parts when off-the-shelf didn't fit. I don't mean custom ICs here; I mean custom metal fab, maybe custom caps that are 'normal' other than being shaped a bit differently, etc. Apple has the volume and price point to let them do this when the vast majority of companies don't.
As far as I remember, Nintendo always did that, selling each console at a profit, however small, from the NES to the Super NES, N64, GameCube, and now the Wii (and, I guess, soon the Wii U).
So Nintendo was the only manufacturer that did not apply the "razorblade" model of the other console makers, which is understandable: they were and are a video game company only, and thus never had other branches that could have subsidized their console business in the beginning (unlike, for example, Microsoft).
I think the first console Nintendo made that actually sold at a loss is the current 3DS handheld, and even that may only be after the very fast initial price cut that followed its slow market reception.