Silicon Chip Magazine ran a competition in 2021 to build a noughts-and-crosses machine, based on one that Australian electronics legend Dick Smith built from parts from an electromechanical phone exchange. They ran a series of articles on it, including an electromechanical one of which, unfortunately, only the first page is available online, although it gives you an idea: https://www.siliconchip.com.au/Issue/SC/2024/March/Electrome.... If you can find the articles, there's a lot of detail in them on how to do it with minimal circuitry. Someone had also done it with relays a few years before Dick Smith: https://www.vintagecomputer.net/cisc367/Radio%20Electronics%.... There's an even earlier one very briefly mentioned in this 1949 newsreel: https://www.youtube.com/watch?v=SlNxBb_27CA
"In case it is not already obvious, efficiency and sensibility were not a top priority when working on this project. I am sure there are more efficient flip-flop designs or implementations with fewer transistors, especially by building composite gates that combine NAND and NOR gates, but I don't really care :)"
There is absolutely no need for this to exist in physical form. It is perfectly alright to run this as a simulation in Logisim. Optimize as long as you desire, and once that is done, implementing it in physical form is just a matter of assembling it, which can mostly be done by the many on-demand PCB printing and soldering services.
Looks like a tiny analytic transformer. An RNN is arguably a better choice if you are going to hand-wire an architecture to mechanically do addition. Learning is about discovering patterns and algorithms from data; wiring a machine to follow a fixed procedure defeats that purpose.
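To make the "hand-wired addition as a recurrence" point concrete, here's a minimal sketch (my illustration, not from the project): the carry plays the role of an RNN's hidden state, updated digit by digit. Nothing is learned; the rule is wired in.

```python
def add_digits(a: list[int], b: list[int]) -> list[int]:
    """Add two base-10 numbers given as digit lists, least-significant digit first."""
    carry, out = 0, []
    for da, db in zip(a, b):
        s = da + db + carry   # combine current inputs with the hidden state (carry)
        out.append(s % 10)    # emit the output digit
        carry = s // 10       # update the hidden state
    if carry:
        out.append(carry)
    return out

# 17 + 35 = 52, with digits stored least-significant first
print(add_digits([7, 1], [5, 3]))  # [2, 5]
```

A learned RNN would have to discover this carry-propagation rule from examples; here it is simply baked in.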
For a real-world use case, you would need an FPGA with terabytes of RAM. Perhaps it would be off-chip HBM, but for large models even that won't be enough. Then you would need to figure out an NVLink-like interconnect for these FPGAs, and we are back to square one.
This is new. You are citing FPGA prototypes, but those papers do not demonstrate the same class of scaling or hardware integration that Taalas is advocating. For one, the FPGA solutions typically use fixed multipliers (or lookup tables), while the ASIC solution has more freedom to optimize routing for 4-bit multiplication.
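As a rough illustration of the lookup-table approach mentioned above (my sketch, not from the cited papers): a 4-bit x 4-bit multiply only has 256 possible input pairs, so it can be a precomputed table instead of multiplier hardware, which is essentially what an FPGA's LUT fabric does.

```python
# Precompute all 16 x 16 products; index is (a << 4) | b, i.e. a*16 + b.
LUT = [a * b for a in range(16) for b in range(16)]

def mul4_lut(a: int, b: int) -> int:
    """Multiply two 4-bit values by table lookup, with no multiplier logic."""
    assert 0 <= a < 16 and 0 <= b < 16
    return LUT[(a << 4) | b]

print(mul4_lut(7, 9))  # 63
```

An ASIC, by contrast, can lay out a dedicated 4-bit multiplier array and route it however is densest, which is the freedom the comment is pointing at.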
I understand what Taalas is claiming. I was trying to point out that putting a model on hardware is not something new or unthought of; the natural progression from FPGA is ASIC. The Taalas process is more expensive and not really worth it, because once you burn a model onto silicon, that silicon can only serve that model. The speed improvement alone is not enough for the cost you will incur in the long run. GPUs are still general purpose; FPGAs are at least reusable, but won't have the same speed. This alone cannot be a long-term business. Turning a model into hardware in two months is too long. Models already take quite a long time to train, so anyone going down this strategy would leave the field wide open to their competitors. Deployment planning for existing models is already complicated enough.
Boston Dynamics is far behind, and these robots are so cheap that even the robot dog undercuts BD's. I don't think BD's humanoid can catch up to this price. I am sure the US Army, and on the Chinese side the Chinese army, will be their biggest customers. But I wonder how this will work out in situations like plane hijackings, firefighting, and other places where human lives can't be risked to save more human lives. (Please don't downvote just because your American patriotism is poked; try replying instead.)
People will only build it by hand if there is a market for it. Otherwise it'll mostly be hobby software built for learning and entertainment (open source). Businesses don't mind factory software. Although the cost of developing software is going down and will be almost free, at that point the price is really one of taste: the best-designed, easiest-to-use interface. My guess is there will be some people, even in the future, who prefer custom, handmade, artisanal software written entirely by hand; those will probably be artisanal software collectors. Maybe there will be galleries in the future that auction this software as art pieces to be demonstrated in museums. These will be rare events. Most commercial software will be factory made.
No, it’s actually the math of overwriting. Imagine you hiked down into a valley (task A) and settled there. Then you decide to climb over a mountain to find a different valley (task B). You successfully move to the new valley, but in doing so you destroy the path back to the first one. You are now stuck in the new valley and have completely 'forgotten' how to get back.
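The overwriting can be shown in a few lines of numpy (a toy of my own making; the tasks and targets are invented): a single shared weight is trained on task A, then on task B, and plain SGD on B destroys the solution for A because both tasks write to the same parameter.

```python
import numpy as np

np.random.seed(0)

def train(w, slope, steps=200, lr=0.1):
    """SGD on the squared error of a one-weight linear model f(x) = w*x."""
    for _ in range(steps):
        x = np.random.randn()
        w -= lr * 2 * (w * x - slope * x) * x  # gradient of (w*x - slope*x)**2
    return w

def loss(w, slope):
    xs = np.linspace(-1, 1, 50)
    return float(np.mean((w * xs - slope * xs) ** 2))

w = train(0.0, 2.0)            # hike down into valley A (target slope 2)
loss_a_before = loss(w, 2.0)   # essentially zero: task A is solved
w = train(w, -3.0)             # climb over to valley B (target slope -3)
loss_a_after = loss(w, 2.0)    # large: the same weight was overwritten

print(loss_a_before, loss_a_after)
```

Nothing in plain SGD penalizes moving away from the old solution, which is why continual-learning methods add explicit protection for previously important weights.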
Not exactly; not at all, even in terms of the way LLMs are trained.
In RL it can be that you are no longer getting meaningful data because you are 'too good': you no longer get the "this is a bad answer" signal, so you can't estimate the gradient.
I think the ingredient silphium described in this dish (now considered extinct) could be sea holly (Eryngium spp.). It's highly debated, as many authors think it is some extinct variety of fennel, but from the images on the coins it doesn't look like a fennel.
The best explanation I've heard is that it was a sterile hybrid of two Ferula species. Many Ferula species have a long history of mythology behind them. Asafoetida (aka hing) is probably where the heart symbol came from (its roughly heart-shaped root was used as an aphrodisiac).
Silphium similarly had much demand as an aphrodisiac.
This hybrid likely grew in the African Mediterranean and the high demand for it, alongside its inability to reproduce through seed, is probably what led to its extinction.
Could be, but the central bulb as depicted on the coins is unlike a fennel's (https://en.wikipedia.org/wiki/Silphium), and since this imaginary recipe is part of a comedy, it is unlikely to be edible. If you look at the other ingredients, they could surely make someone sick.
Don’t worry, big companies still can’t copy anything quickly, even with AI. Why? Because before they can ship a single feature, they’ll need to schedule 42 alignment meetings, debate AI-generated slide decks, and log their “strategic pivots” into an AI-curated Jira board.
The real moat isn’t just code; it’s speed, focus, user trust, and the ability to actually ship. Those are things bloated orgs struggle with, with or without AI. If you’re solving a real problem and building a real brand, you’re already ahead.