I've always been curious how far we could push "mechanical computation." Seems like even an operation as simple as multiplication requires tons of metal. If I wanted to compute, say, a SHA2 hash or an Ed25519 signature with zero electricity, would I need a room-sized machine?
For sure - at least with the parts in their current form. A simple flip-flop takes up a minimum space of about 30 cm x 30 cm. But I wonder how small these parts could get. Like, what if spintronics had been invented in the 19th century instead of the 21st? Would Moore's law have applied to mechanical transistors?
Since I just now learned about that link, I haven't read the book, but I've always been interested in finding out whether you could create smaller and smaller machines by having an outer machine manufacture an inner, smaller copy of itself. Apply the process of induction, define the termination criteria, ..., profit!
Or, maybe I'm thinking about the problem all wrong -- it's not the actual construction machinery that's the hard part, it's providing the input materials at each step (gears, levers, fasteners, wiring(?), etc.)
There's a Factorio-clone hiding in this problem ...
The issue is that scaling does not produce linear effects as you go down (or up) for a number of reasons. What works at the meter scale doesn’t work at the millimeter scale, which doesn’t work at the micrometer scale, etc.
So you end up having to learn and experiment at more and more difficult-to-access scales to figure out how to make something actually work.
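To put rough numbers on that, here's a toy square-cube-law sketch in Python (the unit cube and the factor-of-1000 steps are illustrative choices of mine, nothing more):

    # Toy square-cube law: shrink a cube-shaped part by a factor s and
    # watch which properties scale linearly and which don't.
    def scaled_properties(length_m, scale):
        L = length_m / scale
        return {
            "length (m)": L,
            "area (m^2)": L**2,             # falls off as 1/s^2
            "volume ~ mass (m^3)": L**3,    # falls off as 1/s^3
            "surface/volume (1/m)": 6 / L,  # grows as s
        }

    for s in (1, 1e3, 1e6):  # meter -> millimeter -> micrometer
        print(f"scale 1/{s:g}:", scaled_properties(1.0, s))

Mass (and with it inertia) falls off as 1/s^3 while surface area only falls as 1/s^2, so as you shrink, surface effects like friction, stiction, and adhesion come to dominate the inertial behavior your meter-scale intuition was trained on.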
The reality is that it turned out to be easier to make things with lithography, and we don't need to pantograph our way to the bottom (whew!).
Many cell phones now have sensors that are MEMS-based, built using lithography (accelerometers being the best example). In many senses, we've started to achieve the goals of the book.
I did enjoy Diamond Age, maybe I should reread it! It was the first time I had ever heard of "reversible computing" and (ahem) I still don't understand it, but it's good to know such a thing exists
I'm about 175 pages into that PDF and am now sorry that I drew attention to it. I was beguiled by the name recognition and the snazzy title, but I find the text filled with hand-wavery and aspirational thinking, and it also seems to focus a lot more on DNA than I would have expected
I also find their aspirations suspect: that any such machinery could ever possibly exist to just tweezer atoms around like marbles and, voila, gold from lead!
> I also find their aspirations suspect: that any such machinery could ever possibly exist to just tweezer atoms around like marbles
We can already push atoms around with macro-scale actuators that have nano-scale accuracy (which is clumsy, to be sure), and there is little doubt that the hardware to do so will get smaller and more capable over time.
Thanks for the recommendation.
That cover, though: is that topology specific to coronaviruses, or do other viruses share it? I ask especially since the Pfizer/BioNTech and Moderna vaccines deploy nanoparticles for delivering their payload.
I was blown away (no pun intended) by the mechanical analog computers used in fire control systems on battleships:
https://arstechnica.com/information-technology/2020/05/gears...
This was 3,000 pounds of machinery for calculating a shell trajectory (with a good number of parameters).
If you could build mechanisms atom-by-atom, you could make reversible mechanical computers that are orders of magnitude faster than what we have today.
Rod logic will not be faster than electronic computers. According to Drexler's thesis, it's reasonable to expect "that RISC machines implemented with this technology base can achieve clock speeds of ~ 1 GHz, executing instructions at ~ 1000 MIPS."
This is because the speed of sound, which limits how fast mechanical signals can propagate, is much lower than the speed of light.
The main advantages of rod logic are that it's compact and power efficient. The aforementioned CPU would consume ~100 nW.
Really, the reason Drexler analyzed rod logic in the first place is that it was easy to analyze and something his proposed assemblers could plausibly construct; better alternatives for fast computing may exist.
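To put rough numbers on the speed-of-sound point (a quick sketch; the ~1 micron core size and the diamond sound speed are ballpark assumptions of mine, not figures from the thesis):

    # How long does a mechanical signal take to cross a ~1 micron
    # nanocomputer core, compared to an electromagnetic one?
    v_sound_diamond = 17_500   # m/s, rough speed of sound in diamond
    v_light = 3.0e8            # m/s
    span = 1e-6                # assumed core size: 1 micron

    t_mech = span / v_sound_diamond
    t_elec = span / v_light
    print(f"acoustic crossing: {t_mech:.1e} s (~{1 / t_mech / 1e9:.0f} GHz ceiling)")
    print(f"electromagnetic:   {t_elec:.1e} s ({t_mech / t_elec:.0f}x faster)")

And that tens-of-GHz figure is only the propagation ceiling; the rods themselves have to accelerate, move, and stop well below the speed of sound, which is roughly how you land at Drexler's ~1 GHz.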
> This is true, but it's important to consider that you could squeeze several billion of these processors into the space taken up by current CPUs.
You're implying that parallelization can make up for the slower clock speeds, which is true, but only for some workloads; and even then the system is constrained by the bandwidth needed to get instructions and data to the parallel cores fast enough.
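Rough numbers on that bandwidth wall (all of these are figures I assumed, just to show the shape of the problem):

    # If you packed a billion of Drexler's ~1000 MIPS cores into a
    # chip-sized volume, what fetch bandwidth would keep them fed?
    cores = 1e9
    mips_per_core = 1_000
    bytes_per_instruction = 4   # assumed

    instr_per_sec = cores * mips_per_core * 1e6
    bw = instr_per_sec * bytes_per_instruction
    print(f"{instr_per_sec:.0e} instructions/s -> {bw / 1e15:.0f} PB/s of fetch bandwidth")

Thousands of petabytes per second, against the hundreds of GB/s a modern memory interface actually delivers; unless code and data stay local to each tiny core, the interconnect is the machine, not the cores.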
Sorry that HN's software rate-limited your account! New accounts are subject to a few extra restrictions, and it always makes me sad when a project creator shows up and gets hit by those (I'm a mod here). Yours is not at all the kind of account we're trying to restrict!
I've marked your account legit so this will not happen to you again, and I've approved your comments that got throttled, so they're up now. Welcome to HN and congratulations on this exceedingly cool work.
Seeing that nynx hinted at reversible computing: they'd just be smaller and more energy efficient, the idea being that you can cram more of these into a given volume.
Reversible computing tries not to destroy information, which allows it to go below Landauer's limit [1].
When you discard the previous value held by your flip-flop, you clear the output bit, returning electrons (or a chain displacement) to the power supply. If you can instead repurpose that energy, you'll have to supply a lot less energy since you'll dissipate less. That would be reversible or adiabatic computing [2]. I have to note that processors these days are mostly power-limited, trying not to melt themselves as the energy flux inside a chip approaches that of a nuclear reactor. Just look at modern sockets and count the pins dedicated to power supply![3]
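For scale, the Landauer bound itself is a one-liner (quick sketch; the femtojoule-per-switch figure for conventional logic is a loose assumption of mine for comparison):

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300              # room temperature, K

    e_landauer = k_B * T * math.log(2)   # minimum energy to erase one bit
    e_cmos = 1e-15                       # ~1 fJ per switch, assumed

    print(f"Landauer limit at {T} K: {e_landauer:.2e} J/bit")
    print(f"assumed CMOS switch: {e_cmos:.0e} J ({e_cmos / e_landauer:.0f}x the limit)")

Conventional logic erases bits on essentially every operation and pays orders of magnitude more than the bound each time; reversible logic only pays the Landauer cost for the bits it deliberately discards.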
Building at the molecular scale you can achieve extremely low friction coefficients in the moving parts. Inertia also gets extremely low, and material strengths tend toward their theoretical values.
Of course electronics aren't standing still, but resistance tends to get harder to deal with as feature sizes decrease.
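Just how low does inertia get? An illustrative calculation (the rod dimensions and speed are numbers I made up in the spirit of rod logic, not from any actual design):

    # Inertia of a nanoscale diamond logic rod
    rho_diamond = 3_510            # kg/m^3
    volume = 1e-9 * 1e-9 * 10e-9   # assumed 1 nm x 1 nm x 10 nm rod
    mass = rho_diamond * volume
    v = 100.0                      # m/s, assumed rod speed

    ke = 0.5 * mass * v**2         # energy to get it moving
    print(f"rod mass: {mass:.1e} kg, kinetic energy at {v} m/s: {ke:.1e} J")

Roughly 1e-19 J to start and stop a rod, much of it recoverable in a reversible design, versus the femtojoule-scale losses of a conventional transistor switch.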
What I've always wondered is whether very tiny molecular mechanisms would run into problems with "accidental welding", since a part could be permanently destroyed by a few molecular bonds forming or breaking, and (IMHO - this is my guess/assumption) such events would be likely at e.g. room temperature.
Unless designed well, yes. Parts that move relative to each other need to be designed so that unwanted bonds are unlikely to form. This generally means designing them so that unwanted bonds are less energetically favorable than the bonds they start out with. Of course, as temperature rises, the chance of breaking existing bonds rises, as does the chance of forming new unwanted bonds.
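The temperature dependence there is the usual Arrhenius exponential; a toy sketch (the attempt frequency and barrier heights are placeholder values, not measured data):

    import math

    k_B = 8.617e-5   # Boltzmann constant, eV/K

    def event_rate(barrier_ev, temp_k, attempt_hz=1e13):
        """Arrhenius estimate: how often thermal fluctuations cross a barrier."""
        return attempt_hz * math.exp(-barrier_ev / (k_B * temp_k))

    for barrier in (0.5, 1.0, 2.0):   # eV, assumed barrier heights
        print(f"{barrier} eV barrier at 300 K: ~{event_rate(barrier, 300):.1e} events/s")

Each extra eV of barrier height buys roughly 17 orders of magnitude fewer accidental bond events at room temperature, which is the quantitative version of "design so unwanted bonds are energetically unfavorable".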
You can hold a mechanical calculator in your hand, so I imagine that if an industry's worth of effort went into perfecting mechanical computation, it could get quite small: https://en.wikipedia.org/wiki/Curta