Hacker News | jshaqaw's comments

This is interesting but also just hilarious at a meta level. I was a “low frequency”, i.e. manual, fundamentals-based hedge fund investor for many years. In general I think HFT is a net benefit to liquidity when done in compliance with the text and spirit of regulations. But no real-world allocation of resources is improved by having to game transactions at this level of time granularity. This is just society pouring resources down a zero-sum black hole. Open to hearing contrary views, of course.


I've been wondering if the stock market would be more efficient if trades executed only every <small time interval> instead of continuously, i.e. every second an opening-cross-style clearance of the whole book happens. Orders would have to be on the book for a full interval to execute, to prevent last-millisecond rushes at the end of an interval.

I'm probably missing some second-order effects, but it feels like this would mitigate the need for race-to-the-bottom latencies and would also provide protection against fat-fingered executions, in that every trading algorithm would have a full second to arbitrage them.
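For concreteness, a minimal Python sketch of the interval cross I have in mind (the Order fields, the one-interval resting rule, and the midpoint pricing rule are my own placeholder assumptions, not any real exchange's matching logic):

    from dataclasses import dataclass

    @dataclass
    class Order:
        price: float
        qty: int
        placed_at: int   # index of the interval in which the order arrived

    def clear_interval(bids, asks, now):
        """One batch cross; only orders resting a full interval are eligible."""
        eb = sorted((o for o in bids if o.placed_at < now), key=lambda o: -o.price)
        ea = sorted((o for o in asks if o.placed_at < now), key=lambda o: o.price)
        matched, i, j = 0, 0, 0
        last_bid = last_ask = None
        while i < len(eb) and j < len(ea) and eb[i].price >= ea[j].price:
            q = min(eb[i].qty, ea[j].qty)   # fill as much as both sides allow
            matched += q
            last_bid, last_ask = eb[i].price, ea[j].price
            eb[i].qty -= q
            ea[j].qty -= q
            if eb[i].qty == 0:
                i += 1
            if ea[j].qty == 0:
                j += 1
        if matched == 0:
            return None
        # One uniform clearing price for the whole interval: the midpoint of the
        # marginal (last-matched) bid and ask.
        return (last_bid + last_ask) / 2, matched

Anything that arrives during the current interval just rests until the next cross, so there's nothing left to win by shaving the last microsecond.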


You could do this but the cost would be wider bid/ask spreads for all market participants. If you make it harder for market makers to hedge their position, they will collect a larger spread to account for that. A whole lot of liquidity can disappear in a second when news hits.

I’d rather have penny-wide spreads on SPY than restrict trading speed for HFTs. Providing liquidity is beneficial to everyone, even if insane amounts of money are spent by HFTs to gain an edge.


It's really around binary events that they should throttle execution and do batch orders.

The bad part of HFT is paying the smartest young minds this country has to offer to figure out how to parse GDP data as fast as computationally possible so they can send in an order before other players can. That's a dumb game that doesn't provide much benefit (besides speed in sparse critical moments adding a few % to the fund's ROI).

They can arbitrage all day, but don't let them buy every Taylor Swift concert ticket the moment it goes on sale because they have a co-located office with a direct fiber line, ASIC-filled servers, and API access.


I'd be interested to see real numbers on the societal value of the marginal added liquidity versus the aggregate spend on the zero-sum arms race.

I have also seen enough to be quite sure that many HFT strategies are quite predatory toward normie investors.

Again, I’m not a zealot. I trade stuff. I love liquidity. I’m happy to pay someone some fraction of a penny when I change my mind. Service provided. But the returns from vanilla liquidity provision commoditized long ago to uninteresting margins. That leaves a lot more of the HFT alpha pool in the predatory strategies, and capital flows where the incentives are.


this is exactly what many dark pools do

"continuous periodic auctions"


Turbo Pascal was such an amazing product for its time. You young ones may not realize how horrible life was before Borland nailed IDEs - especially on home systems, which didn’t exactly have advanced terminal capabilities.


My first neural net code (very, very bad) was on a 286/287 back in the early 90s. The 286 is kind of a forgotten chip since the 386's 32-bit mode changed the game, but give a kid (i.e. teenage me) a 286, a 40MB hard drive, and Turbo Pascal, and I felt like I could build anything!


Did you run TP in the '87 mode?


I love the spirit of this so believe me I’m not here to denigrate it. Just sharing an anecdote of building one of those “retro style” Linux/Pi kits with my kids some years ago. I thought having a bare-metal-style machine would get them entranced by computers like it did in the early 80s. But you can’t roll back the clock. Kids today live saturated in a computerized and digital world no matter how much we may try to shelter them from it. When I saw a bare DOS prompt at age 7 it felt like all the magic in the world at my fingertips. But not for my kids. What once was science fiction to me is no more amazing to them than the fact that we have 24-hour indoor electric lighting at the touch of a switch. It’s just a different world today.


Creator here!

You're absolutely right! I wasn't there either, and I wouldn't want a terminal-only machine.

My goal is providing means and knowledge to those that want it. The Ashet is meant more for technical schools, apprenticeships, and "the interested", and definitely requires an initial spark that makes the project interesting to you.

It's here to fill a (perceived) gap and wants to sit more in the professional-education space than the "get the kids into computing" one.

I'd love to have had such a hands-on device in school instead of the bare theory of circuit operation and basics of programming.


Retro lisp machines are cool. Kudos to the team. Love it.

That said… we need the “lisp machine” of the future more than we need a recreation.


> we need the “lisp machine” of the future

Totally agree.

Here's my idea: stick a bunch of NVRAM DIMMs into a big server box, along with some ordinary SDRAM. So you get a machine where, say, the first 16GB of RAM is ordinary RAM, and the 512GB or 1TB of RAM above that in the memory map is persistent RAM. It keeps its contents when the machine is shut off.

That is it. No drives at all. No SSD. All its storage is directly in the CPU memory map.

Modify Interim or Mezzano to boot off a USB key into RAM and store a resume image in the PMEM part of the memory map, so you can suspend, turn off the power, and resume where you were when the power comes back.

https://github.com/froggey/Mezzano

https://github.com/mntmn/interim

Now try to crowbar SBCL into this, and as many libraries and frameworks as can be sucked in. All of Medley/Interlisp, and some kind of converter so SBCL can run Interlisp.

You now have an x86-64 LispM, with a whole new architectural model: no files, no disks, no filesystem. It's all just RAM. Workspace at the bottom, disposable. OS and apps higher up where it's nonvolatile.
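You can already get a feel for that model on stock Linux: an NVDIMM exposed through a DAX filesystem can be mmap'd straight into the address space, so "storage" is just memory you store into. A rough Python sketch (the /mnt/pmem mount point and file name are assumptions, and I'm glossing over the cache-flush details that real persistence guarantees need):

    import mmap
    import os

    # Assumption: an NVDIMM-backed filesystem mounted with -o dax at /mnt/pmem.
    PATH = "/mnt/pmem/resume.img"   # hypothetical resume-image file
    SIZE = 64 * 1024 * 1024         # 64 MiB region, for illustration only

    fd = os.open(PATH, os.O_CREAT | os.O_RDWR, 0o600)
    os.ftruncate(fd, SIZE)

    # With DAX, loads and stores go to the persistent media directly; there is
    # no page cache or block layer in between.
    region = mmap.mmap(fd, SIZE, flags=mmap.MAP_SHARED,
                       prot=mmap.PROT_READ | mmap.PROT_WRITE)

    region[0:6] = b"LISPM\0"        # e.g. a tag at the base of the resume image
    region.flush()                  # msync: make the stores durable
    region.close()
    os.close(fd)

A LispM built this way would of course own the mapping itself rather than go through a filesystem, but the programming model (no read/write calls, just pointers into a persistent region) is the same.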

I fleshed this out a bit here:

https://archive.fosdem.org/2021/schedule/event/new_type_of_c...

And here...

https://www.theregister.com/2024/02/26/starting_over_rebooti...


What does a Lisp Machine of the future look like?

There is Mezzano [1] as well as the Interlisp project described in the linked paper and another project resurrecting the LMI software.

[1] https://github.com/froggey/Mezzano


> What does a Lisp Machine of the future look like?

Depends on what one means by that.

Dedicated hardware? I doubt that we’ll ever see that again, although of course I could be wrong.

A full OS? That’s more likely, but only just. If it had some way to run Windows, macOS or Linux programs (maybe just emulation?) then it might have a chance.

As a program? Arguably Emacs is a Lisp Machine for 2025.

Provocative question: would a modern Lisp Machine necessarily use Lisp? I think that it probably has to be a language like Lisp, Smalltalk, Forth or Tcl. It’s hard to put into words what these very different languages share that languages such as C, Java and Python lack, but I think that maybe it reduces down to elegant dynamism?


> Provocative question: would a modern Lisp Machine necessarily use Lisp?

Seeing that not even the "Original Gangster" Lisp Machine used Lisp ...

Both the Lambda and CADR are RISCy machines with very little specific to Lisp (the CADR was designed specifically to just run generic VM instructions; one cool hack on the CADR was to run PDP-10 instructions).

By Emacs you definitely mean GNU Emacs -- there are other implementations of Emacs. To most people, what the Lisp Machine was (is?) was a full operating system with editor, compiler, debugger, and very easy access to all levels of the system. Lisp wasn't really the interesting thing; Smalltalk and Oberon share the same idea.


> Dedicated hardware? I doubt that we’ll ever see that again, although of course I could be wrong.

With specialized hardware now being built for AI, the emergence of languages like Mojo that take advantage of hardware architecture, and what I interpret as a renewed interest in FPGAs, perhaps specialized hardware is making a comeback.

If I understand computing history correctly, chip manufacturers like Intel optimized their chips for C-language compilers to take advantage of the economies of scale created by C/Unix popularity. This came at the cost of killing off the Lisp/Smalltalk-specialized hardware that gave these high-level languages decent performance.

Alan Kay famously said that people who are serious about their software should make their own hardware.



Mostly dead. Current Lisp Machine shenanigans related to MIT/LMI are at https://tumbleweed.nu/lm-3 ...

Currently working on an accurate model of the MIT CADR in VHDL, and merging the various System source trees into one that should work for both the Lambda and the CADR.


> Currently working on an accurate model of the MIT CADR in VHDL

Sounds extremely interesting, any links/feeds one could follow the progress at?

The dream of running lisp on hardware made for lisp lives on, against all odds :)


Current work is at http://github.com/ams/cadr4

And of course .. https://tumbleweed.nu/lm-3 .


Maybe try replacing the ALU with one written directly in Verilog; I suspect this will run a lot faster than building it up from 74181+74182 components.


From what I see -- that is not the case.

The current state is _very_ fast in simulation, to the point where it is uninteresting (there are other things to figure out first) to write a behavioral model of the '181/'182.

~100 microcode instructions take about 0.1 seconds to run.


I was thinking more of a behavioral model of the whole ALU, just so that the FPGA tools can map it onto a collection of the smaller ALUs built into each slice.

What clock speed does your latest design synthesize at?


At the top of the readme it says "There will be no attempt at making this synthesizable (at this time)!".


There was already a design of the CADR for FPGAs [1] that does synthesize (and boot); I don't know why amszmidt needed to start again from scratch, or if his design is a modification of the earlier one.

A similar comment applies to lm-3. Maybe it is built on a fork of the previous repo, it is hard to tell.

[1] https://github.com/lisper/cpus-caddr


Smalltalk was the lisp machine of the future. Of course, now even Smalltalk is a thing of the past.


Agreed. People point to the edge cases where orthodoxy and conformity crushed the brilliant innovator, but for every one of those there are 1000 cranks or embittered dead-end researchers claiming that only a conspiracy prevented their genius from being recognized.


This sort of argument implies that Riess must have turned into a crank after his Nobel-prize-winning turn... Seems unlikely though, doesn't it?


Happens so often they call it the Nobel curse.

It’s related to why famous celebrities and billionaires often lose their minds. Human success is self-limiting. It begets failure by convincing people they are “special.” “Pride comes before a fall” holds not because God punishes people for pride or something; it’s a statement of cause and effect. Pride causes a fall.


A crisis in quality and also a crisis in innovation. How many years has it been since a software innovation that matters to end users? It’s been new emojis for a decade.


You would think these tech oligarchs might be concerned with some other problem facing the world. This is the only one they seem to care about, and much of it just seems like fragile ego.


There was a subset of us “old” 90s nerds who failed to take seriously certain elements among us who would be: 1. Enriched and empowered by having the right skill set at the right place and time to achieve fortunes and (by direct purchase) political power unrivaled since the Gilded Age, and 2. Still traumatized by not being at the cool kids' table in middle school, never emotionally progressing past being 12-year-old boys.

We didn’t need the bullies and autocrats to discover technology. They were among us the whole time. We just didn’t take them seriously.


Lots of finance people from an earlier era ended up there because of the movie Wall Street, which is hilarious since it was not intended to glorify the sector.


To be fair, Jurassic Park also doesn’t present cloning dinosaurs as a safe career path.

