Thanks! I also told Aga via email in the thread where I submitted my article.
Worth noting that the HTML tag in the title was stripped from the PDF table of contents as well, so the title for that article in the contents is missing a word. No big deal, but good to know for future submissions!
Wow, this is jam-packed with interesting information. Thanks for writing it! (Also thanks for all of your other great open source work!)
Are there plans to upstream this into the Zig std library? It seems like it could be useful for more than just the cryptography package, since the benchmarks at the end show it is often faster than std's pdqsort. I just checked the issue trackers on Codeberg and GitHub and didn't see anything mentioning djbsort or NTRU Prime, which leads me to believe there aren't (official) plans to upstream this (yet).
pdqsort is a generic comparison sort. Want to sort employee names, customer email addresses, JSON blobs, or Zebras? No problem: pdqsort just needs an ordering, and in both Zig and C++ you write this as a single boolean "less" predicate.
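For anyone who hasn't used it, here's roughly what that looks like in Zig. This is a minimal sketch, assuming a recent std where std.sort.pdq takes a context plus a lessThanFn; Employee and lessByName are made-up names for illustration:

    const std = @import("std");

    const Employee = struct {
        name: []const u8,
    };

    // A single boolean "less" predicate is all pdqsort needs to order any type.
    fn lessByName(_: void, a: Employee, b: Employee) bool {
        return std.mem.lessThan(u8, a.name, b.name);
    }

    pub fn main() void {
        var employees = [_]Employee{
            .{ .name = "Zoe" },
            .{ .name = "Ada" },
            .{ .name = "Mia" },
        };
        // Generic: the element type and predicate are supplied by the caller.
        std.sort.pdq(Employee, &employees, {}, lessByName);
        for (employees) |e| std.debug.print("{s}\n", .{e.name});
    }

Swap in whatever ordering you like; pdqsort doesn't care what the elements actually are.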
DJB's speed-up relies on vectorization, which works great for integers or things you can squint at and see an integer - but obviously can't sort your employee names, customer email addresses, JSON blobs, or Zebras. You could write these branchless network designs anyway, but I'm pretty sure they'd be markedly slower, at least for some common inputs.
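To make the contrast concrete, the core move in a sorting network is a branchless compare-exchange, something like this (my own sketch, not djbsort's actual code):

    // Branchless compare-exchange: the building block of a sorting network.
    fn compareExchange(a: *u32, b: *u32) void {
        // No data-dependent branch: min/max lower to cmov or SIMD min/max,
        // which is what lets the whole network vectorize across lanes.
        const x = a.*;
        const y = b.*;
        a.* = @min(x, y);
        b.* = @max(x, y);
    }

That trick only works because u32 has a cheap min/max; there's no equivalent for an arbitrary "less" predicate over strings or JSON blobs.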
It’s different every time, but basically “marketing”. No matter where you are showing your stuff, you’re only reaching a subset of the population, and chances are HN won’t be buying your app subscription. You need to get it in front of your actual audience.
As someone who now has a small bit of skill in marketing, where would you advise me to go to learn it deeply? I think I have enough knowledge now to have developed a taste for what is actually good, but I don't know where to find a good curriculum to get started.
See also "Product Hunt". Oddly it's been about a year since I've noticed anybody who mistakes a Product Hunt launch for a marketing plan but that used to be endemic.
I consider Recursion by Blake Crouch to be similar, even though I liked Antimemetics much better. I haven't read Crouch's other books, but have heard that Dark Matter is better than Recursion, though it may be less similar to Antimemetics.
Nice fork! (I am the person who wrote the original.)
My version is still working well for me, so it's been hard to find motivation to update it. Also, I've been using my increasingly limited free time to work on some exciting new projects, rather than maintenance tasks that feel like a continuation of my day job.
All that to say, I'm excited about new repos like yours that take the idea further! I also really appreciate your attention to detail in calling out the differences from the original, and that you licensed your version under the GPL.
Hey, I wrote this! There are a couple of reasons that I included the disclosure.
The main one is to set reader expectations that any errors are entirely my own, and that I spent time reviewing the details of the work. The disclosure seemed to me a concise way to do that -- my intention was not any form of anti-AI virtue signaling.
The other reason is that I may use AI for some of my future work, and as a reader, I would prefer a disclosure about that. So I figured if I'm going to disclose using it, I might as well disclose not using it.
I linked to other thoughts on AI just in case others are interested in what I have to say. I don't stand to gain anything from what I write, and I don't even have analytics to tell me more people are viewing it.
All in all, I was just trying to be transparent, and share my work.
Your actor analogy in your other post about AI doesn't really work when it comes to using LLMs for coding, at least. LLMs are pretty good at writing working code, especially given suitable guidance. An actor wouldn't be able to fake their way through that.
That's nice to hear. For me personally, I don't really care what tools the author uses to write the article, as long as the author takes responsibility! Yes, that means I'll blame you for everything I see in the article :P
If I recall correctly, he used miniKanren along with formalized, structured data extracted from medical research. Unfortunately, his son has since passed away.
Note that this article is by the same Greg Egan who wrote Permutation City, a really good (in my opinion), deeply technical, hard science fiction novel exploring consciousness, computation, and the infinite nature of the universe.
If that sounds interesting, I recommend not reading too much about the book before starting it; there are spoilers in most synopses.
You don't necessarily need a background in programming and theoretical computer science to enjoy it. But you'll probably like it better if you already have some familiarity with computational thinking.
Funnily enough, I went into it with a background in math and was surprised by one specific claim that I couldn't quite understand. It turns out it was subtly incorrect in a way that actually adds an interesting twist to the story (Greg Egan acknowledged it). I can't quite find the web page with the discussion (ETA: found it, it's the addendum at the end of the FAQ about the book [0]) but it's about <spoilers>the Garden of Eden configuration of the automaton.</spoilers>
ETA: I realize this sounds nitpicky and stickler-y so I just want to point out that I loved the book (and Greg Egan's work in general) and figuring out the automaton stuff was genuinely some of the most fun I've had out of a book.
Note that you can link to pages in a PDF with a hash like #page=64 (for example) in the URL.
https://pagedout.institute/download/PagedOut_008.pdf#page=64