RaftPeople's comments

The guy did seem to be wearing a warm coat and gloves, so it was obviously pretty cold where the video was shot, probably just below the frost line.

Central Point Software, the maker of Copy II PC, was one of our customers (we created back-office software: order processing, etc.).

It was a pretty healthy business, not just for the copy protection breaking but also the general tools software.

Funny story:

I was at their offices working on a project when they were getting ready to ship out the new version. Their warehouse was connected to the office building and they were producing all of the final copies and loading them on trucks to get sent to the distributors.

In the morning they gave the all clear for the first wave of trucks to leave; then, about 4 hours later, someone found a bug and they had to call all of the trucks back to the warehouse, unload, re-create clean product, etc.

They did this about 3 times before that version finally made it to the distributors.


So did I. We joked that BPCS stood for: Better Programs Coming Soon

It was actually a well-designed and functional system; it just had too many bugs.


Researchers also know that astrocytes are active participants in computation in some areas that have been studied (vision and memory).

The article seems about 5 to 10 years late.


Ya, RPG assumed character-based I/O, so it's probably a safe bet that they just ported stuff that ran on IBM character-based terminals and made it run in DOS. (I worked in RPG in the 80's.)


> The cells in your eyes have exactly the same DNA as the cells in your big toe

Is that true?

I know that cells in the brain have significant variability in DNA, but I'm not really aware of what non-neuronal and non-brain cells typically have.


Every cell in your body (excepting red blood cells) has a complete copy of your genome. What differs is which portions are activated.


Except in the brain (13% to 41% of neurons show variation: deletions, additions, etc.; first discovered in 2001, confirmed by the 2013 study below).

https://www.science.org/doi/10.1126/science.1243472


I liked that first one, and I hope someone creates one going back to the dinosaur age; I want to see that.


One step closer to the science-based dinosaur MMO we were promised.


Tim is awesome.

Ironically, he covered PixVerse's world model last week and it came close to your ask: https://youtu.be/SAjKSRRJstQ?si=dqybCnaPvMmhpOnV&t=371

(Earlier in the video it shows him live prompting.)

World models are popping up everywhere, from almost every frontier lab.


> Kent Beck, credited with coining the term "unit test"

The term "unit test" has been used since the 1960's (if not earlier).


Definitely not. The Art of Software Testing wrote about "module testing" in 1979, which was changed to "unit testing" in a later edition, after "unit test" was already part of the popular lexicon. Perhaps that is what you are thinking of?

Beck is clear that he "rediscovered" the idea. People most certainly understood the importance of tests being isolated from each other in the 1960s. Is that, maybe, what you mean?


Link to an ACM paper from 1969:

https://dl.acm.org/doi/epdf/10.1145/800195.805951

Quote:

"Unit test:

testing, outside of thesystem, of a part of the system thatmay have less than a complete function.

Component test:

testing, inside of the system, of parts that have been successfully unit tested.

Integration test: testing of new components to ensure that the system is growing functionally; in addition, retesting of successive system builds to ensure that they do not cause re-gression in the capabilities of the system.

Regression test: testing of the system for unpredictable system failure.

Scaffolding:

coding that replaces not - yet - completed functions..."

I was taught about this in the 80's by a woman who had worked at a company that was more formal about software dev than we were at the small company I worked for.


> you're trying to twist reality to fit a premise that is just impossible to make true: that estimates of how long it takes to build software are reliable.

It's not binary, it's a continuum.

With experience, it's possible to identify whether the new project or set of tasks is very similar to work done previously (possibly many times) or if it has substantial new territory with many unknowns.

The more similarity to past work, the higher the chance that reasonably accurate estimates can be created. More tasks in new territory increases unknowns and decreases estimate accuracy. Some people work in areas where new projects frequently are similar to previous projects, some people work in areas where that is not the case. I've worked in both.

Paying close attention to the patterns over the years and decades helps to improve the mapping of situation to estimate.


Yes, but where reliability is concerned, a continuum is a problem. You can't say with any certainty where any given thing is on the continuum, or even define its bounds.

This is exactly what makes estimates categorically unreliable. The ones that aren't accurate will surprise you and mess things up.

In that sense, it does compress to being binary. To have a whole organisation work on the premise that estimates are reliable, they all have to be, at least within some pretty tight error bound (a small number of inaccuracies can be absorbed, but at some point the premise becomes de facto negated by inaccuracies).


> completely missing some elegant and profound beauty.

Requires some dynamic SQL to construct, but the beauty is that you can use the SQL engine for this solution:

    select top 1 *
    from (select
              t1.id as id1, t2.id as id2, ..., tn.id as idn,
              sum(t1.cost + t2.cost + ... + tn.cost) as total_cost
          from join_options t1
          cross join join_options t2
          ...
          cross join join_options tn
          group by t1.id, t2.id, ..., tn.id) t0
    order by t0.total_cost
