Hacker News | hotstickyballs's comments

That directly goes against the earlier post where you said you lived in a particularly nice part of town

London has a very high ratio of extremely nice houses on a road opposite council houses, or former council houses. There can often be a very large mix of housing in one area.

FWIW ancient Rome was also like this, and for example in Pompeii you can find extremely fancy houses with frescoed dining rooms right next door to single room hovels. They didn't have subways or mobile phones though.

It seems to me to be quite the feature ... well, everywhere except the USA. Certainly all over Europe, one finds this mix. In the USA, you generally only find cheap/low quality/small housing stock adjacent to expensive/high quality/large housing stock where there's some municipal or other border, and the two just can't avoid being where they are.

There are a good number of places in the world where people of varied incomes live relatively close.

London tends to get that because it has never really been planned. It just grew over the course of 1600 years and absorbed other areas as it went. There are plenty of areas where a row of £20m+ homes are opposite blocks of 3-bed flats that go for a hundredth of that price.

Hundreds of years ago, before the rail or underground network, you still needed plenty of working-class people to live near where the rich people lived, as the rich people still needed shops, servants, etc.

Having the city split into individual boroughs means that each borough has to provide for the full economic spectrum. The really expensive boroughs still have plenty of social housing, and arbitrary divisions of land mean that things from different boroughs butt up against each other.

However, new developments don't always get it right: when big green-field or brown-field sites are converted to residential use, they often struggle to get the mix right, and you end up with bigger areas that only cater for a subset.

National planning laws are also circumvented or gamed. If a new site requires a certain percentage of "affordable housing", the developers will often agree (with the local borough/council) to roll that over with another couple of projects and then build most of the "affordable housing" all in one place, and the diversity of individual areas is diminished.

As you say, there are plenty of other places in the world where this is the case, most of them in countries/cities that have existed for hundreds or thousands of years.


> London tends to get that because it has never really been planned.

Not particularly true.

Go to a city in the world that REALLY has "never been planned", then come back and tell me London hasn't. ;)


If you're gonna do the No True Scotsman thing, the least you could do is give an example of a "REALLY" unplanned city.

Favelas

I am not sure why you would want that however.

A better society.

Imagine living somewhere that people who work service or retail jobs (or nursing or teaching or all manner of underpaid but essential professions) can also afford to live!


Even people with highly paid jobs living in nice areas commute to work.

Because it means that you don't get areas of extremes. (well not as much.)

It also means that local services can't be compartmentalised so that only rich people get decent services.

For example, Southwark uses the same police force to cover the South Bank (cultural centre) and the £5m apartment blocks, as well as the shithole council estates (well, they aren't shitholes anymore).

TLDR: you don't get no-go areas.


TVs tend to incessantly ask for internet access, especially Android ones.


Then don’t buy an Android TV?


The problem with 'well just don't buy it' is that in many product categories, enshittification has become so entrenched that there are no longer options to avoid it. The availability of product features is driven by market forces: if it's no longer profitable to sell a TV that doesn't require online connectivity for the purposes of ads, then such TVs will no longer be sold.

Alternatives like using monitors designed for digital signage come with drawbacks: they're expensive, and they lack desirable features like VRR, HDR, or high refresh rates, since those aren't needed for signage use cases. Older TV models will break and supply will dry up.

In the long term, this problem, not just TVs but the commercial exploitation of user data across virtually all electronic devices sold, isn't something that can be solved with a boycott, or by consumers buying more selectively. The practice needs to be killed with legislation.


Good point. I’ll just argue about HDR and high frame rates being desirable features :) I don’t even know what VRR is.


VRR is Variable Refresh Rate: if there is nothing going on in the content, the display can bring the refresh rate down and save processing, heat, and energy. If there is a lot going on (say a game), it can ramp the refresh rate back up super high.

There are a few different "standards" around VRR, and not every device supports all of them.


Meh, I wonder why I care about saving energy or processing on a TV that’s plugged in anyway, but hey. Thanks for explaining!


Their explanation of the reason for VRR is bad. The primary reason people want it is gaming, where the game is not locked to a specific frame rate. Without VRR, the timing of a frame being delivered isn't necessarily going to match when the display is expecting a new frame. This leads to one of two effects: either the display is forced to hold an old frame for longer and pick up the new frame on the next refresh cycle, which creates stutter, or the display switches which frame it's using partway through the refresh cycle, which creates a visual tear in the image.
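
For what it's worth, here's a toy Python sketch of that timing mismatch (purely illustrative; the frame timings are made up and this isn't tied to any real display API). A fixed-refresh display has to hold a late frame until the next refresh boundary, so the frame-to-frame gaps alternate between short and long, which is the stutter; a VRR display simply refreshes whenever the frame is ready:

    # Toy model: when does each rendered frame actually appear on screen?

    def fixed_refresh(frame_ready_times, refresh_hz=60):
        """Each frame is shown at the first refresh boundary after it is ready."""
        period = 1.0 / refresh_hz
        shown, next_vblank = [], 0.0
        for t in frame_ready_times:
            while next_vblank < t:        # frame missed this vblank, wait for the next
                next_vblank += period
            shown.append(next_vblank)     # held until here -> uneven gaps = stutter
        return shown

    def vrr(frame_ready_times):
        """With VRR the display refreshes as soon as each frame is ready."""
        return list(frame_ready_times)

    if __name__ == "__main__":
        # a game rendering at a slightly uneven ~50 fps (times in seconds)
        frames = [0.000, 0.019, 0.041, 0.060, 0.084, 0.101]
        for label, shown in [("fixed 60 Hz", fixed_refresh(frames)), ("VRR", vrr(frames))]:
            gaps = [round((b - a) * 1000, 1) for a, b in zip(shown, shown[1:])]
            print(f"{label:>11}: frame-to-frame gaps (ms) = {gaps}")

(The tearing case is what you get if the display doesn't wait and instead swaps frames mid-scan.)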


Turns out the problem with mobile was Windows all along!


Personally, it felt quite natural once I started to work on real software projects.


I have never felt that way, and I’ve worked on a variety of projects at a variety of companies.

Everyone has a bespoke mishmash of nonsense pipelines, build tools, sidecars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.

There are enough interesting things here that you wouldn’t even need to make a tool-heavy, project-style software engineering course; you could legitimately make a real-life computer science course that studies the algorithms and patterns used.


Merging strategies, conflict resolution, bisect debugging, and version control in general are very computer sciencey.

Would make a great course.
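
To make the "computer sciencey" point concrete, here's a minimal Python sketch of the binary-search idea behind bisect debugging (illustrative only: it assumes a linear history and a monotonic regression, and is_broken is a hypothetical test you would supply, e.g. "does the build fail?"):

    def bisect(commits, is_broken):
        """Return the first commit for which is_broken(commit) is True.

        Assumes commits[0] is good, commits[-1] is broken, and that once a
        commit is broken every later commit is broken too (a regression).
        """
        lo, hi = 0, len(commits) - 1      # lo: known good, hi: known bad
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if is_broken(commits[mid]):
                hi = mid                  # culprit is at mid or earlier
            else:
                lo = mid                  # culprit is after mid
        return commits[hi]

    if __name__ == "__main__":
        history = [f"commit-{i}" for i in range(16)]
        culprit = bisect(history, lambda c: int(c.split("-")[1]) >= 11)
        print(culprit)  # commit-11, found in ~log2(16) tests instead of 16

Roughly O(log n) test runs instead of O(n), which is exactly the kind of analysis a course could build on.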


That’s not even getting into the fact that you could basically teach multiple graduate-level distributed computing courses with k8s as a case study.


That's the job of a (good) manager.


Main Street sells pizza but VCs sell Ponzi schemes


This!


The biggest assumption is that there is only a single path to intelligent life.


I disagree with the characterization of structural corruption. Every rational actor will seek to capture all the benefits and pass on the risks. The real corruption is when decision makers know that they can’t be held responsible through corporate or political structures. See also [moral hazard](https://en.m.wikipedia.org/wiki/Moral_hazard).


If compute power were the deciding factor in the server vs edge discussion, then we’d never have smartphones.


I’m willing to sacrifice your rights if it means that there’s less incentive to steal my phone


Why do you think you have any say over others' rights? Using that same logic, you know what? I think you're going to steal my phone. So do you mind if I sacrifice your rights and install a camera right in your room? Wouldn't want you to plot the theft of my phone now, would I?

