From the article: But, increasingly, I feel less and less like an outsider. It’s not me. It’s people living for and by advertising who are the outsiders. They are the ones destroying everything they touch, including the planet. They are the sick psychos and I don’t want them in my life anymore.
My thoughts exactly. Like the author, I don't have patience for glossy websites anymore. If a site needs third-party JS to be usable, then I'm gone. I won't put up with 15 MB downloads just to read some content (Twitter included). The web is not so much splitting in two; we are simply saying "no more" to the exploiters and grifters. Eventually, they will have to learn to eat with the commoners or eat by themselves.
I’m moving to Firefox as my main browser, and the number of sites that break because I block JavaScript (with uBlock) is staggering. I can understand an SPA being JS-only, but simple blogs and other content sites? No.
Using NoScript has been pretty eye-opening. I've come to learn that most of the time I'm happy to unblock the primary domain. If the site doesn't work, I'll look for a clearly named CDN. If I can't identify one or, worse yet, there is some kind of cdn.some-domain.com and unblocking that still doesn't make the site work, then I'm out.
I've stopped using NoScript because it was breaking my own apps. It was fine until Mozilla changed something in Firefox. The problem is that NoScript inserts a lot of JS into the web page, and some of that JS gets broken by some sort of security lockout. It may be a bug in Firefox, and maybe it's been fixed, but I wasn't satisfied with NoScript anyway. I'm now using uMatrix. It doesn't insert JS on every active element (just at the top), so it avoids the buggy behaviour. The main thing I like about it is that it only blocks cross-origin scripts by default (NoScript blocks everything by default, including same-site).
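For anyone curious, uMatrix is driven by plain-text rules of the form "source destination type action". A sketch of a default-deny ruleset along the lines described above (treat the exact lines as an assumption; uMatrix's own "My rules" pane shows the real defaults):

```
* * * block
* * css allow
* * image allow
* 1st-party * allow
* 1st-party frame allow
```

The net effect is that first-party scripts and frames run, while anything cross-origin stays blocked until you allow it explicitly.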
I do not understand why this injecting JS to block JS is a thing. Why is there no way to just tell the browser engine not to execute any JS here? If the functionality isn't there in the browser engine, why not?
I use two browsers: one with JS disabled and blocking everything (like 95% of the time), and a second one for, say, "important" stuff like banking, YouTube and Facebook ;)
Good to know! I have seen how it injects JS into every page, but so far it has not messed with my development. I do use uBlock, though, so I'll look into using it as a NoScript replacement, as the fewer plugins the better. Thanks!
I also use my hosts file for stuff that will never, ever, ever get a pass, like Google Analytics.
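For anyone who hasn't tried this: the hosts file maps hostnames to addresses before DNS is consulted, so pointing a tracker's hostname at a non-routable address kills it system-wide, for every browser and app. A minimal sketch (the exact hostnames are assumptions; trackers tend to use several):

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
0.0.0.0 google-analytics.com
0.0.0.0 www.google-analytics.com
0.0.0.0 ssl.google-analytics.com
```

Using 0.0.0.0 rather than 127.0.0.1 avoids waiting for a connection attempt to a local server to fail.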
You clearly haven't seen the sites that demand JavaShit just to load JPGs and PNGs that you told the browser to open directly. As in pointing the browser at a JPG or PNG and getting redirected to a page with JavaShit to load that JPG or PNG.
Why? Fuck if I know, I don't live in the same universe as whoever wrote and published that literal garbage.
I recently got a CD-ROM from a hospital containing X-rays of my leg to share with my doctor. Upon finding a computer with a functioning optical drive, I was dismayed to discover that the CD-ROM contained not 3 simple images of the bones in my leg, but almost 400 megabytes of impenetrable binary data in over 30 nested directories, plus an .exe called MedicalImageViewer which I was unable to run on the latest version of Windows in my possession (Win7) due to missing Visual Basic-related DLLs.
I was eventually able to run the app on a friend's computer, but finding no built-in option to export the images, I was forced to take screen grabs to email to my primary care provider.
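Those discs usually hold standard DICOM files under the proprietary viewer, so you can often skip the .exe entirely. A sketch of a batch conversion to PNG, assuming the third-party pydicom and Pillow libraries are installed; the "DICOM" directory name is a guess at the disc layout, and files on patient discs frequently have no extension, so the code just tries everything:

```python
from pathlib import Path

import numpy as np


def to_8bit(pixels: np.ndarray) -> np.ndarray:
    """Rescale raw DICOM pixel data (often 12/16-bit) into 0-255."""
    lo, hi = pixels.min(), pixels.max()
    if hi == lo:
        return np.zeros_like(pixels, dtype=np.uint8)
    scaled = (pixels.astype(float) - lo) / (hi - lo) * 255.0
    return scaled.astype(np.uint8)


def convert_all(src: Path, dst: Path) -> None:
    """Try to parse every file under src as DICOM; save slices as PNGs."""
    import pydicom  # third-party: pip install pydicom pillow
    from PIL import Image

    dst.mkdir(exist_ok=True)
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        try:
            ds = pydicom.dcmread(path)
            img = Image.fromarray(to_8bit(ds.pixel_array))
        except Exception:
            continue  # not a DICOM image file; skip it
        img.save(dst / f"{path.name}.png")


if __name__ == "__main__":
    src = Path("DICOM")  # hypothetical directory copied from the disc
    if src.is_dir():
        convert_all(src, Path("exported"))
```

No guarantee about any particular vendor's disc, but it has the advantage of not executing anything from the CD-ROM.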
Linux has medical image viewers. You can get the Ubuntu MATE 22.04 LTS image and install it.
Then open Synaptic, the "advanced graphical package manager", search for "DICOM", and install a good viewer.
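If you prefer the terminal, the same thing is a couple of commands. The package names below are assumptions (availability varies by release), so search first:

```
apt search dicom             # list DICOM-related packages
sudo apt install ginkgocadx  # one commonly packaged DICOM viewer
```

Whatever you pick, it reads the standard DICOM files off the disc directly, without touching the bundled .exe.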
Guess what your doctor does to look at your X-rays: the chance is high that he'll just execute the .exe file on the CD-ROM too. I've witnessed that multiple times already and always cringe because of the obvious security implications. It'd be so straightforward to compromise a doctor's office by handing them a different CD-ROM with malware instead.
The one I worked for also included a viewer on discs burned for distribution (typic'ly to patients, but could go elsewhere if patient signed the right document), but we used a couple different viewers internally. If we received a disc, that DICOM data was imported using our viewers, never whatever was provided on the disc. Local network traffic was…closely monitored.
Occasionally, a remote doctor (not from our office) would call for help with the viewer we provided on disc. Usually, because some advanced feature they could have used at their office didn't exist in the patient viewer, or worked differently.
Normally, instead of discs, we just transceived images via PACS, or accepted physical films to be scanned into our PACS.
A lot of paywalled websites won't throw up a nag screen and will let you read the whole article if you turn off javascript. You won't see the pictures (probably because they want to discourage people from turning off JS) but that's fine with me.
But I recently installed DuckDuckGo on my Android phone and turned on their App Tracking Protection. The number of unnecessary requests made by apps I haven't used in months is jaw-dropping.
> I won't put up with 15mb downloads just to read some content
Meanwhile my normie friends are sending me TikTok videos of people reading screenshots of Reddit comments posted to Instagram. Regular users absolutely do not care about any of this.
That's a them problem. 30 years ago we knew they'd be trouble if they ever adopted the web en masse and they (and the folks that prey on them) have proven that prediction right at every turn.
Once it is accepted that "normal people" need to ingest advertising ad nauseam and have their every movement tracked, people who actually value their privacy and a content-oriented web become the abnormal ones.
The author mentions the dark web, and this is exactly what is happening. If you value your privacy, you are now associated with criminality and terrorism.
“All members contacted adopted a clandestine behaviour, with increased security of means of communication (encrypted applications, Tails operating system, TOR protocol enabling anonymous browsing on the Internet and public wifi)”.
“All members of this group were particularly suspicious, only communicating with each other using encrypted applications, in particular Signal, and encrypting their computers and devices […].
"The elements of the investigation that have been communicated to us are staggering. Here are just some of the practices that are being misused as evidence of terrorist behavior:
– the use of applications such as Signal, WhatsApp, Wire, Silence or ProtonMail to encrypt communications;
– using Internet privacy tools such as VPN, Tor or Tails;
– protecting ourselves against the exploitation of our personal data by GAFAM via services such as /e/OS, LineageOS, F-Droid ;
– encrypting digital media;
– organizing and participating in digital hygiene training sessions;
– simple possession of technical documentation."
Now, using encryption and secure messaging with anything but WhatsApp and iMessage marks you as a potential terrorist; later it will be ad blockers, disabled JavaScript, not using Chrome's WEI, and using the smol net/web.
I couldn't really process your comment because you rounded up to 30. It got me thinking: there was such a huge gulf between 1993 and 1996, at least for me. 1993: derping around in DOS on a 386; 1996: Windows 95, Quake, dialup, the Web. The pace of change might seem rapid these days, but the 90s were insanely fast.
Maybe it's just my perception of things, but the changes back then were exciting and usually made things better for users. Today's changes make my user experience worse, with spyware on top. I'm exaggerating a bit, but you get my drift.
There's nothing unapproachable about a 100% static site with some server-rendered navigation elements, much like livejournal was in the early 2000s. Much like HN is today.
Perfectly 'approachable' even on mobile devices.
The only thing that makes sites like this and that 'unapproachable' is stunningly high (genuine, not stat-massaged) levels of functional illiteracy.
Part of the promise of the web is democratized expression. So, not just approachable for people accessing content, but for people wanting to share. From their phone or tablet or XR device, even, since those are the computers people spend most of their time with. IME, "deploying" (I hate that word, we need another) from one of those devices is not indicative of functional illiteracy, but the lack of an approachable solution. Even just putting the instructions on a simple webpage instead of in a github readme would be a good first step.
Also, unless you play the stupid SEO games, your approachable site is never going to rank on Google. And as we know, the normies won't even scroll the search page, let alone go to page 2 (or use a better search engine). So they'll only get whatever scammy, ridiculous clickbait the SEO agencies have put together for the corporate sites so they can rank for that keyword (and not even a relevant keyword; ranking for any keyword is "good" now).
For people who are accustomed to clicking "new post" there's definitely something unapproachable about mediating the relationship between your DNS registrar and your hosting provider.
I'm just saying that if it's gonna be a techies web and a non-techies web we ought to recognize that some of us tend toward toxic elitism and work to counteract it.
As much as I enjoy being a Linux snob and a person with Stack Overflow reputation and a pedantic nitpicker on Wikipedia, I don't think that those parts of me, left unchecked at scale, make for a nice community. I want other kinds of specialists around and I don't want them to feel like second class citizens because they don't know their way around a private key (or, if that's unavoidable, I want to have at least tried to teach them).
This is an honest question, please treat it as such. What is toxic about the notion that some things just aren't for everyone and that's probably ok? Is there a moral or ethical requirement to pitch to the lowest common denominator and if so why?
There's nothing toxic about acknowledging that some things are not for everyone.
I'm specifically interested in the ability to share your ideas with your peers in a way that resists tampering by third parties, provided you're willing to put in a reasonable amount of work re: learning to do so.
If we let that be the mark of a privileged class, then we might as well go back to the middle ages.
They're victims of capitalism gone-astray, not traitors. If we abandon them we'll only have each other to talk to. I don't like us enough to do that.
Besides, we have to deal with the consequences of their votes, so it's in our interests to oppose whoever would manipulate them because those manipulators are threats to things that we care about. Things like privacy, and the ability to control the devices that we "own" in ways that contradict the vendor's wishes.
When I started to get into computers, and later the internet, my normie friends thought of that stuff as utterly pointless. Sure, they were sure that there was some use for computers in big businesses, to replace the typewriters and make life easier for accountants, but everything else? Nah, nothing interesting for the wider populace, just nerd toys.
The internet? Why should anyone want to write an email if you could simply call somebody? Why would I want to get in contact with some random person on the other side of the globe? Why order something online? Is the "Quelle Katalog" (something akin to the Sears catalog) not much better?
Well... nearly all of the normie friends who made such statements are now happily online, albeit nearly entirely on the corpo-web. My take on this: let them. Let them sell their lives to the corporations, but give them the opportunity to break free and learn of the "other web" if they show interest. Freedom is also the freedom to harm yourself.
The corpo-web, I really like that term, it perfectly encapsulates the soullessness of the modern web. Is that term from something or did you come up with it yourself?
To be honest: I have no idea. It may be that I read it in some cyberpunk story or in some gopherhole... but I've been using it for the last couple of years, so who knows? ;-)
Get Lagrange on PC/Android, head to gemini://gemi.dev, go to The News Waffle, and paste the whole URL (whole, as in starting with https://www...) into the URL input form.
It works for individual articles and for news home pages. It detects RSS feeds too, putting that option first on the rendered page.
It can cut most pages down to 5% of their original size or less.
More than not having patience for the glossy websites anymore, I find myself trusting them less too. If I visit a company website and it's covered in pictures, big shiny buttons, flashy menus, and those damn "helper" bots, there's a solid chance it hasn't been updated in a few years and won't have any useful information.
I just want a unified reader-mode experience for content-oriented sites. I wouldn't mind some sort of AI constantly interpreting browser content in the background and spitting out a basic website in the foreground, even if it involves wasteful amounts of computation. The way things are going, that's probably how to ensure ad blocking in the future.
I am a longtime Firefox faithful, but I keep extension usage down to a minimum: Multi-Account Containers, ad block for YouTube, LeechBlock, ClearURLs, Zotero and FPL tools (fantasy football) right now. Tbh, Firefox rarely bothers me nowadays with resource usage. I am confident I haven't had to restart Firefox since last November.
My work laptop is a Lenovo P52 with 32GB of RAM and Win10. My home laptop is a Lenovo Legion with 16GB of RAM and Win11. I had so many performance issues in the past that I trained myself to never keep resource-hogging stuff running in the background that I am not actively using.
I regularly have Zoom and Firefox running while playing MMO games on the home laptop, and Firefox, SQL Server etc. running in the background while doing multi-core data analysis on the work laptop. I never have issues with Firefox, and if there's a memory crash I usually know what caused it. But it comes down to hardcore discipline using the machines, and I understand these are not usual patterns for others.
Using Pale Moon with 110 extensions (the original powerful XUL kind, not the lame Web Extensions that modern Firefox copied from Chrome), memory usage rarely crosses a gigabyte on a 12 GB Linux laptop from 2015 with 11 tabs open; currently at 800 MB.
Meanwhile, Floorp - a rebuild of contemporary Firefox - with 9 extensions and 4 open tabs - takes more than double that.