
GPS jamming for incoming drones?


>Interactive Human Simulator is a bold way to describe spinning up a few GPT calls with mood sliders, but sure, let’s call it anthropology. Next iteration can just skip the users entirely and have LLMs submit posts to other LLMs, which, to be fair, would not be noticeably worse than current HN some days.

My sides


Humans with certain amnestic syndromes are incapable of learning. That doesn't make them unintelligent or incapable of thought.


>If anything the agentic wave is showing that the chat interfaces are better off hidden behind stricter user interface paradigms.

I'm not sure that claim is justified. The primary agentic use case today is code generation, and the target demographic is used to IDEs/code editors.

While that's probably a good chunk of total token usage, it's not representative of the average user's needs or desires. I strongly doubt that the chat interface would have become so ubiquitous if it didn't have merit.

Even for more general agentic use, a chat interface allows the user the convenience of typing or dictating messages. And it's trivially bundled with audio-to-audio or video-to-video, the former already being common.

I expect that even in the future, if/when richer modalities become standard (and the models can produce video in real-time), most people will be consuming their outputs as text. It's simply more convenient for most use-cases.


Having already seen this explored late '24, what ends up happening is that the end user generates apps that have lots of jank, quirks, and logical errors that they lack the ability to troubleshoot or resolve. Like the fast forward button corrupting their settings config, the cloud sync feature causing 100% CPU load, icons gradually drifting away from their original positions on each window resize event, or the GUI tutorial activating every time they switch views in the app. Even worse, because their app is the only one of its kind, there is no other human to turn to for advice.


Hopefully, people and technology aren't stuck in late '24.


I found it genuinely impressive how useless their "GPTs" were.

Of course, part of it was due to the fact that the out-of-the-box models became so competent that there was no need for a customized model, especially when customization boiled down to barely more than some kind of custom system prompt and hidden instructions. I get the impression that's the same reason their fine-tuning services never took off either, since it was easier to just load necessary information into the context window of a standard instance.

Edit: In all fairness, this was before most tool use, connectors or MCP. I am at least open to the idea that these might allow for a reasonable value add, but I'm still skeptical.


    > I get the impression that's the same reason their fine-tuning services never took off either
Also, very few workloads that you'd want to use AI for are prime cases for fine-tuning. We had some cases where we used fine tuning because the work was repetitive enough that FT provided benefits in terms of speed and accuracy, but it was a very limited set of workloads.


> fine tuning because the work was repetitive enough that FT provided benefits in terms of speed and accuracy,

Can you share any more info on this? I'm curious about what the use case was and how it improved speed (of inference?) and accuracy.


Very typical e-commerce use cases processing scraped content: product categorization, review sentiment, etc. where the scope is very limited. We would process tens of thousands of these so faster inference with a cheaper model with FT was advantageous.

Disclaimer: this was in the 3.5 Turbo "era" so models like `nano` now might be cheap enough, good enough, fast enough to do this even without FT.
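
For a rough picture of the shape of such a job, here is a minimal sketch. The model id, categories, and prompt are placeholders (not the actual setup described above), and it assumes the current OpenAI Python client; adapt to whatever stack you actually use.

    # Rough sketch of a batch categorization job against a fine-tuned model.
    # Model id, categories, and prompt are placeholders for illustration only.
    from openai import OpenAI

    client = OpenAI()
    CATEGORIES = ["electronics", "apparel", "home", "toys", "other"]  # hypothetical

    def categorize(product_title: str) -> str:
        resp = client.chat.completions.create(
            model="ft:gpt-4o-mini:acme::abc123",  # placeholder fine-tuned model id
            messages=[
                {"role": "system",
                 "content": "Classify the product into exactly one of: "
                            + ", ".join(CATEGORIES)
                            + ". Reply with the category only."},
                {"role": "user", "content": product_title},
            ],
            temperature=0,
        )
        return resp.choices[0].message.content.strip()

    # Processing tens of thousands of scraped titles is then just a loop
    # (or a thread pool / the batch API) over categorize(...).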


I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations. The rumors about an upcoming touchscreen Mac are interesting, perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing. A man can dream..


There are a number of interesting creative apps for iPad that can make full use of its capabilities. A good example is Nomad Sculpt. There's also CAD software, and many DAWs. I haven't tested Numbers yet but I would assume it's fairly well optimized.

This really reminds me of the 80/20 articles that made the frontpage yesterday. Just because a lot of HN users lament the fact that their 20% needs (can't run an LLM or compile large projects on an iPad) aren't met by an iPad doesn't mean that most people's needs can't be satisfied in a walled garden. The tablet form factor really is superior for a number of creative tasks where you can be both "hands on" with your work and "untethered". Nomad Sculpt in particular just feels like magic to me, with an Apple Pencil it's almost like being back in my high school pottery class without getting my hands dirty. And a lot of the time when you're doing creative work you're not necessarily doing a lot of tabbing back and forth, being able to float reference material over the top of your workspace is enough.

At this point Apple still recognizes that there is a large enough audience to keep selling MacBooks that are still general purpose computing devices to people who need them. Given their recent missteps in software, time will tell if they continue to recognize that need.


> There's also CAD software, many DAWs.

Assertions like this are what kill the iPad. Yes, DAWs "exist" but can only load the shitty AUs that Apple supports on the App Store. Professional plugins like Spectrasonics or U-He won't run on the iPad, only the Mac. CAD software "runs" but only supports the most basic parametric modeling. You're going to get your Macbook or Wintel machine to run your engineering workloads if that's your profession. Not because the iPad can't do these things, but because Apple recognizes that they can double their sales gimping good hardware. No such limitations exist on, say, the Surface lineup. It's wholly artificial.

I'm reminded of Damon Albarn's album The Fall, which he allegedly recorded on an iPad. It's far and away his least professional release, and there's no indication he ever returned to iOS for another album. Much like the iPad itself, The Fall is an enshrined gimmick fighting for recognition in a discography of genuinely important releases. Apple engineers aren't designing the next unibody Mac chassis on an iPad. They're not mixing, mastering and color-grading their advertisements on an iPad. God help them if they're shooting any footage with the dogshit 12MP camera they put on those things. iPads do nothing particularly well, which is acceptable for moseying around the web and playing Angry Birds but literally untenable in any industry with cutting-edge, creative or competitive software demands. Ask the pros.


It's such a shame that the iPad has these limitations. It's such an incredible device: lightweight, very well designed, incredible screen, great speakers, etc. I really do feel that if Apple sold a MacBook in the style of a Surface Book, an iPad tablet running macOS which could dock to a keyboard and trackpad with a potential performance boost (graphics card, storage, whatever), it would be my dream device.


All I want is to put Linux on it. I already own copies of Bitwig et al., and if the iPad Mini didn't lock me into a terrible operating system then I might want to own one. But I'm not spending $300 for the "privilege" of dealing with iPadOS.


I think the fundamental barrier is that I have yet to see a system where mouse and touch can coexist as first-class input methods. Either your UI is optimized for touch with large input buttons and heavy reliance on gestures, or mouse with small input buttons that require precision to interact with and keyboard shortcuts for efficiency. The cognitive, not to mention physical burden of transitioning between a pointing device and a touchscreen means that users will favor one over the other. And if your UI has to target both audiences, then you're going to have to figure out how to seamlessly transition your UI or provide 2 parallel workflows, and at that point you might as well just segment your product.

It's easy to blame "Apple greedy" but optimizing either device to support an alternate input method degrades both. Apple is (supposed to be) all about a "polished" experience so this doesn't mesh with their design ethos. Any time I have seen a desktop environment get optimized for touch, people complain about it degrading the desktop experience. MacOS isn't even there yet and people are already complaining.

There are plenty of good AUs on the App Store (to name a few: DM10, Sonobus, the recent AudioKit modeled synths), but yes the selection of AUs on desktop is far greater. Most AU developers aren't going to pay the developer fee and go through the effort of developing, again, an entirely separate user interface, not to mention go through the app store approval process, to target a smaller market. It's a matter of familiarity. Just because your workflow depends on products that don't exist on iPad, doesn't mean that someone else's workflow isn't entirely productive without it. The entire industry is built on path dependence, so it's no wonder that software that has codebases that span decades and depend on backwards compatibility, i.e. the music production and CAD software, are not finding a lot of competition in the mobile space. Apple isn't designing their next unibody Mac chassis on the iPad, but that doesn't mean that a small business that makes 3D printed widgets isn't going to be happy using Onshape.

To be clear: I don't think an iPad is a _substitute_ for a desktop machine in most professional workflows. Partially due to path-dependence, and partially due to the greater information density that a desktop environment affords. But there are some workflows where the iPad feels like a much more _natural_ interface for the task at hand, and then that task can be transferred to the desktop machine for the parts where it isn't.


I would not want to use CAD software or a DAW without a proper mouse and keyboard, and maybe a 3D mouse too. An interface made for touch really isn't suitable. Even connecting a mouse to an iPad is a pretty shitty experience, since all the UI elements are too big and you have to wait around for animations to finish all the time.


Shapr3D is an interesting 3D design tool which has some CAD capabilities and an interface optimized for use with a stylus; Moment of Inspiration was similarly oriented (I really ought to try it).


How is connecting a Bluetooth mouse to an iPad any different than connecting to a computer? Especially with iPadOS 26?


That is just one very simple part, connecting the mouse. Literally everything else sucks on iOS. File management, hidden menus, running multiple apps, system management... and the list goes on. Need to convert a STEP file to something else on the iPad? Download 15 apps to see which one works, then try to find the converted file in the abomination of a file system? iOS is hot garbage.


iPadOS works differently from macOS and if you aren’t willing to learn that, you will think it is bad. The problem isn’t the OS.


What if you've learned how to work around many of iPadOS's limitations and still think those limitations are bad?

Downloading 15 different paid or free-with-in-app-purchases or free-with-ads apps to see which one actually does what it's supposed to do is one of those workarounds. I've learned how to do it and done it a bunch of times and I don't really like it. I much prefer the macOS/Windows/Linux workflow where there's typically some established, community run and trustworthy FOSS software to do whatever conversion you need.


I work with Logic Pro X often. I bought an iPad Pro M4 and the Logic version for it is really compelling. Touch faders and the UI are well thought out. The problem is they want me to subscribe to use it. I wish I could just outright purchase it for $300.


They should charge less if offering a one-time purchase. $300 only beats $50/year after 6-7 years, depending on the discount rate you would use in a present value calculation. In software it's more typical to calibrate that around 2-3 years. I like the design as well.
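
For the curious, the break-even arithmetic is easy to sketch; the 5% discount rate below is just an assumption for illustration.

    # Back-of-the-envelope: when does a $50/year subscription, discounted to
    # present value, overtake a one-time $300 purchase? Assumes end-of-year
    # payments and an illustrative 5% discount rate.
    def npv_subscription(years, annual=50.0, rate=0.05):
        return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

    for years in range(1, 11):
        print(years, round(npv_subscription(years), 2))

    # With no discounting the subscription reaches $300 at year 6; at 5% it
    # takes roughly 7-8 years, which is where a "6-7 years" ballpark lands
    # once you pick a modest discount rate.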


Logic Pro for Mac is $200.


>The problem is they want me to subscribe to use it.

WTH?? This is the first I am hearing this nonsense. Yet another reason why I won't get an iPad even though I am all in on Apple's ecosystem. It seems that Apple sees iPad users as the bottom feeders ripe for exploitation.


Yes but there is simply no reason to have two devices. There are a large number of Windows tablet-laptop combo machines that work perfectly well and prove touch apps work perfectly well on a desktop OS.

Yeah, that took a long time for MS to get to not suck after Windows 8, but touch and tablet interactions on Windows 10 and Windows 11 work perfectly well.


Having owned at least 3 such devices, I have to disagree. It "works", but desktop apps expect desktop interactions, and touchscreen functionality feels cobbled together at best. There are a handful of apps developed specifically for touchscreen PCs that work well, everything else is a toss-up. On the other hand, apps developed for a tablet OS support touch as a first class interaction, and have OS support for hardware keyboard and (usually) mouse input if you so choose. Not to mention that the vast majority of combo machines I have used are too heavy to use as a tablet for any reasonable amount of time, or have an incredibly clunky transition method. I have yet to see a platform where touch and mouse can both coexist as first-class input methods. Even the cognitive load of transitioning is an irritation.


> perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

They've been doing exactly this since the first M1 MacBooks came out in 2020.


> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

Literally everything you do gets the full power of the chips. They finish tasks faster using less power than previous chips. They can then use smaller batteries and thinner devices. A higher ceiling on performance is only one aspect of an upgraded CPU. A lower floor on energy consumed per task is typically much more important for mobile devices.


Right but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms? What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?


I'm pretty sure that users of the announced Blender for iPad port will notice any additional horsepower.


what users?


> but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms?

You probably won’t notice this when using the new machine.

For me, it only becomes noticeable when I go back to something slower.

It’s easy to take the new speed as a given.

> What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?

You would notice it in increased battery life. A CPU that finishes the task faster and more efficiently will get back into low power mode quicker.


Faster can also mean more efficient for a lot of tasks, because the cpu can idle sooner so your battery can last longer, or be smaller and lighter.


"Literally everything" doesn't amount to much if I can't actually control the stupid thing.


Since Apple makes a significant amount of money selling the hardware itself, I really wonder why they wouldn't allow people to install Linux on it, with full support. After all, it's not like this would jeopardize macOS/iPadOS App Store earnings — Linux users would simply buy into Apple hardware they haven't even considered before, and only a fraction of macOS/iPadOS users would switch to using Linux.


Do they disallow it or just not provide active support? Active support requires paying employees to keep it working. Ignoring it and having volunteers do it requires nothing.


You make it sound like these are the only two options; meanwhile, what they most importantly fail to deliver is documentation.

And that's for macOS. For any other platform they actively prohibit any third party operating systems.


I think the comment up one was about Linux on the iPad, which is mostly impossible. Well, iirc there are some projects to get, like, Alpine Linux running inside iOS, but it is emulated or something, and pretty slow, no gui, quite limited, etc etc.


Last I checked, Apple makes more revenue on services than on Mac and iPad combined. With higher profit margins.


Questions for you:

1. If you don't know what to do with it, why did you buy it?

2. If you wanted a general purpose computer, why did you buy an iPad?

3. Which iPadOS limitations are particularly painful for you?


>Which iPadOS limitations are particularly painful for you?

Browser engine lock-in - no Firefox+uBlock Origin = me no buy. And yes, there is Orion, which can run uBlock, but it and Safari have horrible UI/UX.


There are other differences with the iPad Pro lineup unrelated to the SoC. It's just strange to think that a very capable laptop chip is being put into a device with far more limitations.


I'd rather that than an underpowered chip.

It was mentioned, as almost a side comment somewhere, that the M chip is in there for multitasking and higher end image/video editing for "Pros". I could certainly use the M4 in an iPad Pro for iPadOS 26 and its multitasking. I run into occasional slowness when multitasking on my M2 iPad Air.


1. I do know what to do with it. I take notes, a lot, in my work as a doctor. That's been the case since I owned an iPad Air from 2020, which I replaced with an 11-inch M1 iPad Pro (which broke), and I finally caved and bought a 13" iPad Pro to replace it. I ended up getting the M4 model because there just didn't seem to be older ones reasonably available. Even the M1 was more than fast enough for the overwhelming majority of iPadOS applications.

Why an iPad? Android tablets have been... not great for a long time. The pencil is very handy, and the ecosystem has the best apps. Also, I know a few rather handy tricks Safari can do, such as exporting entire webpages as PDF after a full-screen screenshot, that are very useful to my workflow.

2. I already own multiple general purpose computers. They're not as convenient as an iPad. My ridiculously powerful PC or even my decent laptop doesn't allow the same workflow. However, that's not an intentional software limitation, it's a consequence of their form factor, so I can't hold Microsoft to blame. On the other hand, Apple could easily make an iPad equivalent to a MacBook by getting out of the way.

3. The inability/difficulty of side-loading apps, the restriction to a locked down store. Refusing to make an interface that would allow for laptop-equivalent usage with an external/Bluetooth m+k. You can use an external monitor, but a 13" screen should already be perfectly good if window management and M+K usage weren't subpar. Macs and iPads have near identical chips (the differences between an M chip for either are minor), and just being able to run macOS apps on device would be very handy. Apple has allowed for developer opt-out emulation of iOS and iPadOS apps on Mac for a while now, why not the other way around?

If not obvious from the fact that I'm commenting on HN, I would gain utility from terminal access, the ability to compile and run apps on device, a better filesystem etc. Apple doesn't allow x86 emulators, nor can I just install Proton or Wine. If I can't side-load on a whim, it's not a general purpose computer. I can't use a browser that isn't just reskinned Safari, which rules out a great deal of obvious utility. There are a whole host of possible classes of apps, such as a torrent manager, which are allowed on other platforms but not on iPadOS. It's bullshit.

My pc and laptop simply aren't as convenient for the things I need an iPad for, and they can't be. On the other hand, my iPad could easily do many things I rely on a PC for, if Apple would get out of the way. iPadOS 26 is a step in the right direction, but there's dozens left to go.


Thanks for the response. I'd say you are a key target audience for the iPad Pro. I see most of your points. The only point I can't grok is "the CPU is too good", but I suppose that is more about lamenting the lack of Mac-like functionality than the CPU.

All I can say is: stay tuned.


> The rumors about an upcoming touchscreen Mac are interesting

What rumors have you seen? Anytime I've seen speculation, Apple execs seem to shut that idea down. Is there more evidence this is happening? If anything, Apple's recent moves to "macify" iPadOS indicate their strategy is to tempt people over into the locked down ecosystem, rather than bring the (more) open macOS to the iPad.


Current rumors point to the M6 generation of MBPs being a significant redesign and featuring an OLED touch panel screen.

I don't understand the appeal, even a little bit. Reaching up to touch the screen is awkward, and every large touch panel I've used has had to trade off antiglare coating effectiveness to accommodate oleophobic coating. For me, this would be an objective downgrade — the touch capability would never get used, but poor antiglare would be a constant thorn in my side. I can only hope that it's an option and not mandatory, and I may upgrade once the M5 generation releases (which is supposedly just a spec bump) as insurance.


Smudges are off-putting... but, there are times when it would be very convenient to be able to scroll or click on a touchscreen. There are times when presenting when a touchscreen would be preferred over a mouse or touchpad. It's not often, but they are nice to have.

And, in regards to smudges, I mean, just don't use the touchscreen unless you have to and problem avoided.

Antiglare can be a thing, but that can be avoided by avoiding strong lighting behind you.


There's still the issue of accidentally triggering things (when e.g. adjusting the screen) and sometimes you don't have control of your surrounding lighting. I'd still prefer touch to be entirely optional.


It's convenient, and it also makes usage of a stylus far easier.

FWIW, I often rotate my Samsung Galaxy Book 3 Pro 360 so that the screen is in portrait mode, then hold the laptop as if it's a book and use a stylus and touch on the display with my right hand, and operate the keyboard for modifiers/shortcuts with my left, or open it flat on a lapdesk.


https://x.com/mingchikuo/status/1968249865940709538

> @mingchikuo

> MacBook models will feature a touch panel for the first time, further blurring the line with the iPad. This shift appears to reflect Apple’s long-term observation of iPad user behavior, indicating that in certain scenarios, touch controls can enhance both productivity and the overall user experience.

> 1. The OLED MacBook Pro, expected to enter mass production by late 2026, will incorporate a touch panel using on-cell touch technology.

> 2. The more affordable MacBook model powered by an iPhone processor, slated for mass production in 4Q25, will not support a touch panel. Specifications for its second-generation version, anticipated in 2027, remain under discussion and could include touch support.


> Anytime I've seen speculation, Apple execs seem to shut that idea down.

They also said they weren’t merging iOS and macOS, and with every release that becomes more of a lie.

https://www.youtube.com/watch?v=DOYikXbC6Fs


Strategies change. That was 7 years ago, pre-Apple Silicon. It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background, etc.


If that were all they were doing, nobody would be concerned. It’s the crapifying of the MacOS in order to make it work fine with a touch interface that drives everybody bonkers about the slow merge.


I have Tahoe and it’s just as good at being a desktop OS as any of the previous OSes. Not sure what you’re referring to.


There have been lots of complaints all over the place that contradict your experience.

One article that talks about it: https://osxdaily.com/2025/09/19/why-im-holding-off-on-upgrad...

For less discerning users maybe the rough edges aren't that noticeable. But the point of choosing Apple products is you should be a discerning consumer.


That article basically mentions that they’ve heard some apps crash a bit, but it’s anecdotal and not uncommon with beta/new upgrades before a patch or two, and that he personally dislikes or has trouble with some of the transparency or other design changes.

Neither of those things worry me personally, and I think the previous user calling it a “crappification” is still somewhat of an overreaction. Obviously from an accessibility standpoint transparency/legibility is important but as far as I’m aware tweaks are being made and these things can also be turned off or modified in accessibility settings.


This is a pretty insulting comment. I’m sure there is a better word than discerning.


I was actually trying to be more neutral :\ rather than saying something like "consumers with taste".

The point I'm trying to make is that "Apple consumers" are more critical.


I understand. I have been using a Mac since 1984 and I actually like glass more than the flat aesthetic we have been through. I see it as closer to Aqua and a subtler skeuomorphic effort. I have reported to Apple some of the problems liquid glass has. I see Liquid glass as better than what we had before.


Also, me having a different experience from you doesn’t make me any less critical or discerning, or mean I have less taste.

There’s no good way to phrase a thought that is fundamentally flawed.


Thank you


> That was 7 years ago, pre-Apple Silicon.

There have been rumours of Apple wanting to shift Macs to ARM chips for 14 years. When they made that announcement, they already knew.

https://en.wikipedia.org/wiki/Mac_transition_to_Apple_silico...

It was obvious it was going to happen. I remember seeing Apple announcing iPads doing tasks my Mac at the time could only dream of and thinking they would surely do the switch.

> It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background

The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.


> When they made that announcement, they already knew.

Yep, the ongoing convergence made that pretty clear. The emphatic "No" was to reassure 2018's macOS developers that they wouldn't need to rewrite their apps as xOS apps anytime soon, which was (and is) true 7 years later.

This is the same session where Craig said, "There are millions of iOS apps out there. We think some of them would look great on the Mac." and announced that Mojave would include xOS apps. Every developer there understood that, as time went on, they would be using more and more shared APIs and frameworks.

> The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.

That ship has sailed, but it's also completely overblown.


> but it's also completely overblown.

Speak for yourself. I for one despise the current direction of the Mac and the complete disregard for the (once good) Human Interface Guidelines. It’s everywhere on macOS now.

Simple example: the fugly switches which replaced checkboxes. Not only do they look wrong on the Mac, they’re less functional. With checkboxes you can click their text to toggle them; not so with the switches.

I’m not even going to touch on the Liquid Glass bugs, or I’d be writing a comment the length of the Iliad.


My apologies, I thought the "IMHO" was implied.

You'll be happy to know that checkboxes still exist and work like you'd expect. https://imgur.com/a/p2Xe1WL

Apple provides HIG guidance on switch vs. checkbox toggles here: https://developer.apple.com/design/human-interface-guideline... It boils down to, "Use the switch toggle style only in a list row".


Chances that there are both a folding iPhone and a touchscreen Mac somewhere in the skunk works of Cupertino are 100%.

The Apple Vision Pro was a far more extreme product and was kept pretty well under wraps (though a market failure).


The line for market success for a first generation, $3500 VR headset is drawn in different places for different people.


Market success? I'd go with profitability.

Beta quality and expense without value are just some of the reasons for its failure.


It'll get even weirder if the rumoured MacBook Lite with an iPhone processor ends up happening. Incredibly powerful tablets constrained by a dumbed down operating system, sold right next to much weaker laptops running a full fat desktop environment.


Well A19 Pro beats M1 in benchmarks so while the rumored MacBook might be weaker than mid to high-end iPads, it won’t be a slow machine in general.


Is that really rumored? Sounds like kind of a weak rumor to me. The MacBook Air already exists.

Apple already makes low cost versions of those, which are the previous models that they continue to manufacture.


https://www.macrumors.com/2025/06/30/new-macbook-with-a18-ch...

Apparently there are references to it in macOS already.


Interesting, seems like an odd choice but maybe the smaller SoC package brings costs down. I wonder if Apple will finally be making the $500 laptop they always claimed they couldn’t make.


The iPadOS limitations are largely orthogonal to being able to make use of the available performance, IMO. For example, search in large PDFs could certainly still be faster, and I don’t think it particularly suffers from iPadOS limitations.


I buy the higher end Apple products not because I plan to use all their power immediately, but because I keep my devices a very long time and want them to retain usability right to the end.


Same here. My launch-day M1 MBP is starting to show its age finally, M5 with twice the perf will be a nice upgrade.


Is it, though? I feel like everything on my M1 is still as snappy as it was on day 1. My previous MacBook definitely showed its age after 4 years, but I'm happy to use this one for at least another 2-4.


Mine is an M1 Max and gives me no gripes after four years. Like you, I also felt as though past laptops felt their age sooner. I'm typically using Photoshop, Lightroom, Resolve, Docker and other usual stuff at any given time.


I wasn't super informed on the Apple silicon laptops, so I was kind of disappointed when my last job gave me a 2-3 year old M1 Max laptop.

It blew the doors off every other laptop and desktop I've had (before or since).

When I think back to how quickly obsolete things became when I was younger (ie 90s and 00s), today's devices seem to last forever.


For me it was the memory limits more than CPU speed. Discord, Slack, Teams and a browser, and 16gb on my M1 was basically used up.


And here I am struggling with the 32gb version, always need more :P


> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)


I disagree. For a lot of the personal computing era, the problems with OSes and hardware were mostly a matter of technical progress. The problem with iPadOS is totally different; it's a problem that was basically manufactured in and of itself, and completely artificial at that. I do not think this is a good problem to have at all.


I don’t think you’re representing the state of iPad accurately.

In iPadOS 26, more extensive multi-window multitasking like Mac was added.

The quantity of windows you can keep open at once depends on your iPad’s SoC.

If you have a newer Pro iPad with more memory you can keep more of them open and slow down happens further down the rabbit hole.

The hardware is being pushed and used.

As another example, the iPad has pretty intensive legitimate professional content creation apps now (Logic Pro, Final Cut Pro, and others).

So there are people out there pushing the hardware, although I’ll join those who will say that it’s not for me specifically.


I don't suggest the problem is that the hardware can't be "pushed and used", the problem is that the hardware is being artificially limited by Apple for some unknowable reason. (Well, the reason is knowable, but I'm sure some would dispute it anyways. It's very clearly an extreme defense of their 30% cut on all software that runs on iOS devices.) This is not a question, it doesn't matter what people are doing with iPads, it really is happening. A good example, the first iPad with hardware virtualization support in its CPU could initially run VMs provided you had a jailbreak, but then Apple entirely removed the virtualization framework from the following iPadOS update.

There is no particular reason a general purpose computer should be "not for me specifically" in terms of what you can do in software. In terms of design, sure. But not in terms of what you can do in software.

(I have a suspicion the same reason is responsible for why you basically don't find open source software on iOS devices the way you would on even Android or Windows; it doesn't make any money to take a cut out of.)


I suppose that's an interesting way of framing it, yet in my gut I feel like I am paying for something that I am locked away from.

Sometimes, though, YouTube will make the iPad uncomfortably hot and consume the battery at an insane pace.

So, I guess there's someone using the performance.


>> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

> It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)

For anyone who works with (full-size) image or video processing, the hardware is still the constraint... Things like high-ISO noise reduction are a 20-second process for a single image.

I would be happy to have a laptop that was 10x as fast as my MacBook Pro.


I don't think hardware has been a real constraint since the Pentium era. We've been living in a world of CPU surplus for close to 2 and a half decades, now.


I've been RAM limited more than CPU limited for some time. In my personal workflows, 32GB was not enough and I'd receive out of memory errors. I bumped that up to 64GB and the memory errors went away. This was in a Hackintosh so RAM upgrades were possible. I've never tried an M* series chip to see how it would behave with the same workflow with the lower RAM available in affordable machines.


> Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

Did everyone forget that these chips started in general purpose MacBooks and were later put in the iPad?

If general purpose computing is the goal you can get a cheap Mac Mini


> can't figure out what to do with even a fraction of the horsepower

That's sort of the funny thing here. Apple's situation is almost the perfect inverse of Intel's. Intel fell completely off the wagon[1], but they did so at exactly the moment where the arc of innovation hit a wall and could do the least damage. They're merely bad, but are still selling plenty of chips and their devices work... just fine!

Apple, on the other hand, launched a shocking, world-beating product line that destroys its competition in basically all measurable ways into a market that... just doesn't care that much anymore. All the stuff we want to spend transistors on moved into the cloud. Games live on GPUs and not unified SOCs. A handful of AI nerds does not much of a market make.

And iOS... I mean, as mentioned, what are you even going to do with all that? Even the comparatively-very-disappointing Pixel 10 (I haven't even upgraded my 9!) is still a totally great all-day phone with great features.

[1] As of right now, unless 18A rides in to save them, Intel's best process is almost five YEARS behind the industry leader's.


It’s surprising to me MacBooks have such low market share. I got my first Mac after using Windows all my life and I’m stunned. The laptop:

1. Lasts all day on battery

2. Doesn’t get hot

3. Compiles code twice as fast as my new Windows desktop

I really don’t like macOS but I’ve shifted to recommending Mac to all my friends and family given the battery, portability, and speed.


I won't buy or recommend one just on principle. I've spent way too much of my life advocating for open firmware and user-customizable systems to throw it all in the trash for a few hours of battery. I absolutely tell everyone they're the best, and why, but my daily driver has been a Linux box of some form (OK fine I have a windows rig for gaming too) for decades, and that's not changing.

Also, again, most folks just don't care. And of the remainder:

> Compiles code twice as fast as my new Windows desktop

That's because MS's filesystem layer has been garbage since NT was launched decades ago and they've never managed to catch up. Also, if you're not comparing apples to apples and are measuring native C/C++ builds: VS is an OK optimizer but lags clang badly in build speed. The actual CPU is faster, but not by nearly 2x.


>> Compiles code twice as fast as my new Windows desktop

>That's because MS's filesystem layer has been garbage since NT was launched decades ago [...]

I confess that this kind of excuse drives me batty. End users don't buy CPUs and filesystems. They buy entire systems. "Well, it's not really that much faster, it's just that part of the system is junk. The rest is comparable!" That may be, but the end result for the person you're talking to is that their Windows PC compiles code at half the speed of their Mac. It's not like they bought it and selected the glacial filesystem, or even had a choice in the matter.

That's right up there with "my Intel integrated graphics gets lower FPS than my Nvidia card." "But the CPU is faster!" Possibly true, but still totally irrelevant if the rest of the system can't keep up.


> End users don't buy CPUs and filesystems. They buy entire systems. [...] Possibly true, but still totally irrelevant if the rest of the system can't keep up.

At least historically for hardware components of PCs, this was not irrelevant, but the state of things:

You basically bought some PC as a starting basis. Because of the high speed of improvements, everybody knew that you would soon replace parts as you deemed feasible. If some component was not suitable anymore, you swapped it (upgrade the PC). You bought a new PC if things got insanely outdated, and updating was not worth the money anymore. With this new PC, the cycle of updating components started back from the beginning.


But that still doesn't explain away "oh, it's only slow because the filesystem is so slow". Assuming that's true, that's a very integral part of the system that can't readily be swapped out by most people. You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."


> You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."

If performance is so critical, people do find ways around this. Just to give an arbitrary example since you mention file systems:

Oracle implemented their own filesystem (ASM Cluster File System (ACFS)):

> https://en.wikipedia.org/w/index.php?title=Oracle_Cloud_File...

"ACFS provides direct I/O for Oracle database I/O workloads. ACFS implements indirect I/O however for general purpose files that typically perform small I/O for better response time."


> I confess that this kind of excuse drives me batty.

The use case was "compiling code". My assumption was that anyone buying hardware for that application would understand stuff like filesystem performance, system tuning, also stuff like "how to use a ramdisk" or "how to install Linux".

Yes, if you want to flame about the whole system experience: Apple's is the best, period. But not because they're twice as fast, that's ridiculous.


It definitely depends on what circles you run in. When someone I know or is a degree of separation away from me pulls out a PC, it is always a little bit of a surprise.


Regarding market share and your friends and family recommendations, you’re thinking first world. Rest of the world wants and can only afford sub-$500 laptops.


I’ve found that the $1000 Mac laptop is worth about $500 after 3 years and the $500 laptop is worth $50. The price difference over time really isn’t that big and the Mac is going to have a better trackpad and display and longer battery life.


Yeah but in the longer term the price trends to $0 either way, and Windows will get software support for longer.

My mom is happily using a Lenovo from 2013 and looking to upgrade because it doesn't support Windows 11 and Win10 is running out of support. A contemporary Mac would have been the 2012 Mac Mini which would have received its final OS update with 10.15 Catalina in 2019, and would have received its final security update in 2022. (Desktop, so no difference in peripherals, etc.)

Incidentally, I actually purchased both the Lenovo and a 2012 Mac Mini (refurb) so I have the pricing data - the Lenovo was $540 and the Mac Mini was $730 - and then both took aftermarket memory and SSD upgrades.


That just means that the not-Mac is way more accessible. The high resale value makes Macs more expensive overall for everybody.

Also a lot of people prefer Windows. It’s got a lot more applications than Mac. It has way more historical enterprise support and management capability as well. If you had a Mac at a big company 20 years ago the IT tooling was trash compared to Windows. It’s probably still inferior to this day.


> It’s got a lot more applications than Mac.

The Mac can (legally) run more software than any other computer. Aside from macOS software, there's a bunch of iOS and iPadOS software that you can run, and you can run a Windows, Linux, and Android software via VMs.


Yeah…I don’t think so. Moving the goalposts to include Parallels/VMs and iOS/iPadOS apps that lack a touch screen on the Mac and are frequently blocked from being run on the Mac by developers doesn’t count.

Let’s not forget that you’re now talking about buying a $100/year license; in just a few years you could buy a whole Windows computer with a permanent license for that money.

And if you’re going to talk about how great VMs are on Mac we can’t leave out how it’s the worst Docker/podman platform available.


If your $1000 MacBook breaks after a year you need $1000 to repair it.

A $500 laptop is probably more repairable, and worst case you pay $500 to get a new one. Not to mention battery replacement etc.

The expected total cost of ownership is very high for a Mac. It’s like owning a Mercedes. Maybe you can afford to buy one, but you cannot afford maintenance.


As a sibling comment said, what maintenance? The only problem I’ve ever had with any Mac was a bad keyboard on my M4 MBP, and that showed itself so quickly that even without AppleCare it would have been covered.

Between work and personal, I’ve had an Intel Air, 2x Intel Pros, M1 Air, 2x M3 Pros, and an M4 Pro. My wife has an M1 Air. My in-laws have an M3 iMac. My mom has… some version of an Apple Silicon laptop.

That is a decent amount of computers stretching over many years. The only maintenance required has been the aforementioned keyboard.


My shift button popped off my M1 MacBook. Apple judged it was my fault. Guess the repair price. Yes, almost the full laptop price.

If that had happened to any other laptop I would be able to either replace just the broken keycap, or just the keyboard.

And no, AppleCare+ that covers accidents is not cheap either at $150/year.


Oh, come on. Laptops are mobile devices that live in bags and backpacks and they break all the time. I've had more laptop failures than cracked phones, even. You absolutely need an answer for "what happens if my screen gets cracked", just ask any college student. Windows junk is cheaper, it just is.

In pre-college education, the answer is often "use any other junky Chromebook from anywhere in the world", which is cheaper still.


Maybe you need a better backpack? I’ve had zero cracked screens. I’ve also never cracked my phone screen, though, so there’s that.

I did drop my watch last week, and the second hand fell off, though.


The very existence of the Genius Bar falsifies your point, though. The fact that you, personally, are exceedingly careful about your devices isn't an argument against the clear truth that (1) the rest of us yahoos clearly aren't and (2) macs are expensive to repair.


What maintenance? AppleCare also exists if you worry about such things.


Larger initial purchases are harder on lower-income earners regardless of the long-term value they offer; that's one of the hard parts about being poor: it also makes positive economic decisions harder to accomplish.


> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower

Look at glassy UIs. Worth it.


AFAICT, lots of "AI"-related stuff runs slowly on M1, M2, M3, and M4.

I don't know if this already exists but it would be nice to see these added to benchmarks. Maybe it's possible to get Apple devices to do stable diffusion and related tech faster and just needs some incentives (winning benchmarks) for people to spend some effort. Otherwise though, my Apple Silicon is way slower than my consumer level NVidia Silicon


No, iPad Pro won't be faster than 4090s or 4070s (or even 5% of the speed of 4090).

But newer chips might contain Neural Accelerator to close the gap a little bit (i.e. 10%??).

(I maintain https://apps.apple.com/us/app/draw-things-ai-generation/id64...)


What improvements did the A19 Pro provide for Draw Things?



That's amazing! Curious how this will translate to the M5 Pro/Max Macs...


Emulators, because iPadOS doesn't allow JIT compilation (dynamic recompilation), so you need as much raw CPU as possible.


I'd have liked more explanation of the actual solutions that programmers used at the time.


For checking? Just a lookup on disk (no db, just a large list with a custom index, then binary search in the retrieved block). Decoding anything was slow, and in-core was basically out of the question [1]. Caching was important, though, since just a handful of words make up 50% of the text.

I once built a spell checker plus corrector which had to run in 32kB under a DOS hotkey, interacting with some word processor. On top of that, it had to run from CD ROM, and respond within a second. I could do 4 lookups, in blocks of 8kB, which gave me the option to look up the word in normal order, in reverse order, and a phonetic transcription in both directions. Each 8kB block contained quite a few words, can't remember how many. Then counting the similarities, and returning them as a sorted list. It wasn't perfect, but worked reasonably well.

[1] Adding that for professional spell checking you'd need at least 100k lemmata plus all inflections plus information per word if you have to accept compounds/agglutination.
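
A minimal sketch of that kind of lookup, with the on-disk layout invented for illustration: a sorted word list split into fixed-size blocks, plus a small in-memory index of first words, and the "disk" faked as a list of blocks.

    # Sketch of the "sorted list + small in-memory index + search within one
    # block" scheme described above; block size and layout are illustrative.
    import bisect

    BLOCK_SIZE = 8 * 1024  # 8 kB blocks, as in the CD-ROM checker above

    def build_blocks(sorted_words):
        blocks, current, size = [], [], 0
        for w in sorted_words:
            if size + len(w) + 1 > BLOCK_SIZE and current:
                blocks.append(current)
                current, size = [], 0
            current.append(w)
            size += len(w) + 1
        if current:
            blocks.append(current)
        index = [b[0] for b in blocks]  # first word of each block stays in memory
        return index, blocks

    def contains(word, index, read_block):
        i = bisect.bisect_right(index, word) - 1  # pick the block via the index
        if i < 0:
            return False
        block = read_block(i)                     # one "disk" read...
        j = bisect.bisect_left(block, word)       # ...then binary search inside it
        return j < len(block) and block[j] == word

    index, blocks = build_blocks(sorted(["apple", "banana", "cherry", "melon"]))
    print(contains("cherry", index, lambda i: blocks[i]))  # True
    print(contains("grape", index, lambda i: blocks[i]))   # False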



The article is about fitting large dictionaries into small memory footprints. Writing a 200K word spell checker on a machine with only 256K memory.

When you need to store your dictionary in under 1 byte per word, a trie won't cut it.


The limit given in the article is 360 KB (on floppy). At that size you can't use tries; you need lossy compression. A Bloom filter can get you 1 in 359 false positives with the given word list size: https://hur.st/bloomfilter/?n=234936&p=&m=360KB&k=

The error rate goes up to 1 in 66 for 256 KB (in memory only).
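
Those figures are easy to sanity-check with the standard false-positive formula, assuming the calculator's convention of 1 KB = 1000 bytes and a near-optimal hash count.

    # Sanity-checking the numbers above with the standard Bloom filter
    # false-positive formula; assumes 1 KB = 1000 bytes and near-optimal k.
    import math

    def bloom_fp(n_words, kilobytes):
        m = kilobytes * 1000 * 8              # filter size in bits
        k = round(m / n_words * math.log(2))  # near-optimal number of hashes
        p = (1 - math.exp(-k * n_words / m)) ** k
        return k, p

    for kb in (360, 256):
        k, p = bloom_fp(234_936, kb)
        print(f"{kb} KB: k={k}, false positives ~ 1 in {round(1 / p)}")
    # Prints roughly "1 in 359" for 360 KB and "1 in 66" for 256 KB.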


According to https://en.wikipedia.org/wiki/Ispell, ispell (1971) already used Levenshtein distance (although the article does not state whether this already existed in the original version or was added in later years).


Levenshtein distance up to 1, according to that article. If you have a hierarchical structure (trie or a DAG; in some sense, a DAG is a trie, but stored more efficiently, with the disadvantage that adding or removing words is hard) with valid words, it is not hard to check what words satisfy that. If you only do the inexact search after looking for the exact word and finding it missing I think it also won’t be too slow when given ‘normal’ text to spell-check.
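
A minimal sketch of that kind of search, assuming a plain dict-of-dicts trie: walk the trie and allow at most one substitution, insertion, or deletion relative to the query word (the word list and query are made up for the example).

    # Find dictionary words within edit distance 1 of a query, via a trie walk.
    END = "$"

    def build_trie(words):
        root = {}
        for w in words:
            node = root
            for ch in w:
                node = node.setdefault(ch, {})
            node[END] = True
        return root

    def within_one_edit(node, word, i=0, edits=1, prefix=""):
        found = set()
        if i == len(word):
            if END in node:
                found.add(prefix)
            if edits:  # dictionary word has one extra trailing character
                for ch, child in node.items():
                    if ch != END and END in child:
                        found.add(prefix + ch)
            return found
        ch = word[i]
        if ch in node:  # exact match on this character
            found |= within_one_edit(node[ch], word, i + 1, edits, prefix + ch)
        if edits:
            for c, child in node.items():
                if c == END:
                    continue
                found |= within_one_edit(child, word, i + 1, 0, prefix + c)  # substitution
                found |= within_one_edit(child, word, i, 0, prefix + c)      # insertion
            found |= within_one_edit(node, word, i + 1, 0, prefix)           # deletion
        return found

    trie = build_trie(["hello", "help", "hell", "held"])
    print(within_one_edit(trie, "helo"))  # {'hello', 'help', 'hell', 'held'}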



The first article I read about the techniques used in the spell program was the May 1985 issue of Communications of the ACM (CACM for those who know), https://dl.acm.org/toc/cacm/1985/28/5, in Jon Bentley's Programming Pearls column.

Not as much detail as the blog.codingconfessions.com article mentioned above; maybe some of the other/later techniques were added later on?

Link to the online version of the May 1985 Programming Pearls column: https://dl.acm.org/doi/10.1145/3532.315102

The PDF version of that article: https://dl.acm.org/doi/pdf/10.1145/3532.315102


I never thought I'd end up posted on HN! I was wondering why on earth Substack's analytics were showing visitors from here. If anyone has any questions, I'm happy to answer here.


To me, it seemed to be a visual equivalent of that auditory trick where a note seems to descend or ascend in pitch indefinitely. The outer aura of color seemed to be shrinking constantly.


The Shepard Tone - https://en.wikipedia.org/wiki/Shepard_tone - brilliantly used by Hans Zimmer in the Dunkirk score.


The person who guessed 16% would have a lower Brier score (lower is better) and someone who estimated 100%, beyond being correct, would have the lowest possible value.
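
To make that concrete: the Brier score for a single binary event is just (forecast - outcome)^2. A quick sketch, assuming the event resolved true; the forecast values below are only illustrations, since the original guesses aren't shown here.

    # Brier score for a single binary event: (forecast probability - outcome)^2.
    # Assumes the event in question resolved true (outcome = 1).
    def brier(forecast, outcome=1.0):
        return (forecast - outcome) ** 2

    print(round(brier(0.16), 4))  # 0.7056
    print(round(brier(1.00), 4))  # 0.0 -- the lowest (best) possible score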


I'm not saying there aren't ways to measure this (Bayesian statistics does exist, after all), I'm saying the difference is not worth arguing about who was right. Or even who had a better guess.

