markhalonen's comments | Hacker News

"you will have the average complexion and you will like it" rofl


who are you quoting?


it seems you accept having your skin color changed by the iPhone algorithm... I do not accept that, so I was making light of it


All cameras imprint their own color signature on photos, so I really don't understand what you're talking about. Some people buy exclusively Canon cameras because their JPEG profiles give "good" skin tones straight out of the camera. Does that mean they are "accepting" Canon's opinion of what skin should look like?

Yes. Everyone does, with every manufacturer, and Apple evidently has determined their visual style. At least they also provide you with an optional semi-raw output you can freely edit if you so desire.


The iPhone photo in the blog is from an iPhone 16 Pro, which has the 3 lenses (I am the author).


And which lens was used?


Which lens on my iPhone? I just used the camera app like everyone else. For the good photo, it's a 30mm lens on a Sony a6400 (45mm equivalent)


There are three different lenses on the iPhone 16 Pro. Which one gets used is determined by the "zoom" level you pick. The "0.5x" picks the widest angle lens, the "1x" and "2x" use the same lens, and the "5x" uses the third lens.

If you wish to reduce optical distortion and can get farther away from the subject, you'll want to pick the "5x" zoom. I think somebody else here said it was a 120mm equivalent, which sounds about right.

Intermediate values are obviously crops... although given that the 0.5x and the 1x lenses both have 48MP sensors (IIRC), and the resulting image is typically 12MP, it doesn't make as big a quality difference as one might ordinarily think.
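
To make that concrete, here's a rough sketch of the zoom-to-lens mapping this thread describes. The equivalent focal lengths and crossover points are approximations on my part, not Apple's specs:

    // Rough model of iPhone 16 Pro zoom -> lens selection, per this thread.
    // Equivalents and crossover points are approximate assumptions.
    type Lens = { name: string; baseZoom: number; equivalentMm: number };

    const LENSES: Lens[] = [
      { name: "ultra-wide", baseZoom: 0.5, equivalentMm: 13 },
      { name: "main",       baseZoom: 1,   equivalentMm: 24 },
      { name: "telephoto",  baseZoom: 5,   equivalentMm: 120 },
    ];

    // Pick the longest lens whose native zoom doesn't exceed the requested zoom;
    // anything past a lens's native zoom is a digital crop.
    function lensForZoom(zoom: number): Lens {
      const usable = LENSES.filter((l) => l.baseZoom <= zoom);
      return usable[usable.length - 1] ?? LENSES[0];
    }

    function effectiveMm(zoom: number): number {
      const lens = lensForZoom(zoom);
      return lens.equivalentMm * (zoom / lens.baseZoom);
    }

    effectiveMm(2); // 48 -- a 2x "zoom" is a 48mm-equivalent crop of the main lens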


Yes, but on the camera app, you should set 3x to use the longest lens. This will avoid distortion.


It appears the long lens on that phone is 120mm-equivalent ("5x") and any intermediate zoom is just cropping. A 2x "zoom" (crop) would get pretty close to the field of view of the author's dedicated camera lens, but with further reduced image quality.

Actually using the iPhone telephoto for a group photo like the one shown in the article would require the photographer to stand a considerable distance from the subjects, and then we might start noticing a little perspective distortion from the 45mm-equivalent lens on the Sony.


[flagged]


What’s with this overtly hostile attitude?


I was born in '95, so my childhood is well documented by my mother's digicam. When I look back at the photos, it is very obvious they are way better than the iPhone photos that many parents are taking today.


While I don't disagree, it's good to take into consideration the way people took photos back then vs. now. I'd argue that today photos are more of a commodity than they were back then, so people thought more before they took the shot (at least for some photos).


Woah, I am the author. I don't even have analytics set up on this site, but I hope everyone enjoys it!


Feedback: I absolutely love the idea of doing analysis like this, but it's incredibly frustrating to be shown photos that were clearly taken at different times when the subjects naturally don't look exactly the same. Like for example who's to say that player isn't actually leaning? The second photo sure doesn't prove anything. And comparing them side by side feels like an exercise in frustration.

I would probably (if possible) repeat this idea but with photos taken at the same time, with cameras as close to each other as possible. If at all possible I would also try to use as similar of a lens as possible, if only as a 3rd comparison point to compare the other two to.


The building shot perfectly illustrates all his points; there's very little difference between the two compositions.

The child in the surf is almost identical. Maybe a few ms of difference, look at the foot position.

The facial structure differences in the players were striking despite not being identical shots.


you'll have to believe me when I say they are not leaning. They were just standing there posing for the photo.

Would love for someone else to get more scientific about it, but I think the results would be the same.


> you'll have to believe me when I say they are not leaning. They were just standing there posing for the photo.

I mean, if believing your words were enough to convey the message, then there'd be no point in taking the second photo and comparing them.

The point here isn't whether you're telling the truth (of course you are), it's about being able to see what's going on and get an intuitive feel for what changes and what stays the same. When I said "who's to say they're not leaning" my point wasn't to call you a liar; it was to say that that question is what immediately arises in your audience's brain, and it's completely distracting. Trust can't correct for the visual discrepancy, even if I had taken the photo myself.


I was pretty irked by that as well. The change from smiling to not smiling affects face shape. But at least the building and car photos were stationary enough to illustrate the fisheye quality.


I like the comparisons! I think it's 100% fair to compare the "out of the box" images from the iPhone to other cameras. With that said, some notes:

I think a lot of the differences you're seeing are the result of FOV differences; the iPhone camera is a ~24mm equivalent, which is much wider than most people would shoot on a dedicated camera. That wide-angle distortion is just a natural part of the 24mm focal length, but not really the iPhone's fault.

The other effects you're seeing are related to Apple's default image processing, which, at this point, most people would agree is too aggressive. This difference goes away if you shoot in ProRAW and process your photos in an app that allows you to dial down (or ideally turn off) local tone mapping.

If you have an iPhone that shoots 48MP ProRAW, don't be afraid to crop the image significantly, which increases the effective focal length and makes the image look more like a dedicated camera. It also increases the apparent bokeh, which is actually quite noticeable on close-ups. With the RAW you can then quickly edit the image to end up with colors that are much more faithful and natural.
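
As a back-of-the-envelope sketch of what cropping buys you (illustrative numbers, not exact specs): cropping by a linear factor k multiplies the equivalent focal length by k and divides the pixel count by k squared.

    // Crop math sketch: linear crop factor k scales focal length by k,
    // pixel count by 1/k^2. Numbers are illustrative.
    function cropped(equivalentMm: number, megapixels: number, k: number) {
      return { equivalentMm: equivalentMm * k, megapixels: megapixels / (k * k) };
    }

    cropped(24, 48, 2); // { equivalentMm: 48, megapixels: 12 } -- still plenty of pixels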

If anyone out there doesn't have a Pro model, they can shoot RAW photos in 3rd party camera apps, including Lightroom, which is free.


One observation I'd expected to see is sensor size versus apparent focal length; this might be at least one of the reasons for the distortion. The iPhone camera is ~7mm, which is a ~4x crop factor in 35mm terms, but it's marketed as ~26mm.
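
Those numbers roughly check out: equivalent focal length is actual focal length times the crop factor (full-frame diagonal over sensor diagonal). A quick sanity check, where the sensor diagonal is my own assumption:

    // Sanity-checking the ~7mm / ~4x / ~26mm figures above.
    const FULL_FRAME_DIAG = Math.hypot(36, 24); // ~43.27mm

    const actualMm = 7;        // ~7mm physical focal length, per the comment
    const sensorDiagMm = 11.7; // assumed; yields a crop factor near the quoted ~4x

    const crop = FULL_FRAME_DIAG / sensorDiagMm; // ~3.7x
    console.log((actualMm * crop).toFixed(0));   // "26" -- the marketed equivalent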


There are apps like Halide or Photon that have a Process Zero or TrueRaw mode that is more natural. Of course, a phone is just another tool with different constraints. I gave up paying 2 or 3 times the price of my phone for a dedicated camera. I like the lightness and the integrated software to edit photos and share them on the spot. I made that sacrifice knowing I’ll never have the same quality, but I don’t have to carry a big camera now. But for passionate people who want the best, you can’t replace a dedicated camera with a phone.


What kind of camera was used for the non-iPhone shots?


Sony a6400 with a Sigma 30mm f/1.4, but the child one is from a 2004 digicam, I think a Sony Cyber-shot DSC-W5


Much of the criticism of the iPhone photos is about the fisheye effect. This is exaggerated because you took the photos from different distances. If the iPhone photos were taken at the same distance, a cropped version of the iPhone photo would have identical perspective.


Here's a cool writeup from PC Mag 1995 about accounting software: https://books.google.com/books?id=yurvRCerf_UC&pg=PA273#v=on...

Not much has changed!


I suggest https://perspective.finos.org/ for the built-in data viz. We use DuckDB paired with Perspective for a client-side BI use case, and it's been great.
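
For anyone curious what that wiring looks like, here's a minimal sketch. It assumes @duckdb/duckdb-wasm, @finos/perspective, and apache-arrow; the exact bootstrap varies by bundler and library version, so treat the details as approximate:

    // Query DuckDB-WASM, hand the Arrow result to Perspective. Sketch only.
    import * as duckdb from "@duckdb/duckdb-wasm";
    import perspective from "@finos/perspective";
    import { tableToIPC } from "apache-arrow";

    const bundle = await duckdb.selectBundle(duckdb.getJsDelivrBundles());
    const worker = new Worker(bundle.mainWorker!);
    const db = new duckdb.AsyncDuckDB(new duckdb.ConsoleLogger(), worker);
    await db.instantiate(bundle.mainModule, bundle.pthreadWorker);

    const conn = await db.connect();
    const result = await conn.query("SELECT * FROM 'data.parquet'"); // Arrow table

    // Perspective ingests Arrow IPC bytes directly -- no row-by-row conversion.
    const table = await perspective.worker().table(tableToIPC(result).buffer);
    const viewer = document.querySelector("perspective-viewer") as any;
    await viewer.load(table);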


+1

We're using Perspective in crabwalk[0] (it's like dbt, but built specifically for duckdb and written in rust), and it's amazing paired with duckdb. Near-instant loads for hundreds of thousands of rows, and you can keep everything in arrow.

0 - https://github.com/definite-app/crabwalk


Where are you using/advocating crabwalk?

It does look interesting, but for the local ETL use case, I am missing the pitch versus just having my own collection of SQL scripts. Presumably the all-local case needs less complexity. Unless the idea is that this will eventually support more connectors/backends and work as a full dbt replacement?


A few features:

* Built-in column level lineage (i.e. dump in 20 .sql files and crabwalk automatically figures out lineage)

* Visualize the lineage

* Clean handling of input / output (e.g. simply specify @config output and you can export results to parquet, csv, etc.)

* Tests are not yet implemented, but crabwalk will have built-in support for tests (e.g. uniqueness, joins, etc.)

we're using it in our product (https://www.definite.app/), but only for lineage right now.


Glad you dig it! Check out our pro version too - it also supports DuckDB, Python/Pyodide, and more! https://prospective.co


Wow, that's really cool! Part of my PhD thesis was about writing stable treemapping algorithms for temporal data. The idea is that you want your treemap cells not to fly around like what I'm seeing in your demo, but to remain more or less in the same position without sacrificing too much on the cells' aspect ratios. We've come up with a pretty effective and fast method to do that; check out the paper and a demo below. Maybe we could even do a collaboration to get this implemented in Perspective.

https://github.com/EduardoVernier/eduardovernier.github.io/b...

https://youtu.be/Bf-MRxhNMdI?list=PLy5Y4CMtJ7mKaUBrSZ3YgwrFY... (see the GIT method)


That looks much better. Thanks, I will read up.


Have a look at https://sql-workbench.com as well, as it's using DuckDB WASM & Perspective to render the query results. Let me know what you think!


This is actually how I discovered Perspective!


Hahaha, nice. It's a small world.


The online demo looks great and promising; too bad it's unusable for me. I've tried installing it with conda from conda-forge, and no luck. I've tried installing it with pip: the same. I've also cloned the repository from GitHub and tried to build it, but that failed too; I don't remember the details.

Why some software is so difficult to install beats me.


Have you ever reported an issue? I use Perspective heavily on a variety of platforms, both conda and pypi, without any problems.


Not yet, because I wanted to give it one more try while documenting all the steps.


Why Perspective? If going for a D3 wrapper, Plot would offer more flexibility.


We've built a nice integration for Plot + DuckDB, found here: https://www.duckplot.com/!


I recently evaluated web tech stacks, and my thesis is that Next.js is the most powerful and will win in the long run, but right now it's a bit too new for most web projects -- the ones that are low-traffic, simple apps. Migrating your codebase to another version is just not that fun, and I deemed that still too common in Next.js.


Next.js is already winning: https://trends.stackoverflow.co/?tags=next.js,ruby-on-rails,...

There are many older codebases running those well-established frameworks, but a lot of job positions, new code, and I bet half of the recent YC batch web apps are Next.js. JSX for the frontend is so much better than the other templating systems, imo.


There are trends and vibes, and right now Next.js is winning. Will it be the thing we use in three years? Who knows, but I would not necessarily bet on it, because all of these technologies are eventually replaced by something else.

Next.js in particular is well enough known at this point that some people are intentionally choosing not to use it (because they ran into some of its challenges at their previous company).


> but right now it's a bit too new

Too new? It's 8 years old!


Your thesis isn't very good. Next.js has a huge marketing budget, and that's it.


Implementing double-entry accounting for the 10,000th time in human history, on our ERP/MES product.
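
The core invariant is small enough to sketch; illustrative types, not our actual schema:

    // Double-entry's one rule: each journal entry's debits equal its credits.
    // Sketch only; a real system would use integer cents, not floats.
    type Line = { account: string; debit: number; credit: number };
    type JournalEntry = { memo: string; lines: Line[] };

    function isBalanced(entry: JournalEntry): boolean {
      const debits = entry.lines.reduce((sum, l) => sum + l.debit, 0);
      const credits = entry.lines.reduce((sum, l) => sum + l.credit, 0);
      return debits === credits;
    }

    // A $100 cash sale touches two accounts and stays balanced:
    isBalanced({
      memo: "cash sale",
      lines: [
        { account: "Cash",    debit: 100, credit: 0 },
        { account: "Revenue", debit: 0,   credit: 100 },
      ],
    }); // true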


I can say with 100% confidence that GraphQL via Postgraphile & TypeScript is a stellar tech stack for web applications beyond "hello world".
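
Part of the appeal is how little glue it takes. A minimal sketch, assuming Postgraphile v4 with Express; the connection string is a placeholder:

    // Postgraphile generates a GraphQL API straight from the Postgres schema.
    import express from "express";
    import { postgraphile } from "postgraphile";

    const app = express();
    app.use(
      postgraphile(
        process.env.DATABASE_URL ?? "postgres://localhost/app", // placeholder
        "public", // schema to expose
        { graphiql: true, enhanceGraphiql: true, watchPg: true }
      )
    );
    app.listen(5000);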

If you are building on Postgraphile and raising from angels, dm me.


One starts to understand the imperative of colonizing Mars...

