
(Supposedly) Lee Gamble's comment on Reddit:

"Hi. I am the original taker of the photo. There is a screen that normally shows Peppes Pizza advertisements in front of Peppes Pizza in Oslo S. The advertisements had crashed, revealing what was running underneath the ads. As I approached the screen to take a picture, it began scrolling with my generic information: that I am a young male (sorry my profile picture was misleading, not a woman), wearing glasses, where I was looking, and whether I was smiling and how much. The intention behind my original post on Facebook was merely to point out that people may not know that this sort of demographic data is being collected about them merely by approaching and looking at an advertisement. The camera was not, at a glance, evident. It was meant as informational, maybe to point out what we all know or suspect anyway, but just to put it out in the open. I believe the only intent behind the data collected is engagement and demographic statistics for better-targeted advertisements."

Source: https://www.reddit.com/r/norge/comments/67jox4/denne_kr%C3%A...



Not Lee Gamble. He just shared the photo without any source. See this article: https://translate.google.com/translate?sl=auto&tl=en&js=y&pr.... Photo taken by Jeff Newman.


It is still a BIG ethical issue for some people. Personally, I see this as just the natural progression of where we are headed. If we don't have rules about this kind of technology, it will very much be "Minority Report" in a decade.


How is it an ethical issue for strangers to look at your face when you are out in public?


More like strangers taking pictures of you without your consent (and often knowledge) with the intent to increase their profits and not sharing any of that with you.


And then able to follow you and know your present location whenever you go by a camera.

"Minority Report" personalized advertising scene: https://www.youtube.com/watch?v=7bXJ_obaiYQ


Ethically, neither your consent nor your knowledge is required for someone to see you in public and remember that image. Why they do it isn't really relevant. If they use that image to do something unethical, like commit fraud, it is the fraud that is unethical, not the imaging.


I think many people would disagree. It might be legal, but that doesn't mean it's ethical.

If a stranger on the street started following me, taking pictures without permission, and taking notes about my appearance or actions and storing it all in a database, I would say he was behaving unethically.

Ask street photographers - it's a delicate balance. Many people really dislike having their pictures taken without their permission.


How does this compare to the pre-automation practice mentioned above of cashiers manually making a tally of how many men/women of each age group were visiting?

I mean, this is literally the "global village" coming to fruition. The online shopkeeper knows you just as well as a shopkeeper in a real village - it knows who you are, it remembers all your previous visits, it knows your hobbies (even if you didn't tell him about them, but someone else in the village), it can make suggestions based on that.

When you buy flowers, the village shopkeeper knows not only who's buying them, but also has a good idea for whom these flowers are intended. That's where we're heading.

This is the level of (non)privacy that we historically had, living in much smaller communities than modern cities. The trend of more anonymity brought by urbanization is reversing, but it's not something new or horrible, if anything, the possibility of being just another face in the crowd is an anomaly that existed for a (historically) short time and is slowly coming to an end once more.


That is a lot of words to simply say that some people think it is unethical. Which is an essentially empty statement. Couldn't you at least say most people and make it an argumentum ad populum?


Lawful and ethical are two totally different things. They are related, but neither implies the other.

ethical != opinion

Ethics has weight: you can lose your job, and even go to jail, for being unethical. RMS actually has a very strong academic ethical mind (even though I disagree with him more than I agree). BUT ethics isn't easily defined.

Here is a decent link to defining Ethics. https://www.scu.edu/ethics/ethics-resources/ethical-decision...


>lawful and ethical are two totally different things.

Mind explaining where you think I implied otherwise? Or why you keep repeating this despite the fact that I haven't?

>Ethics has weight: you can lose your job, and even go to jail, for [what your boss thinks is] unethical.

For example, doctors who get fired for performing abortions, something that most of hn doesn't find unethical.

>https://www.scu.edu/ethics/ethics-resources/ethical-decision...

That is, itself, merely what Manuel Velasquez, Claire Andre, Thomas Shanks, S.J., and Michael J. Meyer find ethical.


Until relatively recently, unavailability of large stockpiles of consumer data (at least, stockpiles at the scale now possible) was a significant impediment to a large and probably mostly-undiscovered class of potentially unethical behavior. Do you not suppose the removal of that impediment, with no other equally powerful compensatory regulations or oversight, to at least potentially be a serious problem now or at any time in the future?


Until relatively recently, unavailability of large stockpiles of consumer data (at least, stockpiles at the scale now possible) was a significant impediment to a large and probably mostly-undiscovered class of potentially ethical behavior, as well as behavior that actively combats unethical behavior. Data itself is amoral and can be used for either good or bad.


Ethical behavior isn't a problem.


You might even say it's the opposite.


This is a gross mischaracterization of the issue. You aren't looking at the bigger picture.

Do we really want to commoditize the simple act of walking down the boulevard? Make every moment in public space (and private digital space!) sliced, diced, and scrutinized by God knows how many data munchers, middlemen, analytics brokers, and ethically challenged people in order to compel as much thoughtless consumer spending as possible, long-term consequences be damned? Allow incredibly detailed profiles to be built up on every person, spanning the decades of their life? And of course, there is always the danger of governments co-opting and abusing this information years or decades in the future, after administrations have come and gone, and laws have been overturned, drastically altered, or ignored. As the tech and the richness of the data increase, the temptations will as well. Well-meaning people can do nefarious things in certain contexts.

I believe our societal institutions and corporate entities are not mature enough to safely handle the power granted by unrestrained, high-resolution data on the entire populace.

Granted, I don't think things would get too terrible without overwhelming protest, but I don't see why we should bet on that.


"You aren't looking at the bigger picture" is just an arrogant way to say "I think you're wrong and I'm right". It can be safely omitted in favor of actual arguments.

>Do we really want to commoditize the simple act of walking down the boulevard?

It's not a boulevard you are walking down, but a bazaar. The only difference is that modern technology allows you to visit the bazaar to be "sliced, diced, and scrutinized by God knows how many data munchers, middlemen, analytics brokers, and ethically challenged people in order to compel as much thoughtless consumer spending as possible" without physically travelling there.

>Allow incredibly detailed profiles to be built up on every person, spanning the decades of their life?

Sure. It's called a relationship. Or a memory.

>And of course, there is always the danger of governments co-opting and abusing this information years or decades in the future, after administrations have come and gone, and laws have been overturned, drastically altered, or ignored.

You can safely replace "this information" with virtually anything useful and get the same effect. Do you feel the same about, say, nuclear weapons? Or legal authority to lock people in cages? I would say either of those is far more dangerous than data. Yet we recognize that the power exists regardless, and the government can at least put it to good use.

>I believe our societal institutions and corporate entities are not mature enough to safely handle the power granted by unrestrained, high-resolution data on the entire populace.

Then the obvious answer is to improve societal institutions and corporate entities, which is useful in and of itself, rather than futilely trying to impede the progress of technology.


> It can be safely omitted in favor of actual arguments.

Fair point, I could have dropped that sentence. I stand by my gross mischaracterization statement, though. Programmatic surveillance is very different from a stranger looking at someone.

> Sure. It's called a relationship. Or a memory.

The profile built up on people by ad brokers and spy agencies is a relationship? I don't think that's how most people would describe it.

> You can safely replace "this information" with virtually anything useful and get the same effect. Do you feel the same about, say, nuclear weapons? Or legal authority to lock people in cages?

Uh, a core part of the problem is this information being coupled with the ability to lock people in cages (or exert power in other ways). Obviously the data by itself is inert and useless. It's what people might do with it that matters.

Important examples would be restrictions on free speech and suppression of dissent. Imagine something like a credit score 2.0, created by analyzing a lifetime of private communication, online activity, and transactional data.

Those websites you visited 12 years ago? It's gonna cost you on your next car loan. And don't even think of running for city council -- the dirt will really come out then. Etc etc.

Obviously, technology brings a lot of great benefits. I'm all for that. I think we should just be aware of new pitfalls it brings as well, and try to account for them.


>The profile built up on people by ad brokers and spy agencies is a relationship? I don't think that's how most people would describe it.

Most people use language woefully imprecisely. The relationship I have with the barista at the cafe near my office isn't the same as the relationship I have with my sister but it is a relationship of the kind that's relevant here. Knowing what I order and when, recognizing me, etc.

>Uh, a core part of the problem is this information being coupled with the ability to lock people in cages (or exert power in other ways). Obviously the data by itself is inert and useless. It's what people might do with it that matters.

A nice thought, but in practice, when we try to fragment this power by privatizing police, prisons, military, firefighting, etc, all of which have many modern examples, things do not turn out well. As unreasonable as it may sound, the evidence suggests it's better to put all the eggs into one poorly run basket.

>Imagine something like a credit score 2.0, created by analyzing a lifetime of private communication, online activity, and transactional data....

Oh, I imagine.

https://news.ycombinator.com/item?id=12499525


> Most people use language woefully imprecisely. The relationship I have with the barista at the cafe near my office isn't the same as the relationship I have with my sister but it is a relationship of the kind that's relevant here. Knowing what I order and when, recognizing me, etc.

Yes, but that is a very different type of relationship with quite different characteristics. I hope it isn't too difficult to infer that I'm arguing not everyone wants these types of relationships. Calling it "just another relationship" is not very helpful for the discussion.

This type of relationship may have significant extended and unforeseen side effects. It's not well constrained, and the preserved artifacts could easily be hijacked for countless unknown purposes decades in the future. It's a fundamentally new paradigm that we don't fully understand yet, and given humanity's historical tendency to abuse new mechanisms of power as they become available, I think some caution is very reasonable.

Perhaps to make my position a little more clear, a key point on why detailed data profiles could be quite dangerous is their scalable and programmatic nature. Never before could a single click of a button identify every individual who has been discussing topic X in the last year, or spit out a list of everyone with 2 degrees of connection to some targeted individual. The same unlimited possibilities that make this stuff exciting to technologists are also why it may be quite dangerous.

These powers are unprecedented. You would need a rotating team of investigators inside every home and every place of business in order to gather this data in previous eras, not to mention even trying to collate and process it. It's equivalent to someone in previous eras standing over your shoulder and writing down every newspaper article you read, taking notes on every conversation you have, etc. Because it is invisible, it doesn't feel this way, but that is what's happening.

> when we try to fragment this power by privatizing police, prisons, military, firefighting, etc, all of which have many modern examples, things do not turn out well

I never suggested we do that?


For me, a far bigger ethical issue is what millions of people share on social networks of their own free will.



