That made me imagine -- in the future when AI is much more advanced, maybe I could just prompt it with, say, "something sentimental to make my wife cry." I mean, I still came up with the idea, and ultimately it's the thought that counts, right? What's the limit here? Is this some sort of human emotion exploit, or a legitimate bonding experience?
It’s rarely the thought that counts. It’s the committed effort. Presents aren’t nice just because the recipient needed those socks. More importantly, they’re a signifier that you consider the person worth thinking about. You value them enough to spend time and effort thinking about them, and then you follow through. This is why we don’t just give people money as a present.
The effort that you put in is often what people like most about a gift. Don’t try too hard to hack around that.
I'm going to draw this example out to make it more realistic.
"Say something sentimental to make my wife cry" you prompt. The computer comes back:
Ok, tell me a few things about your wife. How did you meet? What are her favorite things? Tell me about some great moments in your relationship. Tell me about some difficult moments in your relationship.
Ok, tell me a few things about you. What do you love about your wife? What have you struggled with?
Ten minutes of this kind of conversation and I'll bet the LLM can generate a pretty good Hallmark card. It might not make your wife cry, but she'll recognize it as something personal and special.
Four hours of this kind of conversation and you might very well get some output that would make your wife cry. It might even make you cry.
The work is adding context. And getting people to add meaningful, soul-touching context is not easy - just ask any therapist.
1. Wives aren't a monolith. The prompt is underspecified; otherwise, individual taste and preciousness are dead.
2. No matter how good today's tech is (or isn't), the responses are very low temperature. The reason it takes a human four hours to write the poem is that the time is spent exploring profoundly new structures and effects. Compare this to AI, which is purpose-built to home in on local optima of medians and clichés wherever possible.
> I mean, I still came up with the idea and ultimately it's the thought that counts right. What's the limit here?
Sociologically, setting the AI discussion aside, I imagine the limit is the extent to which the ideas expressed in the poem aren't outright fabrications (e.g., complimenting their eyes when really you couldn't care less). It also does not sit right with humans if you attempt to induce profound feelings in them through your own less-than-profound feelings; it's not "just the thought," it's also the effort that socially signals the profundity of the thought.
Usually they are. Most people are surprisingly similar and predictable, which is why basic manipulation tactics are so successful. Sure, 10% of people truly are special, but the other 90% have a collective orgasm while listening to whatever the hottest pop star is.
> The reason it takes a human 4 hours to write the poem is because that is time spent exploring profoundly new structures and effects.
Most likely the dude spent four hours doing exactly the same things everyone else does when writing their first song. It's not as though within those four hours he discovered a truly new technique for writing lyrics. Every human who wants to write songs has to go through exactly the same learning steps, while AI does it just once and can then endlessly apply the results.
> it's not "just the thought," it's also the effort that socially signals the profundity of the thought.
In close relationships, yes. When dealing with those you care less about, it's the result that matters.
I think expended effort is what counts here for these types of interactions, and how much of that effort is tailored to the specific person.
I mean, we're almost always standing on the shoulders of other people, and we're almost always using tools. But if the output is fully mechanical and automatic without being tailored for the specific person, it's hard to see it as personal in any way.