> Playing short term games isn't very smart, especially for companies like Google. Or maybe I completely misunderstand the environment we live in.
It could be the principal-agent problem. The agents (employees and management) are optimizing for short-term career benefits and have no loyalty to Google's shareholders. They can quit after 3 years, so reputation damage to Google doesn't matter that much to them. But the shareholders want agents to optimize for longer-term things like reputation. Aligning those incentives is difficult. Shareholders try with good governance and material incentives tied to the stock price on a vesting schedule, but you're still going to get some level of misalignment.
I suppose this is where a cult-like culture of mission alignment can deliver value. If you convince/select your agents (employees) into actually believing in the mission, alignment follows from that.
Yeah, I think that makes some sense. But you would think the CEO and top execs would be trying to balance these forces rather than letting one dominate. You need short-term pressure, but you can't abandon long-term planning for short-term optimization. Anyone who's worked with basic RL systems should be keenly aware of that, and I'm fairly certain they teach it in business school. It's not as if these trade-offs don't come up multiple times a year.
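The RL analogy can be made concrete: the discount factor gamma controls how heavily an agent weights future reward, and a myopic gamma flips its preference toward immediate payoffs. A minimal sketch (the reward streams and gamma values are hypothetical, just for illustration):

```python
# Sketch: a myopic discount factor makes an agent prefer a small
# immediate reward over a much larger delayed one.

def discounted_return(rewards, gamma):
    """Sum of gamma**t * r_t over a reward stream."""
    return sum(gamma**t * r for t, r in enumerate(rewards))

immediate = [1, 0, 0, 0]    # small payoff now (short-term optimization)
delayed   = [0, 0, 0, 10]   # larger payoff later (long-term planning)

for gamma in (0.4, 0.9):
    g_now = discounted_return(immediate, gamma)
    g_later = discounted_return(delayed, gamma)
    choice = "immediate" if g_now > g_later else "delayed"
    print(f"gamma={gamma}: immediate={g_now:.2f}, "
          f"delayed={g_later:.2f} -> prefers {choice}")
```

At gamma=0.4 the delayed reward discounts to 0.64 and the agent takes the immediate 1.0; at gamma=0.9 the delayed reward is worth 7.29 and the preference flips. The corporate analogue is an effective gamma set by vesting schedules and tenure.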
There are other explanations too. Maybe they thought the deception would fly under the radar, so it was rational by a cost-benefit analysis given the available information. Maybe they fell for the familiar psychological bias of overvaluing near-term costs and benefits and undervaluing long-term ones. Maybe some deception crept in internally when the demo was communicated to senior execs. Maybe being second place to OpenAI stung too much, and shame avoidance and prestige seeking kicked in.