I don’t think “performance reviews” in the traditional sense are a silver bullet for finding the sorts of employees to invite to such a retreat. They’re good for finding “top performers” (by definition - you define what performance looks like, score your employees on it, and the ones with the highest score are your “top performers”). I personally don’t think that’s particularly useful, and it’s almost completely useless from the perspective of trying to find up-and-coming employees, or insightful employees, or whoever else is most useful to the company at such a retreat.
Finding “true” top performers requires much more effort than your average mid-large company wants to spend. You need a collaborative effort between individual contributors, line managers, VPs, and the C-suite to get a good idea of who the most important people in the company are. And part of that is politics (making sure that everyone knows who you are and why you’re important) - there’s always a chance that a super critical person slips under the radar.
As an engineer I can tell you off-hand who the most important people in my section of the org chart are, in order of importance. As in, people without whom my org would fail, or at least flounder. Some of them are engineers, some are sales/business liaisons, some are “management” (but frequently do a hell of a lot more than “just” manage). I don’t think a traditional performance review would identify all of them as mission-critical, but it would probably identify them all as performing at least as expected for their roles.
I know at least one fairly critical junior engineer[0] who maybe isn’t the fastest or most efficient worker, but is learning fast and could easily be a top-tier engineer with the right support and growth opportunities. Fortunately my company did identify this and has given them a number of opportunities (including an invite to the retreat, among many others), but a standard performance review would probably score them as “adequate” or some other mid-tier rating. The way this happened for us is frequent “skip-level” meetings, 1-1 or 1-many, as well as quarterly team retreats with all members (up to the VP) present. This allows 3 levels of employees to all interact, and if you keep on top of this as a senior manager or VP (with either an engineering background or an open mind, which everyone in this scenario has) it becomes fairly straightforward to find these people. It’s not a formula, or a set of rules/identifiers, or even a set of guidelines to follow, which makes it tricky to make “fair” - it relies on everyone involved having good judgement.
[0] I know “critical junior engineer” doesn’t sound great, but it’s what happens when you have a rag-tag team of engineers building a product from the ground up with minimal support from the rest of the company (and thus few resources), and then hit it big and suddenly have a largely successful product on your hands that was written by essentially 4 people. Each of those 4 people is a non-overlapping domain expert, and it just so happens that 3 of them are senior engineers and 1 is a junior. Over the last 6 months this has been improving, but a year ago it literally was 1 critical junior engineer who was the only person who knew about 20% of the product.