I'm sorry, I don't get your point at all, and I have no idea what you mean by "did this". If you asked for an embedding, you would have gotten a 768-dimensional (or whatever) array, right?
For word2vec I know there are a bunch of demos that let you do the king - man + woman computation, but I don't know how you'd do this with modern embeddings. https://turbomaze.github.io/word2vecjson/
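The arithmetic itself is the same regardless of where the vectors come from: subtract, add, then find the nearest neighbor by cosine similarity. Here's a toy sketch in numpy; the 4-dimensional vectors are made up for illustration (a real model would give you something like 300-768 dims, fetched from word2vec or an embedding API):

```python
import numpy as np

# Made-up toy "embeddings" for illustration only.
vocab = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.1, 0.0]),
    "woman": np.array([0.1, 0.1, 0.9, 0.0]),
    "apple": np.array([0.0, 0.1, 0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c, exclude=()):
    # a - b + c, then nearest remaining word by cosine similarity
    target = vocab[a] - vocab[b] + vocab[c]
    candidates = [w for w in vocab if w not in exclude]
    return max(candidates, key=lambda w: cosine(vocab[w], target))

# king - man + woman, excluding the input words themselves
print(analogy("king", "man", "woman", exclude={"king", "man", "woman"}))
# → queen (with these toy vectors)
```

The `exclude` set matters: the raw nearest neighbor of king - man + woman is often "king" itself, which the classic word2vec demos quietly filter out.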
You can test this hypothesis with some clever LLM prompting. When I did this, I got "male monarch" for "king" but "British ruler" for "queen".
Oops!