AI Wealth Truth (78): Why Interviews Are a Game Where Both Sides Lie
A Nash equilibrium in information games: you exaggerate ability, the company exaggerates benefits. Both know it, but cannot stop
I. In interviews, you show the best version of yourself. You hide weaknesses, amplify strengths, package experiences. Do you think the company does not know? They know.
II. At the same time, companies show the best version of themselves. They emphasize growth opportunities, downplay overtime culture, beautify compensation structures. Do you think you do not know? You know.
III. Both sides know the other is exaggerating. But both keep exaggerating. This is a game-theoretic equilibrium.
IV. Let us analyze the game:
V. Your choices: Option A: be fully honest. Admit weaknesses and describe yourself accurately. Option B: moderate packaging. Emphasize strengths and soften weaknesses.
VI. The company's choices: Option A: be fully transparent. Admit problems and describe the work environment accurately. Option B: moderate packaging. Emphasize advantages and downplay problems.
VII. If you are fully honest while other candidates package themselves: you are at a disadvantage. You may not be hired. Honest people get eliminated.
VIII. If a company is fully transparent while other companies package themselves: it cannot attract talent. Candidates choose the company that looks better. Transparent companies get eliminated.
IX. So what is the result?
X. Nash equilibrium: everyone packages. No one dares to be honest first. Because honesty is punished. This has the structure of a prisoner's dilemma.
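The equilibrium above can be sketched as a tiny 2x2 game. All payoff numbers here are illustrative assumptions (not from the article); the point is only the structure: mutual honesty is the best joint outcome, but each side gains by packaging unilaterally, so the only Nash equilibrium is mutual packaging.

```python
# A minimal sketch of the interview game as a 2x2 prisoner's dilemma.
# Payoff numbers are illustrative assumptions, not from the article.

# Strategies: "honest" or "package", for candidate (row) and company (column).
# payoffs[(candidate, company)] = (candidate_payoff, company_payoff)
payoffs = {
    ("honest",  "honest"):  (3, 3),  # accurate match: best joint outcome
    ("honest",  "package"): (0, 4),  # honest candidate loses out
    ("package", "honest"):  (4, 0),  # transparent company loses out
    ("package", "package"): (1, 1),  # mutual distortion: the observed outcome
}

def other(s):
    return "package" if s == "honest" else "honest"

def is_nash(cand, comp):
    """True if neither side gains by unilaterally switching strategy."""
    c_pay, f_pay = payoffs[(cand, comp)]
    return (payoffs[(other(cand), comp)][0] <= c_pay and
            payoffs[(cand, other(comp))][1] <= f_pay)

equilibria = [(c, f) for c in ("honest", "package")
                     for f in ("honest", "package") if is_nash(c, f)]
print(equilibria)  # [('package', 'package')]
```

Check each cell against the deviation test and only (package, package) survives: from every other cell, at least one side improves its payoff by switching, which is exactly why "no one dares to be honest first."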
XI. Everyone knows this game. Candidates know companies are beautifying reality. Companies know candidates are packaging themselves. But no one can change the rules unilaterally.
XII. What problems does this create?
XIII. Information distortion. After joining, both sides discover reality differs from expectation. The work is different, the culture is different, the compensation structure is different. Both sides feel deceived.
XIV. Inefficient matching. The wrong people get hired. The best roles get missed. Time and resources are wasted on bad matches. Social resources are lost.
XV. Trust costs. Because you assume the other side may be lying, you need extra verification. Background checks, probation periods, multiple interview rounds. These are costs created to fight deception.
XVI. In the AI era, this game escalates.
XVII. AI helps you package better. ChatGPT can help you write a perfect resume and cover letter. Mock interviews can help you practice answers. Packaging becomes cheaper and more common.
XVIII. Companies use AI to screen. AI analyzes resumes, evaluates candidates, predicts performance. But AI can also be fooled by AI-optimized resumes. The game around AI screening escalates.
XIX. Truth becomes harder to tell. AI-written reference letters, AI-optimized project descriptions, AI-generated portfolios. What did you do, and what did AI do? Verification costs rise.
XX. Is there a way out?
XXI. Method 1: verifiable records. Not what you say you did, but public evidence of what you did. GitHub commit history, publicly published content, traceable results. Let results speak and shrink the space for claims.
XXII. Method 2: long-term relationships reduce deception. Referrals are more reliable than cold applications, because the referrer's reputation backs it. Long-term partners are more reliable than strangers. Repeated games reduce incentives to lie.
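Why repetition changes the incentives can be made concrete with a standard repeated-game argument (a grim-trigger sketch; the payoff numbers are again illustrative assumptions): deceiving pays off once, but destroys the honest relationship forever after, so when future rounds matter enough, honesty wins.

```python
# A hedged sketch of why repeated interaction discourages lying.
# Illustrative one-shot payoffs assumed here: R = 3 for mutual honesty,
# T = 4 for deceiving once, P = 1 for mutual packaging thereafter.
R, T, P = 3, 4, 1

def honesty_sustainable(delta):
    """Grim trigger: compare honest payoff R every round against
    deceiving once (T) and then mutual packaging (P) forever,
    with future rounds discounted by delta in [0, 1)."""
    cooperate = R / (1 - delta)
    defect = T + delta * P / (1 - delta)
    return cooperate >= defect

print(honesty_sustainable(0.1))  # one-off contact: lying pays -> False
print(honesty_sustainable(0.9))  # long-term relationship -> True
```

With these numbers, honesty is sustainable once delta >= (T - R) / (T - P) = 1/3: a stranger met once (low delta) has every incentive to package, while a referrer whose reputation is on the line round after round (high delta) does not.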
XXIII. Method 3: probation and gradual trust. Do not make one huge decision at once. Start with a small project to verify fit. Reduce the impact of information asymmetry.
XXIV. As an individual, how do you play this game?
XXV. 1. Moderate packaging is necessary. Being fully honest is a losing strategy in this game. Understand the rules and optimize within them. But do not cross ethical lines.
XXVI. 2. Build unfakeable signals. "I have done it" is a stronger claim than "I say I can." Accumulate public, verifiable achievements. Signals are more credible than statements.
XXVII. 3. Do reverse due diligence on the company. You can investigate the company too. Look at employee reviews, turnover, real work content. Information games go both ways.
XXVIII. Interviews are a game where both sides lie. Not because people are naturally dishonest. But because the game structure incentivizes deception. In this structure, honesty is punished. In the AI era, deception is easier and verification is harder. You need a better strategy. Build unfakeable signals. Build long-term relationships with credible people. Find the real in a world full of packaging.