"Nothing so dates an era as its conception of the future."
—Brian Eno
Consider the possibility that superintelligent AI, or AGI, is not only technically feasible but coming within a matter of decades.
Suppose that when it arrives, it will combine human-like reasoning with superhuman speed, perfect recall, and arbitrary scalability—a country of geniuses in a data center.
Can you imagine the economic value of your labor, if you do not adapt, slowly trending towards zero over the next decade? Perhaps the concept of employment itself will undergo a deep societal and cultural transformation.
Notwithstanding societal safety nets like UBI, how should the individual best navigate this transition?
Go Upstream
How attached are you to your craft?
I don’t mean what you do; I mean how you do it. You may have beautiful handwriting, with perfect inter-character spacing, consistent letter heights, and just the right, tasteful mix of cursive, casual scrawl, and precision. But when you need to get a lot of writing done, you go to a keyboard and start plonking words onto a document.
On the keyboard you express your craft in a different way; it’s the words you choose, how you put those words together to form sentences, the ideas you choose to articulate, the perspective from which your ideas are formed, the world you see.
This is the hierarchy of craft. It starts with doing and ends with seeing.
At each level, there is a way for you to express your craft. Great researchers have something called research taste, about which much has been written; it is essentially the intuition for deciding which problems to pursue. A “filmmaker whose personal influence and artistic control over a movie are so great that the filmmaker is regarded as the author of the movie” has a special name in our culture: an auteur.
At the time of this writing, AI is starting from the bottom and eating its way up this hierarchy.
In a post-AI world, you should relinquish attachment to expressions of craft lower in the hierarchy.
There is more leverage upstream. The problems there deal more directly with meaning, purpose, and judgment, which remain uniquely human domains. Your competitive advantage lies upstream.
If you’re a builder early in your career, are you confident that you can out-climb AI in a race up this hierarchy? The traditional path takes many years to climb from junior individual contributor, doing low-level craft, to seniority, where higher craft is expressed through optics, politics, and decision-making.
If there are other paths, start hedging risk by keeping them open. If you see none, investigate your constraints and sense of agency.
The ultimate safe harbor is optionality: having more than one way to engage with the market. You can build towards that by investing in yourself, not your employer, whose allegiance is to the market. Build relationships; expand and deepen your network. Develop your skills, and don’t build your castle in other people’s kingdoms.
Assets will evolve
Curation becomes sacred when creation becomes commoditized. The rise of NFTs in 2021 illustrated this phenomenon vividly.
The forms of asset that endure are social assets: assets derived from trust, such as personal brands, audiences, reputation, and networks.
One consequence of this is that trustworthiness becomes a sort of personal moat: a trust moat. As the tide of AI erodes your competitive advantage, what will increasingly matter is how many people appreciate and like what you do, not because you are the best at it, but because they’ve come to trust you, like your favorite barber, who you know will get the job done even though there may objectively be better ones in your city.
This also means that the ability to build trust becomes increasingly important, and so I can see certain traits becoming more valuable: agreeableness, kindness, funniness, and good taste.
We will evolve
Embracing an attitude of lifelong learning, with a focus on adaptability and interdisciplinary knowledge, will serve you well. More importantly, cultivating high agency will be critical. In a world where the individual is hyper-leveraged by being able to throw AI at anything, the ability to decide for yourself what to do and to take action despite uncertainty is a strong differentiating factor.
Uniquely human skills like emotional intelligence, the ability to connect, communicate, and collaborate with others, leadership, humor, and storytelling could become increasingly important as society’s focus shifts towards the top of Maslow’s hierarchy of needs.
I think it’s likely that, in one way or another, we will all become leaders of AI in our personal and professional lives. You might consider the current paradigm of prompt engineering a primitive form of this AI leadership, in which humans act as operators directing the flow of algorithmic intelligence. If so, then the ability to think and write clearly, and to communicate effectively, will become even more important.
Risk mitigation
Consider Pascal’s wager, a 17th-century philosophical argument that a rational person should live as though God exists because of the overwhelming risk asymmetry: if God does not exist, they incur only finite losses; if God does exist, they face either infinite loss or infinite gain.
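One way to make the asymmetry concrete is a simplified sketch of the expected-value argument (the probability $p$ and the finite cost $c$ are illustrative symbols I am introducing here, not part of the original wager):

$$
\mathbb{E}[\text{live as if God exists}] = p\cdot(+\infty) + (1-p)\cdot(-c) = +\infty,
\qquad
\mathbb{E}[\text{live otherwise}] = p\cdot(-\infty) + (1-p)\cdot(+c) = -\infty,
$$

for any probability $p > 0$ and any finite cost $c$: the infinite stakes dominate the comparison no matter how small $p$ is.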
Now consider the modern version with our hypothetical machine god. Even if you don’t believe that AGI is possible, or don’t agree with the current timelines and think it is much farther away, it might still be worth mitigating your future risk by taking some proactive steps now.
I believe the outcomes for individuals at the end of the decade will largely be defined by how they positioned themselves towards the start of it, so look bravely ahead, stay curious, and diversify yourself.
Maintain flexibility in skills and roles; don’t put all your eggs in one basket.
Invest in relationships, communities, and other social assets.
Develop communication and leadership skills.
Stay informed about AI developments, and try to avoid competing with AI; seek instead to work with it.
Go upstream.