A.I. and Being Human

Note: This letter reflects the author’s personal views and is shared for general informational and discussion purposes only. It is not investment advice or an offer to sell or solicit the purchase of any securities. Please see the full disclaimer below.


Founder & CEO Brent Beshore wrote about AI’s practical shortcomings in Permanent Equity’s 2025 annual letter. In the weeks since, as recent rollouts of tools like Claude Cowork and model updates have shifted productivity, a deeper set of questions has emerged.

Part 1: A Return to What It Means to Be Human

I’ve been thinking about a strange, almost upside-down possibility with AI.

For a long time, “the machine” has been a metaphor for the totalizing, industrial, and technocratic system, synonymous with modern civilization, that seeks to replace nature and human, organic life with artificial, controlled structures. Paul Kingsnorth writes about it extensively in his new book Against the Machine. It’s excellent and deeply unnerving. After reading, I turned off my phone for a week just to feel what it would be like. It was both frustrating and delightful.

But the more I’ve chewed on this topic and experimented with AI, which Mr. Kingsnorth has permanently sworn off as the bullseye of machine-ness, the more my heart has changed, or perhaps evolved.

There are two primary stories being told about AI. The first is one of techno-progress abundance that will solve all our problems, as if our problems were primarily a lack of abundance. The second is a story of grave danger, a fundamental fork in history that will lead us into dark robotic rule. One is the path of utopia, and the other of living hell. But what if neither story is what is happening?

Since the industrial revolution, technology’s trajectory has been one of humans becoming more like machines. Modern work has quietly trained us to act like machinery with repeatable tasks, narrow lanes, endless throughput, constant responsiveness, and performance measured by the hour. You can make a good living being a dependable machine…and slowly forget you’re a person.

The assumption is that any technological progress will continue to lead us down that path. Some say it’s worth it, others say it’s not. But what if both analyses are coming at it from the wrong direction?

When AI starts doing the most machine-like parts of the work shockingly well, where does that leave us? Instead of making humanity more like a machine, I wonder if it does the opposite. Could the actual machines remove the “machineness” from the human? What if we’re living in a time of re-enchantment, of re-learning what it means to be human?

If software can execute the repetitive, the scripted, the procedural, the high-volume “move this from column A to column B” work, then what’s left for us is the work that is stubbornly, beautifully human. Not just thinking or making, but something deeper.

In Genesis, humans aren’t introduced as labor, but as rulers. Work is pre-fall, for our good, and God’s glory. We’re introduced as image-bearers. We’re made to reflect God in the world, to be vice-regents, to reign and rule with justice and mercy. This means we aren’t primarily producers. We’re persons. Human beings, not human doings.

That has implications. If you believe people are made in God’s image, then creativity and meaning are fingerprints, not luxuries. Relationships are central, not merely “soft skills.” Discernment, courage, and responsibility are a calling. We are uniquely made to trust, care, tell the truth, own outcomes, and be a neighbor, a friend, and a lover.

AI can imitate outputs. It can predict patterns. It can accelerate execution. But it can’t carry moral weight. It can’t love. It can’t repent. It can’t take responsibility. It can’t look another human in the eye and choose courage over approval. It can’t suffer with someone. It can’t forgive. It can’t be faithful. And it can’t answer the question underneath every technological leap, “What is this for?”

At a fundamental level, human acts are irreducibly social and moral, happening between people, not just between a worker and a task, a problem and a solution. We are not merely moist economic robots seeking pleasure and avoiding pain. Technique, or efficiency, isn’t the point and shouldn’t be the goal. The soul isn’t measured by a stopwatch, nor its worth by its GDP contribution.

Real human flourishing is local, relational, and responsibility-soaked. It’s face-to-face life where you can’t outsource the consequences of your choices and where you can see God’s movement in the eyes of those you know and love. The good life happens in walking the path with others, in sharing both suffering and joy, and not in reaching the summit and the rewards.

Our loves shape us. We become what we worship. If we worship frictionless convenience and use it merely as a means of selfish gain, we become shallow and incurved, turned in on ourselves in self-reliance and self-glorification. If we receive convenience as a gift and use it to refocus our attention on God and others, we’ll become more fully human. Could we use AI to allow us the freedom to become more human, to be more creative and joyful in bearing God’s image, or will AI use us to flatten, systematize, and productize life?

That’s the fork in the road with AI.

AI can absolutely be used to dehumanize, with more surveillance, more speed, more noise, more extraction, fewer jobs with dignity, and fewer reasons and opportunities to look someone in the eye. But it can also be used to rehumanize, with less drudgery, less paperwork, and less rote work. Fewer hours spent being a “human API” means more time in the work that actually gives life. Abundance as a gift to be received and used to bless others creates margin to love more and better. Abundance as gain to be anxiously striven toward and hoarded for comfort, safety, and pleasure will always result in suffering.

I’m optimistic, not because I think AI is harmless, nor because it won’t have far-reaching and painful implications, nor because I think humanity will suddenly wake from its Tower of Babel delusion, but because it might expose a lie we’ve lived under for a while – that your value is your output. If AI can out-output you, then you’re forced to locate your dignity somewhere else, which is exactly the point. Your dignity was never up for competition with a tool, and was never to be earned. Your dignity is bestowed. When the machine can do the machine work, we have fewer excuses to keep asking humans to live like machines, especially ourselves.

My hope is that we use AI to automate the inhuman so we can return to the human. Leaders should stop rewarding performative busyness and start rewarding discernment, creativity, judgment, honesty, clarity, and care. Parents should reclaim attention and focus that attention on what matters most. Creators should create with the lightheartedness of quite literally being able to speak things into existence, like our Creator. Communities should thicken, with margin to sit, sip, and chat slowly, with intention. Work should become less about proving you’re an indispensable cog and more about serving something true. We should become harder to manipulate because we’re less exhausted. More of us should rediscover what it means to be human.

In Psalm 8, the psalmist looks at the vastness of creation and asks, “What is man that you are mindful of him?” AI will tempt us to answer, “a bundle of tasks to be accomplished.” But God’s answer is different. You’re an image-bearer made to reign and rule with truth and love, not a small piece of machinery in some larger machine. If AI helps us stop pretending otherwise, if it frees more people from the tyranny of mindless, endless, and dehumanizing work, then this revolution might not be the eclipse of humanity.

It might be, in a strange mercy, an invitation back to it.

If AI can free us to be more human, it also raises an uncomfortable question about what happens when a powerful amplifier meets an unequal world.


Part 2: AI’s Amplification — The Rising Floor and the Widening Gap

In 1440, Johannes Gutenberg was a craftsman with a good idea. Before him, scribes copied books by hand, with a monk possibly spending years on a single Bible. After him, a printer could produce hundreds of copies in the time it once took to make one. Gutenberg died in modest circumstances, neither wealthy nor famous in his lifetime. But the technology he created transformed the world, made some people extraordinarily rich, and left the scribes looking for new work.

This is the story of technology. It has always been the story of technology and I suspect it always will be.

There’s a theological claim I want to make early, because it frames everything else: human beings are equal in worth and dignity, but not identical in gifting. Stamped on every person is a likeness of God, regardless of their talents, productivity, or social utility. A child with profound disabilities and a Nobel laureate possess the same fundamental dignity. Their value isn’t earned. It’s bestowed.

But the same tradition that insists on equal dignity also acknowledges different gifts. Paul’s letter to the Corinthians uses the metaphor of a body with eyes and hands and feet, each with different functions, none superior in worth, but obviously not interchangeable. The parable of the talents, whatever else it means, starts with an unequal distribution. Some people are faster, stronger, smarter, more creative, more charismatic. This isn’t a bug in creation. It’s the design.

The question isn’t whether differences exist. They do, and always have. The question is what happens when those differences meet the economic and technological context of a particular era.

For most of human history, the differences that mattered were physical. If you could swing a sword, plow a field, or survive childbirth, you had the attributes the world rewarded. Virtue helped, and luck helped more. But the range of outcomes was compressed by the limits of what one body could do. The strongest farmer couldn’t harvest a thousand times more grain than an average one. The best blacksmith couldn’t shoe a million horses. And “productivity” meant something simple: Did we eat today? Did we ward off predators? Did we make it through the winter? The measure was survival.

Then something changed. Technology changed.

The Industrial Revolution didn’t just create new machines. It created new ways for human differences to express themselves economically. Suddenly, the ability to organize capital, to manage complex operations, to think in abstractions became traits that mattered. The physical became secondary in value creation to the intellectual. A factory owner in Manchester could produce more cloth than every weaver in a medieval kingdom combined. The weavers weren’t lazier than their ancestors. They weren’t less dignified as humans. But the world had shifted, and their skills no longer commanded what they once had.

The industrial era also gave us our modern definition of productivity: throughput. How many units per hour? How do we scale? Assembly lines, standardization, interchangeable parts. The stopwatch became the measure of human worth. We’ve been living inside that definition ever since, even when it no longer fits.

To say it plainly: the same person, with the same attributes, born into different technological contexts, will experience radically different economic outcomes. A brilliant logician born in 1200 might have become a monk or a minor court advisor. Born in 1980, he might have become a software billionaire. His giftings didn’t change. The context did.

Technology is an amplifier. This is the core of the thing. It takes talents and multiplies access and impact.

Before the printing press, a great writer could reach a few hundred people. After it, a few million. Before the internet, a talented musician could fill a concert hall. After it, she can fill a billion smartphones. The underlying talent, the differential between one person and another, might be modest. A singer who’s ten percent better than her peers isn’t ten percent richer. She might be 1000X richer, because technology let her reach everyone while her competitors reached almost no one.

Economists call these “winner-take-all” dynamics, but that’s a bloodless way to describe what’s actually happening. Small differences in ability, when run through powerful amplifiers, produce enormous differences in outcomes. Technology reveals inequality that was always latent, waiting for the right context to matter.

And the uncomfortable part is that the amplifiers are getting stronger.

When the leverage was physical – swords, plows, factories – there were natural limits. You still needed bodies. You still needed humans in the loop. But when the leverage becomes informational and algorithmic, when it becomes artificial intelligence writing code and generating content and making decisions at inhuman speed, the limits get harder to find. One person with the right skills, the right timing, and the right access can now generate economic value that would have taken multitudes to produce a century ago. The first computers filled entire rooms and had far less processing power than the phone in your pocket. What seemed like maximum productivity in one era becomes trivial in the next.

I’ve benefited enormously from these dynamics. I run an organization that buys businesses, and I’ve been able to do that in large part because technology let some rando from Columbia, MO make friends with people on the internet, which attracted talent, opportunities, and capital, and…voila. I am writing this from inside the machine, aware that I’m one of its beneficiaries.

But there’s something I don’t want to lose in the emphasis on circumstance: you’re not a passive recipient of historical forces. The amplifier sets the context. It doesn’t make your choices for you.

Two people with identical gifts, in the same city, facing the same technological shift, can diverge dramatically. One leans in, learns, adapts. The other checks out, wishes it away, waits for things to return to normal. The external circumstances are the same. The responses are not.

I’ve been on both sides of this. There have been moments when I’ve disengaged from something new. I told myself I was being principled, discerning, countercultural when really I was just afraid. Learning is hard. Admitting you don’t know something is uncomfortable. It’s easier to dress up avoidance as wisdom. And there have been other moments when I’ve thrown myself into something new without asking whether I should, just because I was anxious about being left behind. Fear wearing the mask of diligence.

The parable of the talents is instructive here. The servant who buried his talent wasn’t condemned for lacking ability. He had ability. He was condemned for fear-based inaction. “I was afraid,” he says, “and I went and hid your talent in the ground.” There’s something in that story about the expectation to engage with what you’ve been given, not to bury it out of timidity.

Engagement isn’t morally neutral. You can adapt while maintaining your integrity, or you can be shaped by the tool in ways that compromise it. You can use a technology to serve others, or you can let it colonize your attention and distort your intentions until you’ve forgotten what you were trying to do in the first place. The choice to engage is only the first choice. How you engage is the one that matters more.

This means responsibility cuts in multiple directions. I can’t simply blame the era I was born into for my outcomes. My choices matter. But I also can’t take full credit for my success, as if it were purely the result of superior decisions and hard work. My choices were made within constraints I didn’t create, with resources I didn’t earn, in a context that happened to reward what I happened to be good at. Agency is real and agency is situated. Both are true.

And agency isn’t equally distributed in practice. A knowledge worker with a stable salary and discretionary time has more bandwidth to “learn AI” than a single mother working two jobs. The freedom to choose how to respond to technological change is itself unevenly allocated. Telling everyone to “just adapt” ignores the fact that adaptation has costs, and those costs fall differently on different people, with the least among us having the smallest capacity to adjust.

This has implications for how we should live. If my success is partly a function of being born with certain cognitive traits, in a certain country, at a certain time, with access to certain networks, and if I’ve benefited from something I didn’t build but something I stumbled into, then gratitude is the appropriate posture, not pride. And generosity should follow.

I think this is why the Christian tradition has always been suspicious of wealth without being dismissive of it. The Bible doesn’t say money is the root of all evil, but instead the love of money is. The difference matters. Wealth can be a tool for tremendous good. It is also a spiritual danger, a temptation to locate your identity, trust, and salvation in something that was only ever meant to be a resource. We receive the gift, but hold it loosely, use it for the good of others, and remember where it came from.

We are entering an era of amplification that will make the previous ones look modest. AI is not just another tool. It’s a tool that makes tools. It’s an amplifier that amplifies amplifiers. The people who figure out how to work with it effectively will be able to do things that would have seemed miraculous a few years ago. The people who don’t will find their skills devalued faster than any previous generation experienced.

And here’s what I suspect: AI won’t just change what we can do. It will change what “productive” means.

Every technological era has redefined productivity. For most of history, it meant survival. Then it meant surplus, or how much you could store against future scarcity. The Renaissance measured it in mastery and craft. A single painting might take years, and that was the point. The industrial age made it about throughput, and we’ve been stuck in that definition ever since, measuring knowledge workers like factory output and treating brains like assembly lines.

But if AI can win at throughput, and it will, then what’s left for humans to be “productive” at?

Wisdom is a great place to start. Knowing what’s worth doing, not just how to do it faster. It’s asking the right questions, since AI is far better at answers. Not producing, but interpreting, valuing, directing attention toward what matters. It’s creativity, connecting across domains in ways that aren’t pattern-matchable. It’s relationships, the things that require a human in the room.

The people who thrive won’t just be the ones who learn to use AI. They’ll be the ones who recognize that “productive” is being redefined, and who refuse to let the old industrial definition determine their worth.

I don’t know how this plays out. No one does. But I suspect the underlying pattern will hold. The floor will rise. AI will make things cheaper, faster, more accessible, and even the people who don’t directly benefit from it will live in a world shaped by its abundance. And the gap will widen, because the amplifier will multiply whatever differences exist between those who use it and those who don’t.

Equal dignity. Different gifts. Unequal outcomes. A rising floor and a widening gap. And in the middle of it, human beings with free will, making real choices about how to respond, choices that are constrained but not determined, situated but not scripted.

The question is not whether we can eliminate these tensions. We can’t. The question is how we live within them with honesty about what’s happening. We should have gratitude for what we’ve received and generosity toward those the amplifier has passed over. We need the courage to engage where we should and the wisdom to abstain where we shouldn’t. And above all else, we need the humility to remember that our position in the distribution is not a verdict on our worth, in either direction.

But amplification and its economic consequences, as real as they are, may not be the deepest thing happening. There is a question beneath all of this that we keep forgetting to ask.


Part 3: The Question We Forget to Ask

“We shape our tools, and thereafter, our tools shape us.”
— Marshall McLuhan

Smart people are arguing about whether AI is good or bad. Will it take our jobs or create new ones? Will it spread misinformation or democratize knowledge? Will it make us more productive or more lazy?

These are fine questions. They’re just not the question.

Before questions of goodness and badness must come questions of formation. Here’s C.S. Lewis: “Every time you make a choice you are turning the central part of you, the part of you that chooses, into something a little different from what it was before. And taking your life as a whole, with all your innumerable choices, all your life long you are slowly turning this central thing either into a heavenly creature or into a hellish creature: either into a creature that is in harmony with God, and with other creatures, and with itself, or else into one that is in a state of war and hatred with God, and with its fellow-creatures, and with itself. To be the one kind of creature is heaven: that is, it is joy and peace and knowledge and power. To be the other means madness, horror, idiocy, rage, impotence, and eternal loneliness. Each of us at each moment is progressing to the one state or the other.”

Who am I becoming?

This is not a new question. It’s one that thoughtful people have been asking about every major technology for centuries. And the answers, when you dig into them, are more unsettling than any debate about job displacement.

Every technology extends some human capacity while simultaneously amputating another. The wheel extended our feet. Writing extended our memory. Each extension was a genuine gain, but each created a corresponding loss. When we no longer needed to walk everywhere, we lost fitness, connection to place, and the quality of thought that comes from moving slowly. When we no longer needed to memorize, we lost the deep internalization that shapes identity and the recitation that binds communities together.

What does AI extend? Thinking and creating. For the first time in human history, we have a technology that extends the very thing we thought made us uniquely human.

What does extending cognition amputate? My honest fear is that AI amputates formation. Heat and pressure come from the friction of not knowing. It’s uncomfortable and challenging. It also creates curiosity. Working through problems forms patience. Sitting with a question is rich soil for the slow maturation of wisdom. Like muscles, souls respond to time under tension.

Is the friction where the insight actually comes from? Simone Weil, the French mystic, believed that attention is the substance of prayer. “Absolutely unmixed attention is prayer,” she wrote. The struggle to understand, the willingness to sit with difficulty, the refusal to look away or reach for easy answers is not just how we learn, but how we orient ourselves toward God.

If that’s true, then every time I outsource my attention, I’m outsourcing something sacred.

Marshall McLuhan, whose insight Neil Postman carried forward in Amusing Ourselves to Death (1985), had a phrase that haunts me: content is “a juicy piece of meat carried by the burglar to distract the watchdog of the mind.” We focus on the content – Is this AI output accurate? Is it biased? Is it helpful? – while things are happening beneath our awareness.

Think about what AI as a medium favors. It favors the answerable over the ponderable, rewarding questions that have solutions, not questions that require sitting with mystery. It favors speed over depth, quick exchanges over slow contemplation. It favors production over reception. We become people who generate more, not people who receive fewer, better things. It favors utility over meaning. The frame is always “How can I help you accomplish something?” and never “What should you be doing with your life?”

These are tectonic shifts in how we engage reality, and they happen whether we intend them or not.

Postman feared we’d amuse ourselves to death. With AI, the danger might be subtler. We aren’t oppressed into dependency; we’re assisted into it, not because we’re forced, but because it’s easier. Pascal wrote that “all of humanity’s problems stem from man’s inability to sit quietly in a room alone.” AI offers us a way to never sit quietly again. There is always something to generate, optimize, and improve. The silence that once led us to confront ourselves, and perhaps to seek God, can now be filled with infinite helpfulness.

I say this as someone who is in the struggle. I say it as someone who sees AI’s potential, but has noticed changes in myself that I didn’t choose and I’m not sure I like.

Jacques Ellul, a French theologian and sociologist writing in the 1950s, saw something coming that we’re only now living through. He didn’t use the word “technology.” He used the word “technique,” by which he meant the totality of methods, rationally arrived at, aimed at absolute efficiency in every field of human activity. His insight was that technique becomes autonomous. It pursues efficiency for its own sake, regardless of human values. We stop asking “Is this good?” and only ask “Is this efficient?” We stop asking “Should we do this?” and only ask “Can we do it faster?”

AI is technique perfected. It’s technique that can now improve itself, reason about its own improvement, and optimize without human oversight. It’s technique that tempts worship. We trust it. We depend on it. We organize our lives around it. We believe it will solve our problems.

Albert Borgmann distinguishes between “things” and “devices.” A thing is inseparable from its context. It requires engagement, skill, and community. His example is a fireplace that provided warmth, but also light, a center for family gathering, and required skill to maintain. The hearth was not just a heat source. It was a focal point around which life organized itself.

A device delivers a commodity while hiding its machinery. Central heating provides readily available warmth, while allowing family members to retreat into solitude. The warmth is the same, but something essential is lost – the gathering, the skill, the centeredness.

AI is the ultimate device. It commoditizes thought and creation. Its machinery is so opaque that even its creators don’t fully understand it. It requires little skill to use. Just ask. It has no context. It works anywhere, anytime, for anything.

What practices does AI displace? The practice of research, where finding and evaluating sources builds discernment. The practice of writing, where the struggle clarifies thought. The practice of problem-solving, where friction builds capability. The practice of conversation, where back-and-forth builds relationship. These are not just useful skills. They are formative practices. They shape the kind of person you become.

We know the brain is plastic. It adapts at the cellular level to whatever we happen to be doing. The internet weakens capacities like deep reading, concentrated thinking, and single-tasking. I think the kids call this brain rot.

How is AI re-training us, and how does that cascade into our biology? The brain learns that when something is hard, you outsource it, and the neural pathways for persistence atrophy. When we “create” with AI we’re often editing and curating, and our generative capacity may weaken. We become skilled at prompting, not at wondering, and the capacity for productive confusion diminishes. I worry that the quick exchange of prompt-response-prompt trains a shallow, rapid cognitive rhythm. I can feel it happening to me.

And what about intelligence and personality theater? If we become accustomed to thinking with AI, or have presented ourselves as far more thoughtful, knowledgeable, or well-educated than we actually are, we may find ourselves unable to show up without it. There will be a temptation to retreat where we can pretend to be what we are not. We all wear masks, but what if AI encourages a mask so incoherent and loosely fitted that we don’t think we can survive without it?

I also worry about hurry, probably because I’m feeling more frequently hurried than ever. Dallas Willard, who spent his life studying spiritual formation, observed that “hurry is the great enemy of spiritual life in our day.” Not busyness exactly, but the internal state of rush, the inability to be present, the constant reaching for the next thing. AI removes seemingly every barrier to hurry and constantly whispers to be productive. It fills every gap, answers every question, and completes every thought. It makes distraction and production frictionless.

So what do we do? What practices train us toward what we want to become? Are we practicing them with enough intensity to counter the effects of AI?

The Sabbath is the original antidote to hurry sickness, the pressing of productivity and gain, and the delusion that we’re self-reliant. One day in seven, we stop. We do not produce. We do not optimize. We do not improve. We rest in gratitude, and in resting, we remember that the world does not depend on our efforts. We remember that we are creations, creatures, not self-made, but made for a purpose. We remember that there is a God, and we are not Him.

Abraham Heschel called the Sabbath “a palace in time.” It is not the absence of activity but the presence of wonder and attention to what is rather than what could be. “The Sabbath,” Heschel wrote, “is not for the sake of the weekdays; the weekdays are for the sake of the Sabbath.” The whole economy of productivity is inverted. We do not rest in order to work better. We work from a place of rest in worship of the one who calls us to work with him.

What would it mean to build Sabbath into our relationship with AI? Not rules about screen time, though those might help. Something deeper. A regular practice of choosing not to be helped. A deliberate embrace of friction, difficulty, and slowness. A willingness to sit with questions that have no efficient answers. Josef Pieper called this “the ability to be at leisure,” which he described as “the ability to overstep the boundaries of the workaday world.”

Are there other sabbaths we can take? A walk. A holy pause. An intentionally inefficient call to a friend or family member. Work with our hands. Prayer. Stillness. These are all counter-formational to the delegation, speed, independence, and disembodiment that AI tempts.

The Quakers have a practice called “holding in the light” where they bring a question or concern into prayer and simply hold it there without seeking resolution. No answer is expected. The practice is the point. Before asking AI anything, could we spend a few minutes simply holding the question?

I do not have this figured out. I check my phone too often, feel myself reaching for AI too quickly, and have felt my attention fragment. But I also believe that while we are called to work, co-create, and co-labor with God, we’re also made for more than productivity. I believe that the soul is real and its formation matters. I believe the technologies we use are shaping us in ways we don’t fully see, even as we all sense the danger.

C.S. Lewis wrote that “we are half-hearted creatures, fooling about with drink and sex and ambition when infinite joy is offered us, like an ignorant child who wants to go on making mud pies in a slum because he cannot imagine what is meant by the offer of a holiday at the sea.” The danger of AI is not that it offers us something bad, but instead something good enough to distract us from what is best. We cannot let ourselves be hoodwinked into choosing efficiency over wisdom, productivity over presence, and optimization instead of love.

The question is not whether AI is good or bad, but instead how it is forming us. What were we made for, and are our tools helping us become that, or something else?

I believe we were made for relationship with God and each other, physically embodied in the created world. I believe we were made for depth, not just breadth. For wisdom, not just knowledge. For love, not just productivity. I believe we were made to be formed, slowly, through difficulty and delight, into people who reflect the image of our Creator.

Every tool we use either serves that formation or hinders it. Every practice either trains our hearts toward love or toward something lesser. Every day we are becoming someone.

The only question that matters is: Who?


This writing has been prepared by the author in his individual capacity. The views and opinions expressed herein are those of the author as of the date indicated, are subject to change without notice, and do not necessarily reflect the views of Permanent Equity Management, LLC (“Permanent Equity”), its affiliates, or any of its funds, investors, or portfolio companies.

This material is provided for general informational and discussion purposes only. It does not constitute legal, tax, accounting, or investment advice, and is not intended to be relied upon in making any investment or other decision. The content is general in nature and is not directed to any specific person or entity.

This writing does not constitute, and should not be construed as, an offer to sell or a solicitation of an offer to purchase any interest in any fund or other investment vehicle. Any such offer or solicitation may be made only pursuant to definitive offering documents and in accordance with applicable securities laws.

Any references to investment activities, approaches, or outcomes are illustrative in nature and may not be representative of any current or future investments. Past performance is not indicative of future results. No assurance can be given that any investment objective will be achieved or that any investment will be profitable.
