Artificial Intelligence (AI) has become an essential tool for many small companies, especially in areas like coding and product design. Recently, I approved the renewal of my team’s AI tokens, which power our development workflow. AI has so thoroughly transformed how we approach tasks that it is now difficult to imagine working without it.
In the past, turning a product idea into something tangible could take one to two weeks. Now, with the help of “vibe coding,” where AI assists in writing code based on natural language instructions, this process can be completed in just an afternoon. This shift highlights the rapid evolution of AI and its growing impact on various industries.
Within a few years of the arrival of generative tools, AI has brought enormous convenience to people across different fields. From writing to programming, individuals can now produce work faster, more smoothly, and at a higher baseline quality. However, this advancement raises a recurring question: Is AI killing creativity?
At first glance, the question seems reasonable, but it points at the wrong target. No machine is stealing human imagination. If anything is being eroded, it does not come from hostile technology; rather, it stems from how people embrace convenience. AI does not kill creativity by thinking for us. Instead, it does something subtler. It takes people straight to a near-best version, removing the evolution of thought—the process where ideas take shape, get tested, go off track, and get rebuilt.
A programmer is coding with the help of AI. Photo by Pexels
This phenomenon is evident in writing. AI can produce fluent, grammatical, and logically sound paragraphs. Writers now start from a higher baseline than ever before. However, when they no longer struggle with sentences, something else fades: a sense of the weight of words. In the past, each draft and revision helped writers refine precision and push an idea to the sharpness they wanted. When that process disappears, the skill weakens. They can still produce good text, but they lose the ability to state exactly what they mean, and even the ability to ask AI to do so. As expressive ability declines, the starting point of every request becomes vague.
The rise of vibe coding shows a similar pattern in software development. After the early excitement about speed, engineers face a flood of “best” versions from AI. Each one works, and each looks optimal within the given request. However, the limit does not lie in AI’s ability to solve problems. It lies in the request itself. When engineers no longer build logic from the ground up, a process that once forced them to understand systems at a basic level, their ability to define problems precisely begins to weaken.
AI can expand the space of answers, but not the space of questions. An engineer can generate many “correct” versions, yet all are bounded by the quality of the prompt, which is bounded by the engineer’s own ability. Without training through intermediate steps, that boundary does not move. It only repeats faster, in more forms. When systems face new conditions, the ability to diagnose and rebuild also declines.
People often measure creativity by the final product, but this misses its nature as a process. An idea rarely appears fully formed. It begins as fragments, gets questioned, and is revised again and again, sometimes getting worse before it improves. That cycle of error and correction is where thinking is reorganized.
AI short-circuits that cycle. It presents an optimized version almost instantly: coherent, logical, and aligned with past data. There are no steps backward, no in-between states, no initial instability. The result is better and faster, but the internal movement of thought disappears.
A similar pattern appears in agriculture when farmers move away from manual tools. Machines raise productivity, standardize processes, and stabilize results. But another kind of knowledge fades: the repeated, daily contact with the soil. Without working the soil each day, farmers lose sensitivity to small changes: shifts in moisture, soil structure, and unusual crop behavior. They can still farm effectively as long as conditions remain stable. When the environment changes, the issue is not the tools. The issue is that they no longer recognize that the soil has changed.
In every case, the problem lies in the mechanism, not the tool. When a system automates well enough to replace intermediate steps, it does not just optimize outcomes but also cuts off the process through which people continuously adjust their understanding based on reality.
Creativity is not just about finding the best answer within a fixed system; it is the ability to see when the system itself no longer holds. Every optimized system assumes stable conditions. When that assumption breaks, what becomes outdated is not the result, but how people understand the problem.
AI helps people find answers faster; it does not help them see when the question itself needs to change. Overusing AI can create the feeling of reaching a peak, but it is not the peak of individual ability; it is merely the peak of averaged data. Every optimized baseline is also a limit.
Creativity is not about being placed at the highest point. It is the process of climbing, slipping, falling, and climbing again. When that motion is replaced by an optimized launchpad, people may reach better results in the short term at the cost of their ability to push limits in the long term.
The risk is not that AI produces good text. The risk is that users get used to starting from an optimal point. When the starting point is always high, the ability to climb weakens. When everything arrives nearly finished, the muscles of revision atrophy.
When people stop training their ability to express, they lose control not only over the product, but over the question itself.
The answer is not to turn away from AI; it is to deliberately preserve the parts of the process that technology removes. It is not to compete with machines, but to retain the ability to sense, express, and detect deviations.
Keep writing and revising instead of accepting ready-made output. Keep building and understanding systems instead of only steering them through prompts. And, like in farming, do not lose direct contact with the “soil” of your own field.
