What I don't think people are talking about enough yet is that AI doesn't invent new ways of doing things -- it just predicts the next word based on the material it was trained on. That means that if a company lets AI do all of its coding, that company will be permanently stuck in, say, 2026, while other companies keep improving.
I don't think it is that simple. Innovation happens on several levels, from the lowest next-token level up to the higher level of combining things in new ways. Surely LLMs can produce code that, as a whole, does something completely new, even if at the syntactic level every piece has been seen before?
We only have a couple dozen letters, yet it is still possible to write new poetry.