Knowledge of coding (or, as some call it, coding literacy) can also help in other ways: it can make you quicker to pick up other aspects of technology and more digitally fluent overall. In today's increasingly digital job market, that can only be beneficial. There is nothing wrong with agility or simplification, but learning to code (especially on your own) requires dedication and time.
Over the past two and a half years at Toptal I learned more about the industry and had the opportunity to attend conferences like TechCrunch Disrupt, where I could see the scope of opportunity and innovation. In other words, learning machine learning - and computer science in general, which is of course quite a bit broader than programming or coding - might make more sense as a long-term bet than spending non-renewable time learning how to make a web app (unless you're willing to compete with millions of developers overseas and risk being automated a decade from now). Four out of five of the largest US companies (by market capitalisation) are software-based tech companies, and for the last ten years or so it has seemed that studying computer science, or at least "learning to code", was the El Dorado path to becoming "the next Mark Zuckerberg". If you regularly pay attention to the cultural goings-on in Silicon Valley, you will no doubt have heard of the "Learn to Code" movement.
You should also get used to the idea that at any time you may need to learn a new framework or language, and that you'll have to fight for a job if you don't have formal credentials. To recap, most of the barriers that once blocked me, and may block you, from learning are psychological. You would never compare yourself to a native Spanish speaker within weeks of starting the language, nor consider yourself less capable for falling short of them. Developers, though, are expected to learn fast, with little guidance and little more incentive than the faint rattle of the pink-slip guillotine.
The line between learning to code and getting paid to code as a profession is not an easy one to cross. I started learning to code at 26 (I'm 33 now), well after finishing university and while holding down a full-time job. If you haven't started learning to code yet, perhaps it's because you've labelled yourself as someone who is "not a developer". Who knows.
It already seems that one day building machine learning algorithms will itself be automated, with AIs building AIs. Since I tracked my effort for the better part of a year, I can say I spent roughly 300 hours learning to code over the last 10 months. So, starting in February, I set myself a year of learning to code from scratch, with a focus on building web applications. It's like learning to paint: anyone can pick up a brush in an hour, but no one hangs that shit on their walls.