Many organizations struggle to make decisions about adopting new technologies. There are two prominent camps. Early adopters jump at any new technology and push to integrate it, often replacing last week's new tech in the process. Then there is the careful crowd that just doesn't want to rock the boat: as long as everything works, they have no interest in switching to something better. Between those two extremes of risk taking and risk aversion are the rest of us, trying to take advantage of new developments, but not at the cost of bringing the whole system down. So, what is a rational process for adopting new technologies?
Several principles will serve you well. Newer technologies, even if demonstrably better than incumbent alternatives, take time to reach maturity. The more complicated and crucial the technology, the longer that takes. For example, a new text-parsing library may be fine to switch to after playing with it a little and verifying it works well. A totally new distributed NoSQL database is a different story, and will probably need several years before you should stake your company's fate on it. The crucial element, then, is timing.
If a new technology has been battle-tested long enough, has an active community, and demonstrates clear benefits over a more mature alternative, you may consider switching to it even for important projects. A good example is the Flask vs. Django debate; at this point, Flask has crossed that critical threshold as far as I'm concerned. When you do decide to adopt a new technology, do it gradually (maybe start with a small, non-critical project) and have a plan B (probably sticking with what you have) in case you discover unforeseen issues.