Marc Alvarez is the chief data officer of Mizuho Securities USA.
What changes do you expect to see with regard to financial technology, analytics, and machine learning in 2017?
We’re looking at an interesting year of new arrivals in the space. Financial firms posted significantly improved results in 2016, and I think that’s going to provide a solid base for increased investment in these areas across banking and capital markets.
Expect to see blockchain and distributed ledgers move from experiment and evaluation into actual business applications in 2017. There’s always an adoption curve that needs to be traversed, but I think blockchain is going to start to see the light of day this year as it proves itself not only as a viable approach to solving legacy technology issues but also as a way to open the door to new peer-to-peer business models. Adoption and deployment of vendor platforms will likely remain at a modest pace, but I think internal success stories are going to drive the demand for solid solutions at an accelerated pace. And hopefully, we’ve heard the last of the techno-hype about it all and can now look at bona fide use cases and success stories.
Machine learning has arrived and will continue to be the technology you never hear about. As firms become more and more adept at using machine learning to improve customer experience and better understand the market dynamics they face, it will start to become part of the infrastructure. Expect CRM and other platform vendors to start building it into their products so customers can not only perform specific functions but also analyze them statistically. That’s going to drive more and more profiling and research, so it will feed into business strategy – at which point it’s going to become very hush-hush. Ironically, though, it will become part of standard operating processes.
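As a rough illustration of the kind of embedded machine learning described here, the following is a minimal sketch, assuming hypothetical customer-activity features (trade count, average order size, product breadth), that groups clients into behavioral segments with a simple clustering model:

```python
# Illustrative sketch only: hypothetical per-customer activity features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: trade count, average order size, number of products used
features = np.array([
    [120,  5_000, 3],
    [ 15,    800, 1],
    [300, 22_000, 7],
    [ 40,  1_500, 2],
    [ 10,    500, 1],
    [250, 18_000, 6],
])

# Scale features so no single column dominates the distance metric
scaled = StandardScaler().fit_transform(features)

# Assign each customer to one of two behavioral segments
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(segments)  # one cluster label per customer
```

In a vendor platform this kind of segmentation would simply run behind the scenes, feeding profiles back into the workflow rather than being exposed as a separate tool.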
Analytics of all types will be the big news story of 2017. The industry it putting up a lot of new applications and service models (think cloud computing) that change the landscape. Previously sophisticated quantitative analytics are now accessible as services over the internet with increasingly intuitive and easy to use interfaces. That means you’re going to see a lot more statistical analysis creeping into daily business operations – this is the new normal. We live in an increasingly quantitative world, so the demand for front-of-the curve analytic capabilities is going to accelerate – expect to see new firms offering an increasingly broad array of analytics capabilities online.
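To make that concrete, here is a minimal sketch of the sort of routine quantitative analysis that is working its way into daily operations, assuming a short hypothetical series of daily closing prices:

```python
# A small, self-contained example: annualized rolling volatility from
# hypothetical daily closing prices.
import numpy as np
import pandas as pd

prices = pd.Series(
    [100.0, 101.2, 100.7, 102.3, 101.9, 103.4, 102.8, 104.1, 103.6, 105.0],
    index=pd.bdate_range("2017-01-02", periods=10),
)

daily_returns = prices.pct_change().dropna()

# 5-day rolling standard deviation, annualized with ~252 trading days
rolling_vol = daily_returns.rolling(window=5).std() * np.sqrt(252)
print(rolling_vol.dropna())
```

The point is less the calculation itself than how little effort it now takes; the same analysis is increasingly available as a hosted service behind an intuitive interface.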
Coupled to that, you are going to hear a lot more from firms that specialize in sourcing and preparing data content for processing on these new platforms – the act of assembling and validating data content for the purposes of statistical analysis remains a very difficult job. However, the demand for statistically sound analytics and modeling is going to drive growth in this area.
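As a hedged illustration of why that assembly-and-validation work is hard, the following sketch assumes a hypothetical prices.csv file with date, ticker, and close columns and screens the content before it reaches any model:

```python
# Sketch of basic content validation prior to statistical analysis,
# assuming a hypothetical prices.csv with columns: date, ticker, close.
import pandas as pd

df = pd.read_csv("prices.csv", parse_dates=["date"])

# Drop exact duplicate records (e.g., from overlapping vendor feeds)
df = df.drop_duplicates(subset=["date", "ticker"])

# Flag missing or non-positive prices rather than silently modeling over them
bad_rows = df[df["close"].isna() | (df["close"] <= 0)]
if not bad_rows.empty:
    print(f"Excluding {len(bad_rows)} suspect rows")
df = df.drop(bad_rows.index)

# Check each ticker for gaps in its business-day history before fitting models
for ticker, grp in df.groupby("ticker"):
    expected = pd.bdate_range(grp["date"].min(), grp["date"].max())
    missing = expected.difference(pd.DatetimeIndex(grp["date"]))
    if len(missing) > 0:
        print(f"{ticker}: {len(missing)} missing trading days")
```

Every one of those checks is mundane on its own; doing them consistently across many sources and feeds is the work the specialist firms are selling.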
Look for a lot of new firms offering variants and combinations of the above in 2017. The business models aren’t yet keeping pace with the technology, but I think the new normal is to look for these capabilities as externally supplied services in order to benefit from faster time to market.