Fortunately, the answer has arrived: a wave of new technologies, many of which turn traditional assumptions on their head, can enable the pragmatist CIO at asset management firms to be more agile, reduce costs, improve transparency, and compete more effectively.
Asset management challenges.
The pressures and challenges facing asset managers are well documented. Compressed fees, lower-cost alternatives (for example, ETFs and robo-advisors), ever-increasing regulatory reporting and risk management requirements, new and expanding data types, pressure to exceed mandates and goals, and weak client loyalty all mean that asset managers need a better way to handle the complex work of managing investments and attracting and servicing clients.
Asset managers stuck in the traditional way of doing business risk lower returns than innovative competitors, loss of AUM from existing clients, failure to win new clients, worsening margins, or even fines and public embarrassment. Yet few firms have the luxury of starting with a blank sheet of paper, or the budget to rip out and replace existing infrastructure.
Fortunately, the technology landscape has changed drastically with the advent of NoSQL data stores, Apache Hadoop reaching its 1.0 release in 2011, and a rapidly expanding list of new database, integration, visualization and data science technologies.
A handful of visionary firms are only now going into production with these technologies, and, more importantly for the industry, packaged solutions are now available that help pragmatists cross the chasm, complementing existing systems and data sources in a hybrid of traditional and new.
The traditional approach vs the new approach.
The traditional approach assumes that either (a) data has to be standardized into a normalized relational or canonical model before it is useful to the enterprise, or (b) applications remain isolated and business users create their own aggregations in MS Excel. A great deal of effort therefore goes into the ETL (Extract, Transform and Load) process to fit data from one source into the structure needed by a business user, destination application or data warehouse. Once transformed, that data is often rigid and not easily adapted to new uses. Alternatively, as in the case of Excel, the data is fragile and lacks governance and controls.
The new approach is to leave source data in its raw form and combine it as needed for the consumer of that information, whether a downstream application, a data scientist or a business user. For the CIO, the benefit is that data can be left in its silos and quickly blended on demand, or replicated in its raw form into NoSQL databases or the Hadoop Distributed File System (HDFS) to be combined with virtually any other data type, structured or unstructured.
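To make the contrast concrete, the short sketch below shows the "schema-on-read" idea using pandas. It is illustrative only: the file names and column names (portfolio_positions.csv, security_master.json, isin, sector, market_value) are hypothetical stand-ins for whatever raw extracts a firm's source systems already produce.

```python
# Illustrative sketch only: file and column names are hypothetical.
# Schema-on-read: each source stays in its raw form and is blended on demand.
import pandas as pd

# Raw extracts left exactly as delivered by their source systems
positions = pd.read_csv("raw/portfolio_positions.csv")    # structured CSV extract
securities = pd.read_json("raw/security_master.json")     # semi-structured JSON feed

# Blend only when a consumer needs the combined view
blended = positions.merge(securities, on="isin", how="left")

# Different consumers can shape the same raw data in different ways
exposure_by_sector = blended.groupby("sector")["market_value"].sum()
print(exposure_by_sector)
```

The point is the pattern rather than the tooling: nothing upstream had to be remodelled for the blended view to be produced.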
There are many use cases that demonstrate the benefits of this new approach, but a very simple example we are seeing at asset managers is migrating large (1 terabyte and up) structured databases from traditional relational databases onto Hadoop HDFS, with dramatic improvements in query throughput. Furthermore, most new technologies are cloud-based, so infrastructure can be scaled as needed, reducing capital investment, easing the burden on IT, and shortening deployment time and risk.
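As a rough illustration of what querying such a migrated dataset can look like, the sketch below runs Spark SQL over Parquet files stored on HDFS. The path, table name and columns are hypothetical; a real deployment would of course reflect the firm's own data model and cluster.

```python
# Illustrative sketch only: the HDFS path, table and column names are hypothetical.
# Querying a large structured dataset stored as Parquet on HDFS with Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("trade-history-query").getOrCreate()

# Read the replicated raw data directly from HDFS; Spark parallelises the scan
trades = spark.read.parquet("hdfs:///data/raw/trades/")
trades.createOrReplaceTempView("trades")

# Ad hoc analytical query of a kind that can strain a single relational server
daily_volume = spark.sql("""
    SELECT trade_date, instrument_id, SUM(quantity) AS total_qty
    FROM trades
    GROUP BY trade_date, instrument_id
""")
daily_volume.show(10)
```

Because the work is spread across the nodes of the cluster, throughput grows with the cluster rather than with a single database server.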
Does the new approach remove the need for data quality and data governance?
No, not at all. It is equally important that data quality and provenance are clearly identified, both to avoid a "big data dumping ground" and to ensure that asset management firms can meet transparency requirements from regulators, such as BCBS 239.
For example, data governance services such as Data3Sixty can read metadata across the enterprise, capture source and destination terms, and then provide glossaries covering primary and secondary uses of the data to support clarity and consistent use across the organization.
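As a simplified illustration of the kind of glossary entry such a service maintains (this is not the Data3Sixty API, just a generic sketch with made-up field and system names):

```python
# Illustrative sketch only: field names, systems and uses are hypothetical,
# showing the shape of a business-glossary entry, not any vendor's schema.
from dataclasses import dataclass, field

@dataclass
class GlossaryEntry:
    business_term: str                                   # term exposed to business users
    definition: str                                      # single agreed definition
    source_system: str                                   # where the raw field originates
    source_field: str                                    # technical name in the source
    approved_uses: list = field(default_factory=list)    # primary and secondary uses

entry = GlossaryEntry(
    business_term="Net Asset Value",
    definition="Total assets minus total liabilities per share, end of day.",
    source_system="FundAccounting",
    source_field="nav_eod",
    approved_uses=["client reporting", "performance attribution"],
)
print(entry)
```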
What are other examples of leading edge solutions using Cloud and Big Data technologies?
The financial services Hadoop solution from TickSmith supports full time-series, multi-asset-class and entitlement support out of the box for quick deployment, whether cloud-based or on premise, and can scale from terabytes to hundreds of petabytes of data. Initial use cases are for trading shops and exchanges, including complete trade histories with depth of book and concurrent unstructured event processing. Pending use cases include event-timed investment research and comprehensive Transaction Cost Analysis (TCA), among others.
Text analytics is one way to benefit from the constantly expanding universe of news and web-sourced data. By ranking and scoring news items, the resulting sentiment analysis can be used to alert the trading desk while also supporting back-testing of trade signals for algorithms and quantitative managers. RavenPack is one such provider, using NLP (natural language processing) and proprietary technology across a vast array of sources to deliver historical and up-to-the-second sentiment alerts.
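As a deliberately simplified illustration of the scoring-and-alerting idea (not RavenPack's models, which rely on far richer NLP; the word lists and threshold below are hypothetical):

```python
# Illustrative sketch only: a toy keyword-based sentiment scorer, not a
# production NLP pipeline; word lists and the alert threshold are hypothetical.
POSITIVE = {"beat", "upgrade", "record", "growth"}
NEGATIVE = {"miss", "downgrade", "lawsuit", "default"}

def sentiment_score(headline: str) -> int:
    """Score a headline: each positive word adds 1, each negative word subtracts 1."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def alert_trading_desk(headline: str, threshold: int = 1) -> None:
    """Raise an alert when the absolute sentiment score crosses the threshold."""
    score = sentiment_score(headline)
    if abs(score) >= threshold:
        print(f"ALERT ({score:+d}): {headline}")

alert_trading_desk("Issuer X hit by lawsuit after earnings miss")
```

The same scores, stored historically, are what make back-testing of sentiment-driven trade signals possible.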
A growing area, even at traditional active management firms, is new product offerings that include smart beta and liquid-alternative funds and ETFs. Alpha Vee is a cloud-based global equity research platform that can automate the creation of new dynamic strategies, from research through fast back-testing to ongoing portfolio management. The platform is used by asset managers and leading ETF providers to deliver low-cost alpha.
What other possibilities are available to pioneering CIOs?
Please contact us to discuss your firm's situation, from strategic planning through to specific projects and solutions that can accelerate your move to the asset management data infrastructure of the future.