
The Pragmatist Asset Management CIO

10/30/2015

Global investment, wealth and hedge fund firms (together, asset managers) are under pressure. How can a Chief Information Officer at an asset management firm get to an agile-business future state quickly while wrestling with today’s challenges?

Fortunately, the answer is here now with a number of new technologies – many of which turn traditional assumptions on their head – that can enable the pragmatist CIO at an asset management firm to be more agile, reduce costs, improve transparency, and compete more effectively.

Asset management challenges.

The pressures and challenges faced by asset managers are well documented. Compressed fees, lower-cost alternatives (for example, ETFs and robo-advisors), constantly increasing regulatory reporting and risk management requirements, new and expanding data types, pressure to exceed mandates and goals, and a lack of client loyalty all mean that asset managers need a better way to manage the complex work of investing, and of attracting and servicing their clients.

Asset managers stuck in the traditional way of doing business risk lower returns than innovative competitors, losing AUM from existing clients, failing to win new clients, worsening margins, or even fines and public embarrassment. Yet few firms have the luxury of starting with a blank sheet of paper, or the budget to rip out and replace existing infrastructure.

Fortunately, the technology landscape has changed drastically since the advent of NoSQL data stores, Apache Hadoop's 1.0 release in 2011, and a rapidly expanding list of new database, integration, visualization and data science technologies.

A handful of visionary firms are just now going into production with these new technologies. More importantly for the industry, new packaged solutions are now available to help the pragmatist cross the chasm, complementing existing systems and sources in a hybrid combination of traditional and new.

The traditional approach vs. the new approach.

The traditional approach assumes that either a) data must be standardized into a normalized relational or canonical model before it is useful to the enterprise, or b) applications remain isolated and business users create their own aggregations in MS Excel. A lot of effort therefore goes into the ETL (Extract, Transform and Load) process to fit data from one source into the structure needed by a business user, destination application or data warehouse. Often that data is then rigid and not easily adapted to new uses. Alternatively, as in the case of Excel, the data is fragile and lacks governance and controls.

The new approach is to leave source data in its raw form and combine it as needed for the consumer of that information, whether a downstream application, a data scientist or a business user. For the CIO, the benefit is that data can be left in its silos and quickly blended on demand, or replicated in raw form into NoSQL databases or the Hadoop Distributed File System (HDFS) to be combined with virtually any other data type, whether structured or unstructured.
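To make this concrete, below is a minimal sketch in Python of blending two silos on demand: positions left in place in a relational database and vendor prices delivered as a raw file, joined only at the point of consumption. The file, table, and column names are hypothetical.

```python
# A minimal "blend on demand" sketch: two silos are read in their raw
# form and combined only when a consumer needs them. All names here
# (portfolio.db, vendor_prices.csv, column names) are hypothetical.
import sqlite3
import pandas as pd

# Silo 1: positions kept in a relational database (left in place).
conn = sqlite3.connect("portfolio.db")
positions = pd.read_sql("SELECT account_id, ticker, quantity FROM positions", conn)

# Silo 2: vendor prices delivered as a raw CSV file (no upfront ETL).
prices = pd.read_csv("vendor_prices.csv")  # columns: ticker, close_price

# Blend at the point of consumption -- here, a market-value view.
blended = positions.merge(prices, on="ticker", how="left")
blended["market_value"] = blended["quantity"] * blended["close_price"]
print(blended.head())
```

No schema changes are needed in either silo; a new use of the same raw data is simply another blend.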

There are many use cases for this new approach, but a very simple example that we are seeing at asset managers is migrating large (1 Terabyte+) structured databases from traditional relational databases onto HDFS and seeing dramatic improvements in query throughput. Furthermore, most new technologies are cloud-based, where the infrastructure can be scaled as needed to reduce capital investment, lighten the burden on IT, and shorten deployment time while lowering risk.
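As a hedged sketch of one such migration, the Apache Spark job below reads a large table over JDBC in parallel, lands it on HDFS as Parquet, and queries the distributed copy. The connection URL, table, and column names are hypothetical.

```python
# Sketch: copy a large relational table onto HDFS and query it there.
# The PostgreSQL URL, table, and partitioning bounds are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdbms-to-hdfs").getOrCreate()

# Parallel read: Spark splits the table into 64 ranges of trade_id.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://dbhost:5432/trades")
      .option("dbtable", "public.executions")
      .option("partitionColumn", "trade_id")
      .option("lowerBound", "1")
      .option("upperBound", "100000000")
      .option("numPartitions", "64")
      .load())

# Land the raw data on HDFS in a columnar format suited to large scans.
df.write.mode("overwrite").parquet("hdfs:///data/raw/executions")

# Analytical queries now run against the distributed copy, not the RDBMS.
spark.read.parquet("hdfs:///data/raw/executions") \
     .groupBy("ticker").count().show()
```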

Does the new approach remove the need for data quality and data governance?

No, not at all. It is equally important that data quality and provenance are clearly identified, both to avoid a “big data dumping ground” and to ensure that asset management firms can meet transparency requirements from regulators, such as BCBS 239.
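As a simple illustration, quality rules can be checked automatically, and provenance recorded, before raw data is accepted into the lake. The pandas sketch below uses hypothetical field names and is not tied to any particular governance product.

```python
# Minimal data-quality gate: run a few rules against an incoming raw
# file and record provenance alongside the results. Field names and
# rules are hypothetical.
import pandas as pd

df = pd.read_csv("vendor_prices.csv")  # columns: ticker, close_price, price_date

checks = {
    "no_missing_ticker": bool(df["ticker"].notna().all()),
    "prices_positive": bool((df["close_price"] > 0).all()),
    "dates_parseable": bool(pd.to_datetime(df["price_date"], errors="coerce").notna().all()),
}

# Keep the source and results together so downstream users can see
# where the data came from and whether it passed its checks.
report = {"source": "vendor_prices.csv", "checks": checks}
print(report)

if not all(checks.values()):
    raise ValueError(f"Data-quality checks failed: {checks}")
```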

For example, data governance services such as Data3Sixty can read metadata across the enterprise, capturing source and destination terms, and then provide glossaries covering the primary and secondary uses of the data to support clarity and consistent use across the organization.

What are other examples of leading edge solutions using Cloud and Big Data technologies?

The financial services Hadoop solution from TickSmith supports full time-series, multi-asset-class, and entitlement capabilities out of the box for quick deployments, whether cloud-based or on premises, and can scale from Terabytes to hundreds of Petabytes of data. Initial use cases are for trading shops and exchanges, including complete trade histories with depth of book and concurrent unstructured event processing. Pending use cases include event-timed investment research and comprehensive Transaction Cost Analysis (TCA), among others.

Text analytics are one way to benefit from the constantly expanding universe of news and web-sourced data. By ranking and scoring news items, the resulting sentiment analysis can be used to alert the trading desk, while also supporting back-testing of trade signals for algorithms and quantitative managers. RavenPack is one such provider, using NLP (natural language processing) and proprietary technology across a vast array of sources to deliver historical and up-to-the-second sentiment alerts.
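RavenPack's models are proprietary, but the basic pattern, scoring each news item and alerting when the score crosses a threshold, can be sketched with the open-source VADER scorer from NLTK. The headlines and alert threshold below are purely illustrative.

```python
# Generic sentiment-alert sketch using NLTK's VADER scorer; this is an
# illustration of the pattern, not RavenPack's proprietary technology.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
scorer = SentimentIntensityAnalyzer()

headlines = [  # hypothetical news items
    "Acme Corp beats earnings estimates, raises full-year guidance",
    "Regulator opens probe into Acme Corp accounting practices",
]

ALERT_THRESHOLD = 0.5  # absolute compound score that triggers an alert

for text in headlines:
    compound = scorer.polarity_scores(text)["compound"]  # -1.0 .. +1.0
    if abs(compound) >= ALERT_THRESHOLD:
        direction = "positive" if compound > 0 else "negative"
        print(f"ALERT ({direction}, {compound:+.2f}): {text}")
```

The same scores, stored historically, become the inputs for back-testing trade signals.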

A growing area, even at traditional active management firms, is new product offerings that include smart beta and liquid alternative funds and ETFs. Alpha Vee is a cloud-based global equity research platform that can automate the creation of new dynamic strategies, from research through fast back-testing to ongoing portfolio management. The platform is used by asset managers and leading ETF providers to deliver low-cost alpha.
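To show the research-to-back-test loop in miniature, here is a toy sketch of a rules-based strategy that holds the lowest-volatility names each month. It runs on randomly generated prices, is illustrative only, and is unrelated to Alpha Vee's actual platform.

```python
# Toy "smart beta" back-test: each month, equal-weight the two names
# with the lowest trailing 60-day volatility. All data is simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
dates = pd.bdate_range("2014-01-01", "2015-10-30")
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE"]

# Simulated daily returns for a five-stock universe.
returns = pd.DataFrame(rng.normal(0.0003, 0.01, (len(dates), len(tickers))),
                       index=dates, columns=tickers)

# Month-end snapshots of trailing volatility, and monthly total returns.
vol = returns.rolling(60).std().resample("M").last().dropna()
monthly = returns.resample("M").apply(lambda r: (1 + r).prod() - 1)

# Hold each month's two lowest-volatility picks through the next month.
strategy = []
for month in vol.index[:-1]:
    picks = vol.loc[month].nsmallest(2).index
    nxt = monthly.index[monthly.index.get_loc(month) + 1]
    strategy.append(monthly.loc[nxt, picks].mean())

print(f"Annualized return: {(1 + pd.Series(strategy).mean()) ** 12 - 1:.2%}")
```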

What other possibilities are available to pioneering CIOs?

Please contact us to discuss your firm’s situation and potential strategic planning through to specific projects and solutions that can help accelerate your move to the asset management data infrastructure of the future.

Data Owners and the New Data Realities in Financial Services

10/7/2015

Data owners -- Chief Data Officers, Chief Information Officers, Chief Operations Officers, Chief Marketing Officers or anyone ultimately responsible for “data” in an organization -- have many demands and challenges, yet also have opportunities and options that didn’t exist just a few years ago.

Collectively, this post will refer to all of these data owners as CDOs, recognizing that this is a growing field and the title is not present in all organizations. The CDO role itself is designed to bridge a common gap between technology automation and the business’s ownership of data, and CDOs often differ widely in focus based on the data maturity, budgets and resources of their businesses.

First and foremost, CDOs are responsible for the quality of the data that the business – and its clients – rely upon. This is a hefty task in an industry where complete accuracy is paramount and the regulatory oversight burden grows ever stricter.

In some organizations with limited budgets, CDOs that we encounter are just trying to get the basics right. They are focused on identifying sources and uses of information, creating roles and responsibilities, and documenting procedures to provide a basic level of governance in their organizations. In one case the IT impact was as simple as locking down the hard drive housing the Excel spreadsheets that are used for corporate reporting.

As highlighted in PanoVista.co’s report here, most firms have established, or are in the process of establishing, more mature levels of structured data cleansing and distribution. This may include centralizing pricing and security reference data, ensuring CRM (Client Relationship Management) databases have clean information, warehousing core operational data, or ensuring the marketing → sales → client onboarding → client support process is seamless. These firms are typically able to support traditional BI (Business Intelligence) tools, or increasingly BYOBI (Bring Your Own BI) tools like Tableau or Qlik.

Firms that already have mature structured data infrastructures are now focused on governance tools that enhance data transparency and workflow among the various data owners in an organization. For example, our partner Data3Sixty is a cloud-based data governance service that consolidates metadata across diverse applications, builds glossaries from the sources to establish primary enterprise definitions, assesses ongoing data quality metrics, and creates a social network of interested parties to monitor and correct data anomalies quickly.

It will be interesting to see what impact reference data services such as the recently announced SmartStream SPReD, and emerging services such as Bloomberg PolarLake, Markit EDM Direct, RIMES Managed Data Services and OpenFinance, will have on existing processes at many of these firms.

So far we’ve highlighted mostly the governance/compliance side of the CDO role. For many firms, this role is also responsible for innovation.

As noted, many firms have centralized components of their data with reference data tools and perhaps data warehouses. This is appropriate for data that is expensive to acquire and/or used by multiple systems, processes and reporting. Examples include prices, security reference, client data, positions and transactions, fundamental research data and performance returns.

However, many firms have multiple business lines where it does not make sense to centralize every data silo or warehouse all of the data, yet that wealth of information can add significant insight into the health, risks and innovation opportunities of the business.

Using a federated or hybrid (centralized plus federated) model, we see firms increasingly either leaving the data in place or copying it into Hadoop or NoSQL databases as-is. With either approach – as dictated by the size, shape, and timeliness requirements of the data – it is blended when needed for reporting, advanced business intelligence, or for data scientists to generate predictive and prescriptive analytics.

As a file system, Hadoop can store unstructured data that doesn’t fit into a relational model, such as internal research documents, emails and videos, next to web and third-party sourced data. Thought-leading firms are now incorporating web-sourced text analytics and social predictive analytics into the investment research and trading process to identify alpha-generating signals.

Additionally, we are seeing firms enhance research database performance using Hadoop HDFS (Hadoop Distributed File System), in-memory NoSQL (Not Only SQL) databases and MPP (massively parallel processing) on commodity hardware, drastically improving query times compared to large RDBMS (Relational Database Management System) deployments.

New applications are increasingly running in the cloud. For example, the previously mentioned Data3Sixty runs on Azure. Our partner Alpha Vee runs on AWS (Amazon Web Services), where its service accelerates the investment product creation process with lightning-fast global equity research, modeling, portfolio construction and management services. ETF providers in particular are drawn to the ability to create differentiated smart beta and liquid alternative products quickly and cost-effectively.

For retail-oriented firms such as mutual funds, ETFs, wealth management, insurance, and brokerage, the CDO needs to support the CMO team’s need to combine internal and external demographic information for segmentation, SEO and SMO (Search Engine Optimization, Social Media Optimization), with the latter requiring real-time social feeds into predictive analytics to micro-segment prospects and the messages they receive. Our partner KeyInsite and its Data Scientist On-Demand team are seeing increasing interest in the SMO space, in addition to web-sourced investment research and sentiment analysis.
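As a simple illustration of micro-segmentation, the sketch below clusters prospects on a few blended internal and external features with k-means. The features, field meanings, and segment count are hypothetical; real programs would use far richer demographic and social-feed signals.

```python
# Hypothetical micro-segmentation sketch: cluster prospects into
# segments with k-means over blended demographic and social features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Each row is a prospect: [age, investable_assets, social_engagement].
prospects = np.column_stack([
    rng.integers(25, 75, 500),    # age
    rng.lognormal(11, 1, 500),    # investable assets ($)
    rng.random(500),              # engagement score from social feeds
])

# Standardize so no single feature dominates the distance metric.
X = StandardScaler().fit_transform(prospects)

# Assign each prospect to one of five micro-segments.
segments = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(segments))  # prospect count per segment
```

Each segment can then receive its own message and channel mix.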

With most new applications available as cloud services, the set-up, deployment and ongoing total cost of ownership are a fraction of what they were just a few years ago using traditional technologies, and maturing cloud information security models can meet or exceed what many internal systems currently provide.

Ultimately, each business has its own journey and appetite for building a mature data infrastructure that can support governance, compliance and innovation. Fortunately, there are new resourcing models, open source tools, and deployment options that can help CDOs add real strategic value. 
