
Notes From TSAM Boston 2015

11/24/2015

It was thoroughly enjoyable chairing the Performance Measurement and Risk track at the TSAM Boston event last week, and we heard many positive comments about the event and track from the attendees. Here are a few observations from the day.

We started the Performance track by discussing the recent Gartner report on the new Business Intelligence platforms and the potential for performance professionals to add additional value to the business.

Data quality and exception management continue to be critically important to performance professionals. Panelists shared their approaches, and many of the panelists and attendees also commented on the reality of having multiple performance systems to support portfolio managers' preferences, and the challenges of managing and ensuring accuracy (and explaining pricing differences) on those platforms.

Alternative Investments are a growing part of almost every portfolio. One speaker noted that alternatives don't follow a normal distribution curve, so measures such as standard deviation and mean variance are less meaningful than kurtosis and other higher-moment measures. The speaker also commented that manager selection and vintage year are more important than asset allocation.

Investment risk management is a dynamic process and, as discussed with some very recent examples, can have a devastating impact on a firm if not followed. As with instilling a data culture, it's critical for firms to instill a risk culture with the "tone from the top" to provide the right incentives to protect the business rather than chase a few basis points of excess return.

Please contact us to discuss these topics and how we can help you move up the business analytics value curve.

News - PanoVista.co Chairing Performance Track at TSAM Boston November 19, 2015

11/9/2015

We are pleased to announce that Bob Leaper has been invited back to Chair the Performance Measurement, Attribution and Investment Risk track at TSAM Boston on November 19, 2015. The goal of this track is for performance measurement professionals to deepen their network of peers in investment management. It is a truly neutral, unbiased platform, where the buy-side's leading practitioners determine the agenda and deliver the program. TSAM itself is the leading event for developing efficient investment management companies, with additional tracks covering Data Management, Technology and Operations, Client Reporting and Communications, and Marketing & Sales communications. Find out more at http://www.tsam.net/boston.

The Pragmatist Asset Management CIO

10/30/2015

Global investment, wealth and hedge fund firms (together, asset managers) are under pressure. How can a Chief Information Officer at an asset management firm get to an agile-business future state quickly while wrestling with today’s challenges?

Fortunately, the answer is here now with a number of new technologies – many of which turn traditional assumptions on their head – that can enable the pragmatist CIO at asset management firms to be more agile, reduce costs, improve transparency, and compete more effectively.

Asset management challenges.

The pressures and challenges faced by asset managers are well documented. Compressed fees, lower-cost alternatives (for example, ETFs and robo-advisors), constantly increasing regulatory reporting and risk management requirements, new and expanding data types, pressure to exceed mandates and goals, and a lack of client loyalty mean that asset managers need a better way to manage the complex processes of managing investments and attracting and servicing their clients.

Asset managers that are stuck in the traditional way of doing business risk lower returns than innovative competitors, losing AUM from existing clients, failing to win new clients, worsening margins, or even fines and public embarrassment. Yet few firms have the luxury of starting with a blank sheet of paper or the budget to rip out and replace existing infrastructure.

Fortunately, the technology landscape has changed drastically since the advent of NoSQL data stores, Apache Hadoop's 1.0 release in 2011, and a rapidly expanding list of new database, integration, visualization and data science technologies.

A handful of visionary firms are just now going into production with these new technologies and, more importantly for the industry, new packaged solutions are now available that help pragmatists cross the chasm by complementing existing systems and sources in a hybrid combination of traditional and new.

The traditional approach vs the new approach.

The traditional approach holds that either a) data must be standardized into a normalized relational model or canonical model in order to be useful to the enterprise, or b) applications remain isolated and business users create their own aggregations in MS Excel. A lot of effort therefore goes into the ETL (Extract, Transform and Load) process to fit data from one source into the structure needed by a business user, destination application or data warehouse. That data is then often rigid and not easily adapted to new uses. Alternatively, as in the case of Excel, data is too fragile and lacking in governance and controls.

The new approach is to leave source data in its raw form and combine data as needed for the consumer of that information, whether downstream applications, data scientists or business users. For the CIO, the benefit is that data can be left in its silos and quickly blended on demand, or replicated in raw form into NoSQL databases or the Hadoop Distributed File System (HDFS) to be combined with virtually any other data type, structured or unstructured.

There are many use cases demonstrating the benefits of this new approach, but a very simple example that we are seeing at asset managers is migrating large (1 Terabyte+) structured databases from traditional relational databases onto Hadoop HDFS and achieving dramatic improvements in query throughput. Furthermore, most new technologies are cloud-based, where the infrastructure can be scaled as needed to reduce capital investments, reduce the burden on IT, and shorten deployment time and risk.
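
To make that example concrete, below is a minimal sketch of such a migration using PySpark, assuming a JDBC-accessible source database and a Spark-on-Hadoop cluster. The connection details, table name and column names are all hypothetical:

# Minimal sketch: copy a large relational table onto HDFS as Parquet,
# then query the distributed copy with Spark SQL instead of the RDBMS.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdbms-to-hdfs").getOrCreate()

# Read the source table over JDBC (the JDBC driver jar must be on the classpath).
trades = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://dbhost:5432/research")
          .option("dbtable", "trades")
          .option("user", "readonly")
          .option("password", "secret")
          .load())

# Write once to HDFS in a columnar format; partitioning by trade date
# keeps later date-range queries fast.
trades.write.partitionBy("trade_date").parquet("hdfs:///data/trades")

# Subsequent queries run against the distributed Parquet copy.
spark.read.parquet("hdfs:///data/trades").createOrReplaceTempView("trades")
spark.sql("""
    SELECT symbol, SUM(quantity * price) AS notional
    FROM trades
    WHERE trade_date >= '2015-01-01'
    GROUP BY symbol
    ORDER BY notional DESC
""").show()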

Does the new approach remove the need for data quality and data governance?

No, not at all. It is equally important that data quality and provenance are clearly identified to avoid having a "big data dumping ground" and to ensure that asset management firms can meet transparency requirements from regulators, such as BCBS 239.

For example, data governance services such as Data3Sixty can read metadata across the enterprise, capturing source and destination terms, and then provide glossaries for primary and secondary uses of the data to support clarity and consistent use across the organization.

What are other examples of leading edge solutions using Cloud and Big Data technologies?

The financial services Hadoop solution from TickSmith offers full time-series, multi-asset class, and entitlement support out of the box for quick deployments, whether cloud-based or on premises, and can scale from Terabytes to hundreds of Petabytes of data. Initial use cases are for trading shops and exchanges, inclusive of complete trade histories with depth of book and concurrent unstructured event processing. Pending use cases include event-timed investment research and comprehensive Transaction Cost Analysis (TCA), among others.

Text analytics are one way to benefit from the constantly expanding news and web-sourced data. By ranking and scoring news items, the resulting sentiment analysis can be used to alert the trading desk, while also supporting back-testing of trade signals for algorithms and quantitative managers. RavenPack is one such provider, using NLP (natural language processing) and proprietary technology across a vast array of sources to provide historical and up-to-the-second sentiment alerts.
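
As a rough illustration of the underlying idea only – RavenPack's actual NLP models are proprietary and far more sophisticated – here is a toy lexicon-based headline scorer; the word lists, headlines and alert threshold are all made up:

# Illustrative only: a toy lexicon-based headline scorer standing in for
# the proprietary NLP a sentiment provider applies at scale.
POSITIVE = {"beats", "upgrade", "record", "surge", "approval"}
NEGATIVE = {"miss", "downgrade", "probe", "default", "recall"}

def sentiment_score(headline: str) -> float:
    """Return a score in [-1, 1] from positive/negative word counts."""
    words = headline.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Alert the desk when a scored headline crosses a threshold.
for headline in ["Acme beats estimates on record revenue, shares surge",
                 "Regulator opens probe into Acme accounting"]:
    score = sentiment_score(headline)
    if abs(score) >= 0.5:
        print(f"ALERT {score:+.2f}: {headline}")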

A growing area, even at traditional active management firms, is new product offerings that include smart beta and liquid alternative funds and ETFs. Alpha Vee is a cloud-based global equity research platform that can automate the creation of new dynamic strategies, from research through fast back-testing to ongoing portfolio management. The platform is used by asset managers and leading ETF providers to deliver low-cost alpha.

What other possibilities are available to pioneering CIOs?

Please contact us to discuss your firm’s situation and potential strategic planning through to specific projects and solutions that can help accelerate your move to the asset management data infrastructure of the future.

Data Owners and the New Data Realities in Financial Services

10/7/2015

Data owners -- Chief Data Officers, Chief Information Officers, Chief Operations Officers, Chief Marketing Officers or anyone ultimately responsible for "data" in an organization -- face many demands and challenges, yet also have opportunities and options that didn't exist just a few years ago.

Collectively, this post will refer to all of these data owners as CDOs, recognizing that this is a growing field and the title is not present in all organizations. The CDO role itself is designed to bridge a common gap between technology automation and the business's ownership of data, and CDOs can often experience a wide difference in focus based on the data maturity, budgets and resources of their businesses.

First and foremost, CDOs are responsible for the quality of the data that the business – and their clients – rely upon. This is a hefty task in an industry where complete accuracy is paramount and the regulatory oversight burden continues to become stricter.

In some organizations with limited budgets, CDOs that we encounter are just trying to get the basics right. They are focused on identifying sources and uses of information, creating roles and responsibilities, and documenting procedures to provide a basic level of governance in their organizations. In one case the IT impact was as simple as locking down the hard drive housing the Excel spreadsheets that are used for corporate reporting.

As highlighted in PanoVista.co's recent report, most firms have established or are in the process of establishing more mature levels of structured data cleansing and distribution. This may include centralizing pricing and security reference data, ensuring CRM (Client Relationship Management) databases have clean information, warehousing core operational data, or ensuring the marketing → sales → client onboarding → client support process is seamless. These firms are typically able to support traditional BI (Business Intelligence) tools, or increasingly BYOBI (Bring Your Own BI) tools like Tableau or Qlik.

The firms that already have mature structured data infrastructures are now focused on governance tools that help enhance data transparency and workflow among the various data owners in an organization. For example, our partner Data3Sixty is a cloud-based data governance service that consolidates metadata across diverse applications, provides glossaries that map source terms to primary enterprise definitions, assesses ongoing data quality metrics, and creates a social network of interested parties to monitor and correct data anomalies quickly.

It will be interesting to see what impact reference data services such as the recently announced SmartStream SPReD, and emerging services such as Bloomberg PolarLake, Markit EDM Direct, RIMES Managed Data Services and OpenFinance, will have on existing processes at many of these firms.

So far we've highlighted mostly the governance/compliance side of the CDO role. For many firms, this role is also responsible for innovation.

As noted, many firms have centralized components of their data with reference data tools and perhaps data warehouses. This is appropriate for data that is expensive to acquire and/or used by multiple systems, processes and reporting. Examples include prices, security reference, client data, positions and transactions, fundamental research data and performance returns.

However, many firms have more than one business line where it does not make sense to centralize data silos or warehouse all of the data, yet the wealth of information across those lines can add significant insight into the health, risks and innovation opportunities of the business.

Using a federated or hybrid (centralized plus federated) model, we see firms increasingly either leaving the data in place or copying it as-is into Hadoop or NoSQL databases. With either approach – as dictated by the size, shape and timeliness requirements – the data is blended when needed for reporting, advanced business intelligence, or for Data Scientists to generate predictive and prescriptive analytics.
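
A minimal sketch of that on-demand blending in Python, assuming a warehouse extract and a departmental research file as the two silos; the file names, columns and join key are hypothetical:

# Minimal sketch of on-demand blending across silos: positions from a
# warehouse extract joined with ESG scores kept in a departmental file.
import pandas as pd

positions = pd.read_parquet("warehouse/positions.parquet")   # structured silo
esg = pd.read_csv("research/esg_scores.csv")                 # departmental silo

# Blend only when needed, for this one report; neither source is restructured.
blended = positions.merge(esg, on="security_id", how="left")
exposure = (blended.groupby("esg_rating")["market_value"]
            .sum()
            .sort_values(ascending=False))
print(exposure)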

As a file system, Hadoop can store unstructured data that doesn't fit into a relational model – such as internal research documents, emails and videos – alongside web and third-party sourced data. Thought-leading firms are now incorporating web-sourced text analytics and social predictive analytics into the investment research and trading process to identify alpha-generating signals.

Additionally, we are seeing firms enhance research database performance using Hadoop HDFS (Hadoop Distributed File System), in-memory NoSQL (Not Only SQL) databases, and MPP (massively parallel processing) on commodity hardware to drastically improve query times compared to managing large RDBMS (Relational Database Management Systems).

New applications are increasingly running in the cloud. For example, the previously mentioned Data3Sixty runs on Azure. Our partner Alpha Vee runs on AWS (Amazon Web Services), where their service accelerates the investment product creation process with lightning-fast global equity research, modeling, portfolio construction and management services. ETF providers in particular are drawn to the ability to create differentiated smart beta and liquid alternative products quickly and cost-effectively.

For retail-oriented firms like mutual funds, ETFs, wealth management, insurance and brokerage, the CDO needs to support the CMO team's need to combine internal and external demographic information for segmentation, SEO and SMO (Search Engine Optimization, Social Media Optimization) – the latter requiring real-time social feeds into predictive analytics for micro-segmenting prospects and the messages they receive. Our partner KeyInsite and their Data Scientists On-Demand team are seeing increasing interest in the SMO space, in addition to web-sourced investment research and sentiment analysis.

With most new applications available as cloud services, the set-up, deployment and ongoing total cost of ownership is a fraction of what it was just a few years ago using traditional technologies, along with maturing cloud information security models that can meet or exceed what many internal systems currently provide.

Ultimately, each business has its own journey and appetite for building a mature data infrastructure that can support governance, compliance and innovation. Fortunately, there are new resourcing models, open source tools, and deployment options that can help CDOs add real strategic value. 

Cloud and Analytics Innovations in Financial Services

9/8/2015

This article was initially published on bobsguide.com

In Geoffrey A. Moore's seminal book Crossing the Chasm, he describes five segments of the technology adoption lifecycle: innovators, early adopters, early majority, late majority and laggards. The "chasm" is the most difficult step for an innovation: the transition between visionaries (early adopters) and pragmatists (early majority). In my experience, at each of the cycles noted below, the transition over the chasm is where the most vocal resistance and calls of "hype" are usually heard. The premise of this article is that Cloud Computing is well into the early majority phase while "Big Data Analytics" is just now crossing the chasm in the Financial Services industry, and it suggests how you might think about potential impacts and practical approaches.

As background, it may be interesting to look back at some of the technology innovation cycles that have impacted Financial Services over the last 30-plus years:

·       1980s: The explosion of the Personal Computer, Ethernet, and the introduction of car phones signaled the beginning of individual control of information and location-agnostic working.

·       1990s: Relational Databases (RDBMS), client-server architecture, electronic trading, flip phones and early adoption of the World Wide Web began changing core business and operating models and improved access to data.

·       2000s: The early-majority acceptance of the Internet as a business platform, complex event processing, Blackberries, laptops over desktops, business intelligence tools, open source software and early adoption of Cloud Computing delivered unprecedented productivity improvements for people working literally anywhere in the world.

·       2010s: The current cycle of mobile computing, social media, early adoption of Big Data Analytics and early/late-majority acceptance of Cloud Computing/virtualization is changing the way that we interact with each other on a personal level as well as with our employees, employers, customers and business providers. The visionary Internet of Things (IoT) promises to change the way we interact with just about everything else.

Cloud Computing

Marc Benioff of Salesforce.com was the first person really beating the drum regarding the power of Cloud Computing. More than just hosting software as an ASP – often with a terminal emulator supporting fat client applications – the Cloud was, and is, all about functionality and interoperability with variable pricing and minimal IT support needed. In theory a business user can put down a credit card and have immediate access to a fully supported platform for a specific business function. Should they need additional functionality, the user can access it quickly from the Cloud ecosystem of other providers to meet their specific needs. The reality is that there is usually some conversion and implementation work needed, but Cloud computing makes running a business vastly easier than installing and integrating on-premises systems.

For example, PanoVista.co’s partners offer solutions such as data governance, global equity research and portfolio construction, all asset class performance attribution, and collateral management delivered via Amazon Web Services (AWS) and Microsoft Azure clouds. Clients benefit by being productive within days or weeks rather than months or years, and our partners can quickly develop new functionality and keep all clients current with drastically lower support costs.

Big Data Analytics

Big Data is a term that references the 3 V's: Volume, Velocity and Variety. Whereas structured data in Financial Services often contains very large data sets and real-time pricing and transaction data, it's really the huge Volume and Variety of unstructured data that adds complexity that traditional relational databases cannot support. This is where new – often open source – Big Data technologies and techniques come into play. Hadoop, NoSQL, and text search are all technologies that can scale horizontally (using distributed servers) across internal and Cloud-based structured and unstructured data sources to feed analytical languages (such as R and Python) and business intelligence applications, bringing new insights into the business.

Big Data Analytics is an area that has not yet crossed the chasm from the visionaries to the pragmatists in the Financial Services industry.  For examples of how early adopters are using these technologies, visit www.panovista.co/blog.

Security

One common theme across all of the innovation cycles noted here is that as our access to information and productivity has steadily increased, so too have the information security challenges. The perception at many firms is that on-premises data is more secure than either private or public Cloud providers. However, recent information breaches at companies that either did not keep their applications current or did not encrypt their data in place, along with the great strides made by Cloud providers, are starting to change that perception.

At the recent AWS Summit in New York in July, Nate Sammons, Principal Architect at NASDAQ, provided a case study of their data security and encryption approach. He noted that encryption slows down processing speed, by about 25% in their case; this performance drag is one explanation of why many internal IT teams might not enable encryption. However, one advantage of AWS Elastic Compute Cloud (EC2) is that NASDAQ can dynamically scale up processing as needed to overcome the encryption overhead across the millions of daily trades processed, as well as to support the predictive analytics they perform using a variety of open-source technologies.

What does this mean to the Financial Services industry?

As the industry continues to change, Cloud Computing and advanced Big Data analytics will have a growing positive impact on streamlining existing and creating new business models. However, it is possible to start with smaller projects by looking for key decision points that may lead to incremental change. Some questions to ask yourself and your business include:

·       Can our existing technologies be moved to the Cloud for a positive ROI? For example, is a core platform overdue for an upgrade along with the supporting hardware and software, and is a Cloud option available? If the core platform is due for replacement, what is the true TCO (total cost of ownership) of competing on-premises vs. Cloud offerings?

·       Are we competitive enough? Do we have control of our data or do we have gaps in data ownership and quality? Do our teams have the right tools to streamline and automate their daily work? Are we really getting – and using – the best information possible to manage our investments and make important business decisions? Are we using data visualization effectively and proactively, and is the underlying information feeding our Business Intelligence tools correct and validated?

·       Are we responsive enough? Can we uncover new opportunities by mining our own data? When new client or business opportunities arise, can we react fast enough or do our internal processes bog us down? Is our environment flexible enough to scale up and down as needed?

Of course, information security should be foremost in considering these questions and potential alternatives. A final note from the AWS Summit noted above is that many healthcare providers are adopting the Cloud – even with the need to support HIPAA compliance.

In closing, the recommendation is to always think about how you can improve your business by making it more efficient, more accurate, more responsive and more competitive. You may find that Cloud Computing and potentially newer technologies will play a big part in achieving your goals.


Tales from the Cutting Edge: Big Data Use Cases in Financial Services

7/28/2015

Though not yet mainstream, we are seeing thought-leading Financial Services firms use Big Data technologies and techniques in at least four main areas: Marketing, Security, Enterprise-wide Business Intelligence, and Investment Research.

Big Data is a term that references the 3 V's: Volume, Velocity and Variety. Whereas structured data in Financial Services often contains very large data sets and real-time pricing and transaction data, it's really the huge Volume and Variety of unstructured data that adds complexity that traditional relational databases cannot support. This is where new – often open source – Big Data technologies and techniques come into play.

PanoVista.co recently conducted a short survey of Financial Services firms which indicates that while structured internal data management is maturing, with active projects planned or underway, the state of unstructured internal and external data is less mature, with fewer projects – even though there is an awareness that change is needed. So, what are the practical use cases for Big Data that thought-leading firms are engaged in?

Marketing

In many ways, social media has been the biggest driver of the explosion of Big Data, and that is true for retail-oriented Financial Services firms such as banks, insurance companies, mutual funds and wealth management firms. Marketers are creating highly segmented campaigns and using predictive analytics to nurture leads to increase conversion rates online and via social media optimization, or to pass "marketing qualified leads" to the sales team to confirm whether they are "sales qualified leads" for the pipeline.
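
As a hedged sketch of the predictive analytics piece, the snippet below scores leads with a simple logistic regression so that high-probability prospects can be flagged as "sales qualified"; the features, training data and threshold are invented for illustration:

# Hedged sketch: score marketing leads with a simple logistic model so
# the strongest "marketing qualified leads" can be passed to sales.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: page_views, email_clicks, webinar_attended (0/1)
X = np.array([[3, 0, 0], [12, 4, 1], [7, 1, 0], [20, 6, 1], [1, 0, 0]])
y = np.array([0, 1, 0, 1, 0])  # historical conversions

model = LogisticRegression().fit(X, y)

new_leads = np.array([[15, 5, 1], [2, 0, 0]])
for features, p in zip(new_leads, model.predict_proba(new_leads)[:, 1]):
    label = "pass to sales" if p >= 0.7 else "keep nurturing"
    print(f"lead {features}: p(convert)={p:.2f} -> {label}")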

Security

The ability to search large volumes of data looking for inconsistencies has a natural benefit for evaluating system logs looking for security breaches or curious user behavior. Furthermore, the horizontal scalability of Big Data platforms can overcome the performance hit associated with database encryption, reducing potential financial and reputational risk of the theft of data left unencrypted in place, which has been a shockingly common occurrence over the last 24 months.
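
A toy sketch of that log-scanning idea follows; the log format and threshold are hypothetical, and at production scale the scan would be distributed across a Hadoop or NoSQL cluster rather than run against a single file:

# Toy sketch: scan an authentication log for accounts with an unusual
# burst of failed logins, a simple proxy for "curious user behavior".
import re
from collections import Counter

FAILED = re.compile(r"FAILED LOGIN user=(\w+)")
failures = Counter()

with open("auth.log") as log:
    for line in log:
        match = FAILED.search(line)
        if match:
            failures[match.group(1)] += 1

THRESHOLD = 50  # flag anything well outside normal behavior
for user, count in failures.most_common():
    if count >= THRESHOLD:
        print(f"Investigate {user}: {count} failed logins")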

Enterprise-wide Business Intelligence

As noted above, Financial Services firms typically manage large data sets, and more advanced firms have centralized and put quality controls around their structured data. However, much of this data still resides in silos by business unit, and few firms have taken the step to fully integrate internal unstructured data. The Big Data approach is often called a Data Lake: leveraging Hadoop's ability to read across structured RDBMS extracts and unstructured data such as email, documents and video, treating all of the data as a common logical data platform. Business Intelligence and advanced data analytics can then be used to generate new insights into the inner workings of an organization to improve efficiency, reduce risks and provide better client servicing.
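
As an illustrative sketch of the Data Lake pattern – not any particular product – the following PySpark snippet queries a structured CRM extract and unstructured email text side by side on HDFS; the paths and fields are hypothetical, and a real pipeline would use proper entity extraction rather than naive string matching:

# Sketch: one engine querying structured and unstructured data on HDFS
# as a common logical platform.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-lake").getOrCreate()

clients = spark.read.parquet("hdfs:///lake/crm/clients")   # structured extract
emails = spark.read.text("hdfs:///lake/email/*.txt")       # unstructured text

# Example: count service emails that mention each client by name.
# (A cross join is tolerable here only because the client list is small.)
mentions = (emails.crossJoin(clients.select("client_name"))
            .where(F.col("value").contains(F.col("client_name")))
            .groupBy("client_name")
            .count())
mentions.show()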

Investment Research

Top tier hedge funds, asset management firms and banks are the early adopters using Big Data in the Investment Research area. This is a little more controversial, since most results today are anecdotal rather than backed by formal research; however, the anecdotes indicate that traders and portfolio managers are indeed gaining a competitive advantage by having broader insight into their investments, often before traditional data providers are aware of news or can update their feeds. Social media activity prompted the SEC to rule on April 2, 2013 that public companies can use social media outlets to disclose material non-public information. One area that is less controversial is that very large research databases can be managed and mined more effectively on Big Data platforms such as Hadoop and NoSQL than on traditional relational database platforms.

Considerations

Big Data is gaining acceptance across most industries more quickly than in Financial Services. The technologies are evolving rapidly, and often a mix of technologies is needed to support a given use case. These technologies are designed to scale horizontally across many commodity servers, so before investing in huge data farms it may be worth considering cloud deployments that scale on demand and make it easier to mix technologies as use cases evolve. As every data scientist will attest, knowing what questions to ask and plumbing the data are critical steps for good analytics. An Agile project approach is best suited to support the iteration that will likely be needed before locking down an operational process. Therefore, planning the approach, tools, resources, and anticipated results and benefits should remain part of the business case, just as with traditional technologies.


Data Maturity in Financial Services

6/30/2015

[Chart: survey results on the maturity of structured and unstructured data management at Financial Services firms]


As depicted in the chart above, the early results from our short survey of Financial Services firms show that structured internal data management is maturing, with active projects planned or underway, while the state of unstructured internal and external data is less mature, with fewer projects – even though there is an awareness that change is needed.

The maturity of internal structured data is reflective of the growth of data management initiatives over the last ten years to control the cost of acquiring data and also meet the growing regulatory reporting mandates for data and investment transparency. Automating data governance is the final phase of the maturity curve, with many firms currently evaluating options in this space.

Big Data often references the 3 V's: Volume, Velocity and Variety. Whereas structured data in Financial Services often contains very large data sets and real-time pricing and transaction data, it's really the huge Volume (such as log surveillance) and Variety of unstructured data that adds complexity that traditional relational databases cannot support – and opportunity for the firms that master it.

Anecdotally, we have seen that only the "early adopter" top tier firms – buy side, sell side and banks – have started to invest in Big Data projects to harness their unstructured internal and external data, and that the majority of smaller firms are aware they need to do something yet have not begun or are not sure where to start. This is despite the often-quoted estimate that 80% of enterprise data is unstructured, and the well-publicized rapid growth of social analytics.

Much of a firm's valuable business insight resides in unstructured internal data: investment research documents (street and internal), staff on-boarding materials, knowledge management and client communications. Thought-leading firms in many industries are investing to harvest this internal information, yet it seems to be a lower priority for financial services firms, perhaps due to the current investments in structured data projects.

Unstructured data from the web and social media is where the 3 V’s of Big Data are gaining a lot of traction with early adopters within the Financial Services industry.

  • Marketers are creating highly segmented campaigns and using predictive analytics to increase conversion rates online and via social media optimization. 
  • Traders and portfolio managers are gaining broader insight into their investments, often before traditional data providers are aware of news or can update their feeds. 
  • Social media activity prompted the SEC to rule on April 2, 2013 that companies can use social media outlets to disclose material non-public information. 
  • The power of data science and predictive analytics working on vast data sets are being used to provide competitive advantage to investors.
  • Very large research databases can be managed and mined more effectively on Big Data platforms such as Hadoop and NoSQL than on traditional relational platforms.

It’s rewarding to be involved in such an exciting time of rapidly changing technologies.



Thoughts from TSAM NYC 2015

6/24/2015

I was extremely honored to be invited to chair the Performance Measurement and Risk track at TSAM NYC 2015 last week. Kudos to the organizers, panelists and attendees, as it was a very informative, interactive and enjoyable experience. Here are some thoughts and observations from discussions at the event:

·       The Performance Measurement team is one of the few groups in an investment management firm that interacts with every other department from front to back office, with prospects and clients, and increasingly with regulators. They often catch errors (trade, pricing, reference data and accounting) before anyone else.

·       Performance returns are the ultimate in storytelling: a company's strategy, research, Portfolio Management decisions, traders' execution, and clients' mandate expectations vs. benchmarks are all fully exposed. Data Visualization can help tell that story, and daily controls are key to ensuring its accuracy.

·       Most Performance teams now report into Operations, but there is a growing trend to report into Risk Management. Reporting lines have been in flux for years due to the first bullet point, above.

·       Communication remains one of the biggest hurdles in resolving issues and ensuring alignment, which is exacerbated by email, Instant Messenger and text. It’s better to walk over and talk to someone if you can, or use a video chat if you can’t. The exception is to use email to document expectations with teams and vendors.

·       From a project standpoint, many firms have difficulty keeping senior management involved and engaged in long-term projects (more than six months). Agile project management can help alleviate that -- if done correctly -- by delivering iterative results. Most firms engage in Proofs of Concept (POCs) when selecting a new vendor in order to fully understand functionality and potential gaps, but agree POCs are better left for a final vendor or two if a bake-off is needed.

·       Fixed Income Attribution is more commonly used for client communications, yet many front office FIA tools don’t match the official returns, creating a data management and operational challenge for many. This is a bigger challenge for multi-asset managers. (Please note that one of our Partners can help resolve this issue.)

·       Cleared Swaps have not removed clearing risk, only transformed CCPs into systemically important risks. Furthermore, the lack of cross-border coordination is most evident in that Dodd-Frank prohibits the US from providing liquidity to the CCPs, which will put pressure on the Bank of England and ECB should a shock occur again. This transformation of risk is somewhat analogous to the creation of Credit Default Swaps after the LTCM crisis: a "credit risk reduction" vehicle intended to reduce risk, with the unintended consequence of adding fuel to the financial crisis fire in 2008.

·       At a round table discussion at the end of the day, it was noted how regulatory-driven bank balance sheet restrictions are adding volatility in the most "risk free" market of all: US Treasuries. Traditional Capital Markets desks – those that remain – are doling out liquidity based on relationships in order to keep their balance sheets in line; the slack is being taken up to some extent by hedge funds, insurance companies and even corporate treasuries. Again, what are the unintended consequences going to be?

·       While this track had little discussion regarding Big Data, we had a number of very good conversations with people in the Data Management and Sales & Marketing tracks.






News - PanoVista.co Chairing Performance Track at TSAM NYC, June 18, 2015

5/19/2015

We are pleased to announce that Bob Leaper has been invited to Chair the Performance Measurement, Attribution and Investment Risk track at TSAM NYC on June 18, 2015. The goal of this track is for performance measurement professionals to deepen their network of peers in investment management. It is a truly neutral, unbiased platform, where the buy-side's leading practitioners determine the agenda and deliver the program. TSAM itself is the leading event for developing efficient investment management companies, with additional tracks covering Data Management, Technology and Operations, Client Reporting and Communications, and Marketing & Sales communications. Find out more at http://www.tsam.net/newyork.

News - PanoVista.co at SecOps USA, June 1-2

5/19/2015

We are pleased to announce that Bob Leaper has been invited to moderate a panel discussion at FTF's SecOps USA conference on June 1-2, 2015. The panel will be on IT Operations: Transitioning from Legacy to New Systems. The panelists will be Timothy Brodeur from Neuberger Berman and Kristin Soule from Boston Advisors. The session will focus on the operational impact of, and lessons learned from, adopting new technologies while keeping the business running. Find more information at http://www.ftfnews.com/event/sec-ops-2015/.