We sat down with Bruno Dupire at our offices in Paris for a candid discussion about the world of finance in general, the status of quantitative finance and research in particular, and his views on a variety of developments set to shape the industry.
The Dupire equation has since become a standard tool in the industry and has been used to price trillions of dollars of options over the years. He was awarded the Lifetime Achievement Award by Risk magazine for his pioneering work in local volatility modelling. CFM: What is it that Bloomberg Quantitative Researchers typically do? BD: We serve a broad community of users with incredibly varied needs.
We cover derivatives, machine learning (ML), portfolio construction, pricing of illiquid assets, electronic trading, visualisation methods, election prediction and much more… We collaborate with numerous teams internally, but we also develop our own initiatives: for instance, we have originated an ambitious project, a quant platform with innovative tools for visualisation and interaction.
It enables the Bloomberg user to create elaborate studies. It is a recognised intellectual forum for the community. CFM: What do you regard as the most discernible shifts in the focus of quantitative research within the industry? What do you see as the next frontier in quantitative research? BD: Over recent years we have witnessed a shift from the sell side to the buy side; from eliminating risk premium for the risk-neutral pricing of derivatives, to seeking risk premium for investments.
From theory to data. From stochastics to statistics. Indeed, over the past five years the quant community has massively embraced ML, AI and the use of alternative data sets. At Bloomberg we deploy a massive effort in data science: we now employ hundreds of engineers and we are very active on the communication side as well, together with the heads of ML and of Data Science.
BD: My own involvement with AI predates my last 30 years in finance. It was probably a first. CFM: You mentioned the advent and large-scale uptake of machine learning and big data as key developments that have the potential to upend the industry. Do you share this view?
BD: Some say it is the future, others say it still needs to prove its relevance in finance. In my view, ML in finance is here to stay, but it will not solve everything. Currently the hype is intense, so intense that many feel compelled to pepper their speeches with phrases such as AI, ML, big data, predictive analytics and deep strategies, which are often just a varnish to hide a void.
Finance has always tried to relate the available current information to future behaviour in order to improve investment or risk management decisions, and ML is the approach of choice to mechanically establish these links. ML relies on three pillars: data, algorithms and computing power. Alternative data abounds in finance, and the algorithms and computing power have improved substantially.
It is thus natural that finance has embraced ML. Some aspects of finance are similar to physics, with experiments that can be repeated; other aspects are more grounded in game theory, with a circle of anticipating the anticipations.
In the first category we can find option pricing. Computing one option price as a function of its parameters and of the model parameters may be complicated, but learning the whole pricing function can be easier once enough examples have been presented.
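To make this concrete, here is a minimal, purely illustrative sketch (not from the interview): a reference pricer, here Black-Scholes rather than any Bloomberg model, generates example prices, and a simple least-squares learner then reproduces the whole pricing function on strikes it has never seen.

```python
import numpy as np
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # The "complicated" pricer playing the role of the teacher model
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Training examples: option price as a function of strike
S, T, r, sigma = 100.0, 1.0, 0.01, 0.2
strikes = np.linspace(80.0, 120.0, 41)
prices = np.array([bs_call(S, K, T, r, sigma) for K in strikes])

# "Learn" the pricing function from examples: a degree-7 polynomial
# least-squares fit stands in for a fancier learner
surrogate = np.poly1d(np.polyfit(strikes, prices, deg=7))

# The surrogate reproduces the pricer at an unseen strike
K_new = 97.3
err = abs(surrogate(K_new) - bs_call(S, K_new, T, r, sigma))
print(f"surrogate error at K={K_new}: {err:.5f}")
```

The same idea scales to learning prices as a function of several inputs (maturity, volatility, model parameters) with a more expressive learner; the point is only that a smooth pricing map can be recovered cheaply from enough examples.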
On the game theory side one can find trading and investment. It is impossible to repeat an experiment exactly under the same conditions. Regularities such as performance of strategies according to the market regime can be observed, but there is no guarantee of their persistence.
CFM: How do you see managers positioning themselves in this era of an ever-growing amount of data and data providers? BD: Opportunities rotate quickly and one has to be nimble to identify and exploit them. Many people will be late to the game, but the providers of alternative data will definitely be busy for quite some time.
Examples of alternative data are text, satellite images, supply chain, Environmental, Social and Governance (ESG), vessel routes, credit cards, geolocation data, etc. But data is certainly not enough. When most market participants have access to the same data, what makes the difference is the ideas and the tools. CFM: Access to all these new, and often very enticing, data sources is embraced as a revolutionary boon for data science and financial research. Do you harbour any reservations about the use and application of data?
BD: There is a belief, or illusion, that everything can emerge from the data itself: just let the data speak. But learning also benefits from rules: machines can learn from examples, but they certainly benefit from explanations and guidance. Another issue is the use of the data. It is very difficult to read causation from data, and this question cannot be resolved by the data itself. This is how one can reveal the structure of dependencies without intervening.
CFM: You included ESG data as another area of focus for alternative data providers, with new sources springing up all the time. Which, if any, of the common reservations about ESG data do you share?
BD: There is pressure for transparency, a premium for reporting, and there is a massive shift towards increased disclosure. Currently the best data is probably on G, governance (independence, entrenchment, shareholder rights, diversity…), with some solid reporting from early on. However, it is difficult to harmonise data across providers and cultures; for instance, the notion of board independence differs according to each region.
Sustainability data depend partly on the sector, and S (social) is especially patchy. A lot of effort is made to backfill and curate the data and to push for better future disclosure. The data, even if not a complete set, is voluminous: there are time series for hundreds of fields for thousands of stocks.
My team is working on novel ways to visualise and navigate the data that make it easier to reveal associations. CFM: What is your take on the ability of asset managers, especially quantitative and systematic managers, to respond to the ever-increasing ESG demands set by investors?
BD: Whatever composite factor can be extracted by inspecting past data, the future is bound to be different. ESG data will deeply affect investment decisions due to its ethical dimension and regulatory pressure. Numerous sovereign funds and asset managers explicitly express their preference for good ESG investments, sometimes acting as activists to change corporate practices. Many millennials want to invest only in good ESG stocks. This will generate massive asset migrations, which opens a whole array of opportunities.
CFM: Do you perceive any disenchantment from the industry towards quantitative strategies? BD: Data snooping, overfitting, or apophenia (the tendency to interpret noise as a pattern) is indeed a huge pitfall. An algorithm that optimises over the past offers no guarantee of good future performance. The more complex the model, the more prone it is to overfitting.
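This pitfall is easy to demonstrate. Below is a small illustrative experiment (not from the interview): a pure random walk carries no exploitable signal, yet a complex model fits its past almost perfectly while failing badly out of sample.

```python
import numpy as np

rng = np.random.default_rng(42)

# A pure random walk: its past contains no information about its future
steps = rng.normal(0.0, 1.0, 400)
prices = np.cumsum(steps)
x = np.linspace(0.0, 1.0, 400)           # rescaled "time" for numerical stability

x_tr, x_te = x[:300], x[300:]            # fit on the past ...
p_tr, p_te = prices[:300], prices[300:]  # ... evaluate on the future

def rmse(deg):
    """In-sample and out-of-sample RMSE of a degree-`deg` polynomial fit."""
    model = np.poly1d(np.polyfit(x_tr, p_tr, deg))
    ins = np.sqrt(np.mean((model(x_tr) - p_tr) ** 2))
    out = np.sqrt(np.mean((model(x_te) - p_te) ** 2))
    return ins, out

for deg in (1, 4, 12):
    ins, out = rmse(deg)
    print(f"degree {deg:2d}: in-sample RMSE {ins:7.2f}, out-of-sample RMSE {out:10.2f}")
```

Raising the polynomial degree can only reduce the in-sample error, which is exactly why the in-sample fit says nothing about future performance: the extra flexibility is spent memorising noise.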
The market is a machine that destroys signals. If a quant finds a signal that has worked in the past, he is unlikely to be the only one, and competitors will quickly try to grab this opportunity away from him. Risk premia are not a law of nature. Opportunities vanish quickly, and the investor needs to be creative and to have efficient tools. However, there are some more resilient strategies, for instance those based on behavioural principles.
Cognitive biases are here to stay: even being aware of your own cognitive biases does not prevent you from following them. This means that over-reaction, disposition and endowment effects, the conjunction fallacy, remorse aversion, anchoring, herding and reaction to sunk costs will not disappear.
CFM: What do you see as the key risks, or pitfalls for quantitative investment managers? BD: As mentioned earlier, a major issue is overfitting. Beyond that, a pervasive problem, whether it be in biology, social sciences or quantitative finance, is a clash of culture between the domain expert and the data scientist, with relevance being the collateral damage.
Similarly, using convolutional nets to link returns to characteristics is perilous. The portfolio manager and his data scientist both have to make an effort to create a compatible conceptual platform. CFM: Do you have any pet peeves about the industry? BD: A dimension of quantitative finance that I find sorely missing is what financial engineering was supposed to address: using techniques to bring solutions to different economic agents.
Whether it be in derivative product design or in asset allocation, it seems that the needs of the individual agents are somewhat disregarded. As an example, what matters for a retiree is not whether midcap pharmaceuticals in Asia will outperform the market, but rather that she will not outlive her savings. Her risk is her own longevity, not the performance of the value factor.
The asset allocator should definitely adapt to the investor and his or her personal benchmark. CFM: Any interesting projects you and your team are focussing on at the moment? BD: Beyond its daily tasks and its modelling role, my group explores many different directions: perceptual tests, colour theory, hand movements to input parameters, assistive technology. Moreover we have open sourced bqplot, our graphical library. It is important to give back to the community; the best researchers do not like to do just one thing and they want to have a purpose.
Disclaimer The views and opinions expressed in this interview are those of Dr. Dupire and do not necessarily reflect the official policy or position of either CFM or Bloomberg LP or any of their affiliates.
The information provided herein is general information only and does not constitute investment or other advice. Any statements regarding market events, future events or other similar statements constitute only subjective views, are based upon expectations or beliefs, involve inherent risks and uncertainties and should therefore not be relied on. Future evidence and actual results could differ materially from those set forth, contemplated by or underlying these statements. In light of these risks and uncertainties, there can be no assurance that these statements are or will prove to be accurate or complete in any way.