16th March 12:00-13:00 Queens 1.6
Come to Queens 1.6 at 12pm on March 16th for a talk by Bloomberg on machine learning and natural-language processing (with, of course, free food 🍕)!
The Bloomberg Terminal brings together real-time data on every market, breaking news, in-depth research, and powerful analytics in one fully integrated solution. Within the News product, our award-winning coverage ensures that clients get the information they need, while we put considerable effort into avoiding overloading them with excessive information.
With more than 2,700 in-house news professionals and 1,000 external news sources, our clients are bombarded with a massive volume of text at every moment, making it harder for them to understand markets and make effective decisions.
In this talk, we will look into this issue and go over how Bloomberg scientists designed a system that distils information from a massive number of news stories. We will discuss the ML/NLP techniques that underpin it and illustrate extraction results from recent news articles.
Finally, we will discuss other potential applications of such a system.
Iat Chong Chan is a research scientist/software developer on Bloomberg's Machine Learning team. His interests mostly lie at the intersection of Computational Linguistics, Machine Learning, and High Performance Computing. He has been working on a scalable infrastructure that uses statistical models to infer the topics of social content ingested into Bloomberg, and on a multi-document summarisation system that extracts the most important information from a text collection. Iat Chong also leads the NLP guild inside Bloomberg, advocating the use of ML/NLP techniques for new business problems. Before joining the company, he was an MSc student in the Department of Computer Science at the University of Oxford, supervised by Prof. Stephen Pulman and Yishu Miao, where he worked on building a better input method for small hand-held devices using a novel Bayesian network with variational inference.