Potential Applications Of Generative AI In Quant Trading

Bin Ren of SigTech on democratizing quant trading

Hiten Patel

23 min read

I think the future is humans augmenting AI. I want to be seen as the bridge that bridges the AI brain to the real markets
Bin Ren, CEO, SigTech

In the latest episode of Innovators' Exchange, Hiten Patel speaks with Bin Ren, the CEO and co-founder of SigTech, a leading provider of quant technologies. The discussion centers around democratizing access to quant trading strategies, the surge of retail investing, and the profound implications of generative AI (Gen AI) in the financial markets.

Bin unveils the story behind SigTech's mission to accelerate the idea-to-market process in capital markets. Their focus on reducing the lifecycle of ideas from months to seconds empowers traders and portfolio managers, democratizing the landscape for both professionals and retail traders. The company's cutting-edge technology, featuring Gen AI, enables quick generation, testing, and deployment of trading ideas, contributing to a more efficient and accessible market.

Key themes explored in this episode include:

  • Democratizing access to quant trading, enhancing market efficiency, and the strategic infrastructure and technology rebuild undertaken by SigTech. The narrative unfolds further, touching on the challenges faced in serving diverse clients, the implications of democratized access, and the evolution of humans augmenting AI capabilities.
  • The episode also delves into SigTech's voice-based applications, shaping the future of market discussions. Bin shares insights into the changing nature of work with the rise of large language models and envisions a future where humans augment AI capabilities. The conversation explores challenges faced by SigTech in offering technology to various clients and how they strategically provide API services, allowing clients to tailor features to their needs.
  • Bin extends a shout-out to Kiet Tran’s stealth start-up, focused on data licensing models for AI. The start-up aims to revolutionize data accessibility in a world where AI models are the licensee.

This episode is part of Innovators’ Exchange, a series that explores the financial infrastructure and technology landscape. Tune in for a captivating exploration of AI's transformative potential in financial markets, touching on key themes and opportunities for both professionals and retail investors.

Subscribe for more on: Apple Podcasts | Spotify | Google

Hiten Patel: Hi. Thank you for joining us today. It's Hiten Patel from Oliver Wyman, Global Lead of our Financial Infrastructure, Technology and Services platform. Thank you for listening, and I'm delighted to welcome Bin Ren today. I'm going to let Bin introduce himself and elaborate a little on his background. Bin, tell everyone a bit more about how you've got to where you are today.

Bin Ren: Hey Hiten, such a pleasure to be here. So, hi, everybody. I'm Bin, the founder and CEO of SigTech. SigTech is a technology company that specialises in capital markets. Before I talk about my company and what we do, let me quickly introduce myself and how I got here. I was born in China, I'm Chinese. I did my undergrad in electrical engineering and computer science in China and then went to Cambridge University in England to do my Ph.D. I had the privilege of working on a project called Xen, which was a virtual machine monitor at that time; this was from 2003 to 2007. The cool thing about Xen was that it was the one piece of core technology that ended up in Amazon's AWS cloud service.

So the entirety of AWS was actually built on what we wrote. Xen was at the time 100 times faster than any competing software out there, and it was open source, totally free. So that's one very cool thing I did early on. And that got me into large-scale computing and introduced me to finance.

Hiten: So I wanted to set that scene because we're going to cover some pretty interesting topics today. We will obviously take a bit of time for the company, but given all the hype out there at the moment on Generative AI and what's going on, it's always nice to have a bit of a practitioner, or I could be underselling you as a practitioner, in that space and I wanted to make sure we reflected on that.

But let's kick off a little bit on SigTech. Just talk to me a little bit more about what it does, what is the problem it is solving?

Bin: Yeah, so at SigTech we specialise in capital markets and really we focus on one thing, which is to speed up idea-to-market, right? If you think about the lifecycle of ideas for a typical trader or portfolio manager, their day-to-day is all about ideas, right? Generate an idea, implement it, test it, and if it's good, deploy it, deploy capital behind it and generate P&L. It's a lifecycle of ideas, and it's becoming increasingly important to make that lifecycle as short as possible, because opportunities come and go.

So we have developed the technologies over the last ten years to really help bring ideas to market in seconds instead of months. Right? It used to take months; now, with generative AI, it takes seconds. So that's what we do. And we believe that by helping everybody, not just professionals but enthusiasts and even retail traders, to take an idea and see the results in seconds,

we allow more information to be priced into the market more quickly and we allow more participants into the market. Therefore, we are playing a role in making the market more efficient.

Hiten: So that's awesome. I think when you've described this to me in the past, the picture I've had in my head was you democratising access to quant trading. I don't know whether that's a fair reflection, but it'd be great if you could talk a little bit about where you started to hone this capability versus the clients and customers who are benefiting from and using what you're bringing to market today.

Bin: Yeah, that's a great question Hiten. So where we started was actually from the very top end of the market, right? Because the Sig, the S-I-G part of the name, actually stands for Systematic Investment Group. It was the name of my group at Brevan Howard, one of the world's top hedge funds.

So I was the Chief Investment Officer of SIG at Brevan, and we were running quant funds using the technologies we built in-house. We made a decision in 2019 to focus on monetising the infrastructure and technology rebuild, because we really believed we had the best product at the time, and probably still do, and therefore we wanted to focus on where our edges are, right?

And so that's where we come from: very institutional, battle-tested, professional-grade technology. But since then, in the last four years, we have essentially poured a lot of resources into making it more accessible to more and more people, a wider and wider audience. When we started, it really required the most sophisticated users, right?

People who understand markets, understand Python, understand programming, understand backtesting, understand overfitting and underfitting, all sorts of things. It's very, very hardcore, right? But today, Hiten, I'm delighted to tell you that with the help of a large language model, what we have enabled is that anyone with an idea about the market can say it in English, and we can generate the Python code using our AI models on the fly, in real time, and show you the results.

Actually, one of the applications we're working on is not just text, or text to code to get results; it's voice. So in a few months, when you and I have a Zoom call and talk about financial markets, when you make a comment about how the yield curve has changed because of the CPI print, or I make some comments about how the stock market has behaved because of enthusiasm for AI companies, our voice is automatically translated into text, the text is automatically translated into Python code, and the idea we talk about is automatically implemented, run and shown as we speak. That's the future we're getting to.
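For readers who want a concrete picture of what a "plain English to backtest" loop involves, here is a minimal, hypothetical sketch. It is not SigTech's API: the prompt, the injected LLM client and the code-execution step are all illustrative stand-ins, and a production system would need far stricter validation and sandboxing of generated code.

```python
# Hypothetical "idea in English -> Python backtest" pipeline (illustration only).
import textwrap

PROMPT_TEMPLATE = textwrap.dedent("""\
    Turn the user's trading idea into a Python function named run_backtest(prices)
    that returns a dict with keys 'total_return' and 'num_trades'. Use only pandas.
    Idea: {idea}
""")

def idea_to_code(idea: str, llm_complete) -> str:
    """Ask any LLM completion callable (a stand-in here) to translate an idea into code."""
    return llm_complete(PROMPT_TEMPLATE.format(idea=idea))

def run_generated_code(code: str, prices) -> dict:
    """Execute the generated strategy in its own namespace and call it.
    A real system would sandbox and review this generated code much more carefully."""
    namespace: dict = {}
    exec(code, namespace)  # illustration only; never exec untrusted code unsandboxed
    return namespace["run_backtest"](prices)

# Usage sketch:
# code = idea_to_code("buy TSLA the day before earnings, sell two days after", my_llm)
# results = run_generated_code(code, daily_prices)
```

The voice application described above would simply sit in front of this: speech-to-text produces the idea string, and the rest of the pipeline is unchanged.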

Hiten: I hope that's more effective than when Gmail shows me adverts for fake lawns after hearing my wife complain about my attempts at keeping the grass. But it's pretty, pretty powerful stuff. And I guess in my head, hearing you talk, I've got this image, as you described, of this incredibly powerful, sophisticated tool that was used by trained practitioners now being made available to the everyday person on the street. Talk to me about some of the challenges around cascading that down that you've wrestled with in recent times.

Bin: Yes. I think when we started we were a B2B, enterprise-sales-focused fintech company. And we've come to learn the main challenge we have faced; I can summarise it in one sentence now, and it is a hard-learned lesson. We have many things that everybody wants, but not many people want everything.

Okay. So it's one of those very awkward situations where we go to an enterprise client and say we have this amazing end-to-end solution or platform, take a look. They inevitably love a lot of it, but there's always something they want that we don't have, because in this new market I think people naturally expect any off-the-shelf product to meet their exact proprietary requirements.

It's a bit unfair, but that's how people react to any new product. So we understood the implication of this, and we made a very strategic decision, which is to unbundle all the functionalities inside our fully integrated platform and then offer these hundreds of functionalities through API services.

So we have all these API services running, and then we can go to any prospect and say, you know what, pick and choose the bits that you want and integrate them into your applications, into your workflows, instead of being trapped, in some sense, inside our own platform. That has been working out really well.

Hiten: And you used the word implications there, just picking up on that. I don't know if it's too early to tell, but in your mind, what are some of the implications for how financial markets behave, or opportunities that arise, from having democratised access to some of these quant trading strategies, with more people being able to test and deploy them?

What do you think happens in terms of how markets behave, the ability to generate alpha, have you thought much about that? It would be interesting to get a sense as to how this plays through.

Bin: Yeah, I think if we go back to the four stages of the lifecycle of an idea, right: idea generation, implementation, testing and deployment. I think for participants from the retail side of the world it has huge implications. For example, on idea generation, the truth is the number of ideas is probably first and foremost a function of the number of people trying to come up with ideas, right? And frankly, for retail people, maybe the quality of the ideas is a bit lower than for professionals, but there are hundreds of thousands of them, very active in different forums on Reddit.

And they have plenty of ideas, right? There are plenty of ideas. And once they have some ideas, some inevitably go viral and gain followers. That's what we witnessed during the pandemic, when retail traders and especially meme stocks became a phenomenon.

It was not just some sort of a joke or some kind of minor thing. It actually had implications, because it severely affected the performance of some of the world's largest hedge funds. Not only did some of them get hurt pretty badly, but it also made the rest of them realise that without modelling retail investors inside their portfolios and taking into account retail sentiment and which retail ideas are going viral, they have a serious problem in terms of managing their risk and sizing their trades.

So I think the boundary between professional and retail is getting blurred, especially in today's world where the most popular applications and some of the largest companies are all consumer companies.

Hiten: Yeah, I think it's fascinating. I wouldn't say frustrated, but there's always this view that, hey, there's the rise of passive and the retail investor should remain passive, or, you know, try to stop people paying fees away. It feels like you're introducing this third way: for those who want to get themselves up the curve, who have an interest in markets and an idea of where alpha could be generated, here are some tools to go out and help you efficiently put on some of the trades and put them to work, which historically they probably wouldn't have had access to.

So what kind of training do they need? Like do you provide that? Is that something they find out in the market? Like how do they figure out whether they can pick your piece of kit up and know how to use it or put it to work?

Bin: Yeah. So we realized that the idea generation stage of the lifecycle, the retail investors are doing already, right? They are discussing it on social media; they are already trying to come up with ideas. But the rest of the lifecycle is about how easy it is to turn those ideas into code, then test them and see the results.

What happens now is that it is extremely difficult. So you have all these retail enthusiasts talking about ideas, but they barely test them. They don't test them because they can't implement them, and therefore they don't even know how good or bad the idea is. So we want to help them actually learn about markets and test the ideas they come up with, by making it super simple.

Hiten: Spell it out for me. What's an example? What kind of quant strategy might a retail investor be able to knock up?

Bin: Great. For example, I think one could be: what would it be like to buy Tesla stock one day before the quarterly earnings announcement and sell it two days later? Because if you look at the past, maybe eight or 12 quarters, at every single quarterly announcement Tesla has sort of beaten expectations and rallied. Right? So it's very natural for a retail investor who loves Tesla and Elon Musk to ask what it would be like just to capitalize on the jump around the earnings release.

Right. So a simple idea like that: buy before earnings, sell two days later. It's a simple idea, but what does the result look like? Now, with the help of large language models, we can literally turn that sentence into Python code calling SigTech APIs, generate a backtest and show you the performance in seconds.

You can see this result being embedded inside the conversation itself. Someone posts the idea on a forum, it triggers a bot, the bot picks it up, generates the results and then returns them, embedded into that part of the conversation. We can see that happening.
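To make the example concrete, here is a minimal, generic sketch of the "buy one day before earnings, sell two days after" idea in plain pandas. It does not use SigTech's API; it assumes you already have daily closing prices and a list of past earnings dates from your own data source, and it ignores costs, slippage and dividends.

```python
import pandas as pd

def earnings_jump_backtest(closes: pd.Series, earnings_dates: list) -> pd.DataFrame:
    """Per-event returns for buying one trading day before each earnings
    date and selling two trading days after it (toy illustration)."""
    idx = closes.index  # a sorted DatetimeIndex of trading days
    rows = []
    for d in earnings_dates:
        pos = idx.searchsorted(pd.Timestamp(d))  # first trading day on/after the announcement
        if pos - 1 < 0 or pos + 2 >= len(idx):
            continue  # not enough price history around this event
        entry = closes.iloc[pos - 1]   # close the day before earnings
        exit_ = closes.iloc[pos + 2]   # close two trading days after
        rows.append({"earnings_date": pd.Timestamp(d), "return": exit_ / entry - 1.0})
    return pd.DataFrame(rows)

# Usage sketch:
# closes = pd.Series([...], index=pd.DatetimeIndex([...]))   # daily closes
# print(earnings_jump_backtest(closes, ["2023-04-19", "2023-07-19"]))
```

In the workflow Bin describes, the language model's job would be to produce a function like this from the English sentence, then run it against real data and return the results.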

Hiten: And do you guys also implement it from there on, or do you let the investor go and find their own way on whichever platform to execute the trade?

Bin: I think what's going to happen is that people can pick up these results, like a simple backtest, and then once they log into their account at SigTech they will be able to set up connectivity with their retail brokers, take the output of the strategy calculation and send the trading orders to the brokers that way.

So that's how we see it: by leveraging the models, analytics and engine we built, frankly, for professionals, and putting a large language model in front of it as a user interface, we are now able to enable multiple use cases which were simply not possible before.

Hiten: Amazing, amazing. So, we've kind of gone there already on large language models and generative AI. But if I could just zoom out a little bit and think about it, we're relatively early on in that journey. A lot of people are scrambling around figuring out what some of the opportunities are and what some of the threats are. More broadly across the financial markets ecosystem, where else do you see changes?
Which bits most excite you? Which bits scare you?

Bin: Yeah, I think what's amazing about, say, GPT is that it has been around for a few years, several years actually. Right? And frankly, no one really paid much attention to it until ChatGPT came out, which is quite interesting, because GPT was actually available through API services. You could actually build applications calling those large language models last year, even the year before. But it didn't become a global phenomenon until the OpenAI team built a retail-focused application on top of it, which is fascinating. It became the most popular retail application ever launched in human history; within record time they had over 100 million users. Right? It's fascinating because it's like the most advanced AI technology, and nobody paid attention, nobody was talking about it.

Hiten: You and your friends were though. There was a cult. There was a group, right?

Bin: I'd say a group. We were experimenting. But, you know, it was sort of a secret, kind of a nerdy obsession more than anything else. But once they launched a retail app on top of it, which is ChatGPT, it really took off. So, in terms of the implications of this, I think with GPT, with the large language models, there are actually multiple aspects that are still fascinating.

One is that it's improving exponentially. Right? The last time we had experience with exponential improvement was Moore's Law: computing power doubles every 18 months. But that has kind of plateaued; it hit a ceiling in the last couple of years. The exponential rate of improvement in AI is much faster than that, something like doubling every six months. Right? Just kind of mind-boggling. That's number one.
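To put those rough doubling figures in perspective (they are Bin's characterisations, not precise measurements), here is a quick compounding comparison over a three-year horizon:

```python
# Compounding comparison of the two doubling rates mentioned above.
months = 36
moores_law_factor = 2 ** (months / 18)  # doubling every 18 months -> 4x over 3 years
ai_factor = 2 ** (months / 6)           # doubling every 6 months  -> 64x over 3 years
print(f"Over {months} months: {moores_law_factor:.0f}x vs {ai_factor:.0f}x")
```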

And number two is that people now realize that GPT can pass professional exams in almost two dozen fields in the top 10%. Right? GPT-4 is a pretty decent lawyer, a pretty decent accountant and a pretty decent tax advisor, so it's actually quite scary. So the way I think about it: today people naturally think it is a tool, just a very clever tool, and it's getting even cleverer.

But I tend to think a bit differently. People are thinking about using AI to augment humans, right? We use this tool to augment our work. I think that's not radical enough. I think the more radical view of what's going to happen is that humans are going to become the tools that augment AIs. So it's the other way around. Let me give you an example; this is already happening. If you look at the large social media companies like TikTok and Facebook, they have this giant machine learning algorithm they've trained, running in multiple data centres, that does moderation, right? A piece of content gets created and uploaded, and the AI says, oh, this content is good or bad. But it cannot effectively moderate every single piece of content, so they still have to employ humans; TikTok, for example, employs 20,000 people to do full-time moderation in Ireland.

Right. Let me ask you: TikTok moderation is done by a giant machine learning, slash AI, algo running in data centres using gigawatts of energy, and then 20,000 humans deal with the pieces of content that the AI cannot deal with. Who is augmenting whom? Is AI augmenting humans, or are humans augmenting AI? So I think the future is humans augmenting AI.

So that's how I am positioning SigTech as a company. I want the products and services that we offer and build today to be the tools used by all these large language models whenever they want to try or implement or do something in a financial market. I want to be seen as the bridge that bridges the AI brain to the real markets. So that's where we are.

Hiten: That's quite the image. It does nothing to stem my constant flipping between being exhilarated and scared when anyone asks that question. I was having my own moment, reflecting on what we can learn from these large language models and how they work and add value, and just looking at the role of, like, a consultant, an advisor. And I was like, I'm just a function of all the meetings that I do and the people that I see, and a bit of logic as to what the patterns are and which bits resonate.

And you spit some stuff out, and then I was like, oh, but I'm missing all of that knowledge around written content and books. I probably need to feed a bit more into that, suddenly thinking of yourself a little bit like a large language model. Okay, have I got the right number of sources? What can I compute?

What's my uniqueness? Right? And for many advisors in our profession, whether you're a consultant, banker or lawyer, I think the uniqueness is a lot of the bilateral interactions you benefit from, right? The situations you see. But it's weird that you even start thinking of yourself as a bit of a large language model. So, you know, how do you keep your edge in that set-up?

Bin: It's fascinating. Maybe it's time for all of us to keep a good record of our personal data, like all your thoughts, diaries and meeting notes, and then use it to fine-tune a large language model version of you, Hiten, who does 75% of your job so you can have more time with family and go on holiday in the Bahamas.

Hiten: But jokes aside, right, it is interesting. We spend a lot of time with financial data companies, and one of the big things is what's commoditized data versus what's proprietary data. A lot of these companies are incredibly successful and valuable because of the proprietary data they have, or because they mix that with the commoditized data that's out there and readily available; data gets commoditized super quickly.

And you raise a good point: on a personal level, have we had the right level of discipline? It seems like we've allowed everything to get commoditized and given away. What are those golden pieces that you do want to retain and keep? But anyway, let's park that there. I guess I wanted to pivot a little bit.

You were getting there on the personal side. One of the things I always like to ask is to shed a little bit more light on your non-work life. Talk a little bit about an interest or a hobby or something that you're passionate about, and how that's helped fuel what you do day to day in the professional world.

Bin: Right. Thank you, Hiten. Growing up, I never actually thought of finance as a dream job. My dream job growing up was to be a philosopher, until my uncle told me it's not a real job. But the funny thing is, if I think about the nature of the work done by humans as we adopt large language models, I realize something very interesting, which is that up to now a big part of our job has been to be analytical, to be diagnostic and to solve problems.

Right? That's our job; our job is problem solving. So what happens when large language models can solve problems and give answers faster, better and cheaper than we do? I feel like the nature of the job is changing from giving answers to asking good questions. Because if today I ask GPT-4 a good question,

I get a good answer. If I ask a better question, I get a better answer. Does this mean that the nature of the work is less about answers and more about questions? Does it mean that we can all go back and take a page from the philosophy book, and that maybe what the ancient Greeks did in Athens had some merit?

You know, they were not watching TikTok and Instagram; they were actually debating ideas and asking the right questions. Maybe that's one of those skills that we'll come to appreciate more.

Hiten: Bring back the philosopher.

Bin: Reinvent the look of logic. So in my spare time I do read a lot and do some sport, cycling, to decompress. But in my position I realize that in this environment, where everything is changing so fast, my job has changed. My job now is to make money, save money and raise money, all the time, all the time.

So that's the nature of my job. So that's something constantly on my mind.

Hiten: But doing that with a philosophical approach, I could see that. If you take your theory and apply it, I can see those being top-ten baby names in the next generation, bringing back Socrates and Plato and the rest of them. But when I think about what you've just raised, I've been wondering whether we should actually bring back the polymath, given the way the education system narrows and funnels you.

I was a mathematician, I studied maths at university. I probably don't use it that much anymore, because you're not at the cutting edge anymore, and I'm realizing that with some of these machine models, once you get to quite a narrow, specific field, that's probably where they may be most advantageous. And what's left to do, as you say, is join the dots and ask the questions.

But yeah, there's probably a lot for us to wax lyrical about as we reflect on some of this going forward. The final point I always like to cover with guests is passing on and shining the spotlight elsewhere. I'd love to invite you to mention an individual or a company,
someone who is doing something pretty exciting that's captured your attention, that you think the community has not yet fully caught onto, and share your views there.

Bin: Yes. Oh, great. A very good friend of mine, his name is Kiet Tran. He used to run IHS Markit in Asia, a very senior guy. Now that IHS Markit has merged with S&P, he is focusing on launching his next start-up. I can't give the name of the start-up yet because it is in stealth mode, but it's coming out in July, so I want to give a shout-out to Kiet. A lovely, incredible guy.

And his start-up is going to focus on how data licensing models should work in a world where AI will be the licensee. He envisages a world where AI models will use data to train and to analyze, so how does an AI automatically license data on demand?

How do you manage the digital rights? How do you unbundle datasets so that an AI can pick and choose what it needs? It's a very interesting idea and business model, and I can't wait to see where his company is going to go. So a big shout-out to Kiet, and good luck.

Hiten: Awesome, thank you Bin. Very timely, I think; hearing you describe that, there is definitely going to be a big wave of interest there. Well, look, I think we're up against time. Bin, first of all, thank you very, very much for taking the time out with us. I really enjoyed that conversation, from democratised access to quant trading, to the rise of retail investing, and then philosophising around quite how far some of this generative AI may go, exciting and scaring us in equal measure. Thank you for taking the time out and coming on the show.

Bin: Hiten, thank you so much. Pleasure.

 This transcript has been edited for clarity. This episode was recorded in May 2023. 

    In the latest episode of Innovators' Exchange, Hiten Patel speaks with Bin Ren, the CEO and co-founder of SigTech, a leading provider of quant technologies. The discussion centers around democratizing access to quant trading strategies, the surge of retail investing, and the profound implications of generative AI (Gen AI) in the financial markets.

    Bin unveils the story behind SigTech's mission to accelerate the idea-to-market process in capital markets. Their focus on reducing the lifecycle of ideas from months to seconds empowers traders and portfolio managers, democratizing the landscape for both professionals and retail traders. The company's cutting-edge technology, featuring Gen AI, enables quick generation, testing, and deployment of trading ideas, contributing to a more efficient and accessible market.

    Key themes explored in this episode include:

    • Democratizing access to quant trading, enhancing market efficiency, and the strategic infrastructure and technology rebuild undertaken by SigTech. The narrative unfolds further, touching on the challenges faced in serving diverse clients, the implications of democratized access, and the evolution of humans augmenting AI capabilities.
    • The episode also delves into SigTech's voice-based applications, shaping the future of market discussions. Bin shares insights into the changing nature of work with the rise of large language models and envisions a future where humans augment AI capabilities. The conversation explores challenges faced by SigTech in offering technology to various clients and how they strategically provide API services, allowing clients to tailor features to their needs.
    • Bin extends a shout-out to Kiet Tran’s Stealth-Startup, focused on data licensing models for AI. The startup aims to revolutionize data accessibility in a world where AI models are the licensee.

    This episode is part of Innovators’ Exchange, a series that explores the financial infrastructure and technology landscape. Tune in for a captivating exploration of AI's transformative potential in financial markets, touching on key themes and opportunities for both professionals and retail investors.

    Subscribe for more on: Apple Podcasts | Spotify | Google

    Hiten Patel: Hi. Thank you for joining us today. It's Hiten Patel from Oliver Wyman, Global Lead of our Financial Infrastructure Technology and services platform. Thank you for listening and I'm delighted to welcome today Bin Ren. And I'm going to let Bin Ren, introduce himself, and elaborate a little on background if you can. Bin, tell everyone a bit more about how you've got to where you've got to today.

    Bin Ren: Hey Hiten, such a pleasure to be here. So, hi, everybody. I'm Bin, the founder and CEO of SigTech. So SigTech is a technology company that specialises in capital markets. So before I talk about my company and what we do, you know, let me quickly introduce myself and how I got here. So I was born in China, I'm Chinese. I did my undergrad in electrical engineering and computer science in China and then went to Cambridge University in England to do my Ph.D. So I had the privilege of working on a project called Xen, which was a virtual machine monitor at that time, this was from 2003 to 2007. So the cool thing about Xen was it was the one piece of core technology that ended up in the Amazon, AWS, you know, cloud service.

    So the entire of AWS was actually built on what we wrote. So Xen was at the time 100 times faster than any competing software out there, and it was opensource, totally free. So that's one very cool thing I did early on. And that got me into large scale computing and introduced me to finance.

    Hiten: So I wanted to set that scene because we're going to cover some pretty interesting topics today. We will obviously take a bit of time for the company, but given all the hype out there at the moment on Generative AI and what's going on, it's always nice to have a bit of a practitioner, or I could be underselling you as a practitioner, in that space and I wanted to make sure we reflected on that.

    But let's kick off a little bit on SigTech. Just talk to me a little bit more about what it does, what is the problem it is solving?

    Bin: Yeah so SigTech, we specialise in capital markets and really we focus on one thing, which is to speed up the idea to market, right? If you think about the lifecycle of ideas for a typical trader or portfolio manager, their day to day is all about ideas, right? Idea generation implements the idea, it is tested and if it's good, deploy the idea, deploy capital behind the idea and generate P&Ls. So a lifecycle of ideas and it's becoming increasingly important to really make the lifecycle as short as possible, because opportunities come and go.

    So we have developed the technologies over the last ten years to really help to bring ideas to market in seconds instead of instead of months. Right? It used to take months. Now it can take, actually with generative A.I., it now takes seconds. So that's what we do. And we believe that by helping everybody, not just professionals, but actually, enthusiasts, even retail traders, to take the idea and let them see the results in seconds.

    We allow more information to be priced into the market more quickly and we will allow more participants in the market. Therefore, we are playing a role in making the market more efficient.

    Hiten: So that's awesome. I think when you've described this to me in the past, that the picture I've had in my head was you like democratising the access to quant trading. I don't know whether that's a fair reflection, but it'd be great if you could talk a little bit around where you started to hone this capability versus the clients and the customers who are benefiting and using what you're bringing to market today.

    Bin: Yeah, that's a great question Hiten. And so where we started was actually from the very top end of the market right? Because, you know, SigTech was, you know, the Sig, the S-I-G part of the name actually stands for Systematic Investment Group. It was the name of my group at Brevan Howard, one of the world's top hedge funds.

    So I was the Chief Investment Officer of SIG at Brevan, and we were running quant funds using the technologies we built in-house. And we made a decision in 2019 to focus on monetising infrastructure and technology rebuild because we really believed that we had the best, best product at the time and probably still do and therefore we want to focus on where our edges are, right?

    And so that's where we come from. So we come from a very institutional, you know, battle tested professional grade kind of technology. But since then, in the last four years, we essentially had pulled a lot of resources to make it more accessible to more and more people, a wider and wider audience. So when we started it required really, really the most sophisticated users, right?

    People who understand market, understand Python, understand the programming, understand back testing, understand, you know overfitting and underfitting, all sorts of things. It's very, very hard core, right? But today Hiten, I'm delighted to tell you today, with the help of a large language model, what we have enabled is that anyone with any idea of the market can say it in English and we can generate the Python code using our AI models on the fly, in real time, and show you the results.

    Actually, one of the applications they're working on is actually not just text or text to code to get results. Actually it's voice. So in a few months when you and I have a Zoom call and talk about financial market, when you make a comment, about the how the yield curve has changed because of the CPI print or I make, some comments regarding how the stock market has behaved because of enthusiasm for AI companies, our voice is automatically translated into text and text is automatically translated into Python code. And the idea we talk about is automatically implemented and run and shown as we speak, that's the future we're getting to.

    Hiten: I hope that's more effective than when Gmail shows me adverts for fake lawns after hearing my wife complain about my attempts at keeping the grass. But it's pretty, pretty powerful stuff. And I guess in my head, hearing you talk, I've got this image, as you described, of this incredibly powerful, sophisticated tool that was used by, trained practitioners now being made available to, you know, every day person on the street. Talk to me about some of the challenges around, cascading that that down that you've kind of wrestled with in recent times.

    Bin: Yes. I think when we started we were a B2B enterprise sales focused fintech company. And we've come to learn that the main challenges we have faced, actually, because I can summarise in one sentence now, it is a hard learned lesson. We have many things that everybody wants, but not many people want everything.

    Okay. So it's, it's one of those very awkward situations where we go to enterprise client and say we have this amazing end to end solution or platform. Take a look. They inevitably love a lot of it, but there's always something they want that we don't have because in this new market, I think people naturally expect any off the shelf product to meet their exact preparatory requirements.

    It's a bit unfair, but that's that's how people react to any new product. Right so, we understood the implication of this. You know, we made a very strategic decision which is to unbundle all the functionalities inside our fully integrated platform and then offer these hundreds of functionalities through API services.

    So we have all these API services running and then then we can go to anyone and any prospect and say, you know what, pick and choose the bits that you want and integrate them into your applications, into your workflows instead of being trapped in some sense inside our own platform. So that has been working out really well.

    Hiten: And you use the word implications there, just picking up on that. I don't know if it's too early to tell, but in your mind, what are some of the implications of how financial markets behave or opportunities arise from having democratised access to some of these quant trading strategies, more people being able to test and deploy that?

    What do you think happens in terms of how markets behave, the ability to generate alpha, have you thought much about that? It would be interesting to get a sense as to how this plays through.

    Bin: Yeah, I think, you know, if we go back to the, the four stages of the lifecycle of idea right, idea generation, implementation, testing and deployment. So I think for the participants from the retail side of the world I think it has huge implications. For example on the idea generation, the truth is the number of ideas is probably first and foremost a function of the number of people trying to think how come up with ideas right. And frankly the retail people, maybe the quality of the ideas is a bit lower than professionals, but there are hundreds of thousands of them very active and in different forums on Reddit.

    And they, they have plenty of ideas, right? There are plenty of ideas. And the way and once they have some idea, some ideas inevitably go viral and gain followers. And then that's why what we what we witnessed in during the pandemic, you know retail traders and especially, you know, meme stocks became a phenomenon.

    It was not just some sort of a joke or some kind of, you know, minor thing. It actually had implications because it actually severely affected the performance of some of the world's largest hedge funds. Not only some of them got hurt pretty badly, but it also made the rest of them realise that without modelling retail investors inside their portfolios and taking into account the retail sentiment and the rich retail idea is going viral, they have a serious problem in terms of managing their risk and sizing their trades.

    So I think the boundary between professional and retail is getting blurred, especially in today's world where, you know, the most popular application, some of the largest companies actually are all consumer companies.

    Hiten: Yeah I think it's fascinating. I'm always I wouldn't say frustrated but there was there's always this view that hey, there's the rise of passive and the retail investor should remain passive or you know, try and stop people pay fees away. Feels like you're introducing this third way by for those who want to get themselves up the curve, who have an interest in markets, an idea of where Alpha could be generated, you know, here's some tools to go out and help you, you know, efficiently put on some of the trades and put them to work, which historically they probably wouldn’t have had access to.

    So what kind of training do they need? Like do you provide that? Is that something they find out in the market? Like how do they figure out whether they can pick your piece of kit up and know how to use it or put it to work?

    Bin: Yeah. So we realized that, you know, the idea generation bit stage of the lifecycle then the retail investors, they are doing it already right. They are discussing it on social media, they are already trying to come up with ideas. But it's the rest of the lifecycle, it's about how easy is it to turn those ideas into code, then test it and see the results.

    What happens now is that it is extremely difficult. So you have all these retail enthusiasts talking about ideas, but actually they barely test it. They don't test it because they can't implement it. Therefore they don't even know how good or bad the idea is. So we want to help them to actually learn about markets and actually test the idea they come up with by making them simple, like super simple.

    Hiten: To spell it out for me. What's an example of retail? What is the quant strategy that a retail investor may be able to set to knock up?

    Bin: Great, for example, I think one thing could be what is it like to buy Tesla stocks one day before the quarterly earnings announcement and sell it two days later? Because if you look at the past, maybe for eight or 12 quarters, every single quarter's announcement that Tesla has sort of beat expectations and rallied. Right. So it's very natural for a retail investor who loves Tesla and Elon Musk to say what’s it like just to capitalize the jump during the earnings release.

    Right. So a simple idea like that, buy before earnings sell it to two days later. You know, it's a simple idea, but how does the result look like? Now with the help of large language models, we can literally turn the sentence into Python code call in stick tag APIs, generate a back test, show you the performance in seconds.

    You can see this result being embedded inside the conversation itself. Someone says it on the forum, this idea and then it triggers a bot. And they pick it up, generate the results and then I could return the result, and embed it into parts of the conversation. We can see that happening.

    Hiten: And do you guys that also implement from there on or is it then you know you let the investor can then go find their way on whichever platforms to execute the trade?

    Bin: I think what's going to happen then is people can pick up these results like a simple back testing and then once they log into their account as SigTech and then they will be able to set up connectivity with their retail brokers and therefore send the ultimate stack, the calculation of the strategy and send the trading orders to the brokers that way.

    So that's how we see, you know, by leveraging the models and analytics and engine we built, frankly for professionals and put a large language model in front of it as a, you know, user interface. We are now able to enable multiple use cases, which was simply not possible before.

    Hiten: Amazing, amazing. So, we've kind of gone there already on the large language models and generative AI. But if I could just zoom out a little bit and think about it, we're relatively early on in that journey. A lot of people scrambling around figuring out what are some of the opportunities, what are some of the threats, more broadly across the financial markets’ ecosystem, where else do you see changes?
    Which bits most excite you? Which bits scare you?

    Bin: Yeah, I think what's amazing about say GPT is that it has been around for a few years or several years actually. Right. And frankly, no one really paid much attention to it until when Chat GPT came out, which is quite interesting because GPT was actually available through API services. You can actually build, you know, applications calling those large language models last year, I mean, even the year before. But it didn't become a global phenomenon until the open AI team built a retail focused application on top of it, which is fascinating. You know, it became the most popular retail application ever launched in human history. And now, you know, within record time they had over 100 million users. Right. It's fascinating because it's like the most advanced AI Ttechnology. Nobody paid attention. Nobody was talking about it.

    Hiten: You and your friends were though. There was a cult. There was a group, right?

    Bin: I said a group. We were experimenting. But, you know, it was sort of like a secret, you know, kind of a nerdy, nerdy obsession more than anything else. But once they launched a retail app on top of it, which is chat, chat GPT it really took off. So, you know, in terms of like the implications of this is, I think the GPT, the large language models. I mean there are actually multiple sources still fascinating there are multiple aspects.

    One is it's improving exponentially. Right. Which is, you know, the last time we had experience with exponential improvement was the Moore's Law. Right, the number of, computing power doubles every 18 months, but it kind of has plateaued. You know, it kind of hit you know, it hit a ceiling in the last couple of years. But the exponential rate of improvement in AI is much faster than that. So something like it doubles every six months. Right, just kind of mind boggling. That’s number one.

    And number two is that, you know, people now realize that, you know, the GPT can pass, you know professional exams in almost two dozen fields, in the top 10%. Right, the GPT4 is a pretty decent lawyer, pretty decent accountant and pretty decent tax advisor you know like so it's actually quite scary. So the way I think about it is people say but I think today people naturally think it is a tool, just a very clever tool and it's getting even cleverer.

    But I tend to, so people are thinking about using A.I. to augment humans right? We use this tool to augment our work, but I think a bit differently. I think actually that's not radical enough. I think the more radical view of what's going to happen is actually humans are going to become the tools that augment AIs. Humans are going to be the tools augmenting AIs. So it's the other way around. So let me give you this idea. This is already happening. Now. If you look at the large social media companies like Tik Tok and Facebook, you know, they have this giant machine learning algorithm. They've trained at running multiple data centres that does moderation, right? It does moderation. So that piece of content created uploaded some A.I. says, Oh, the content is good or bad, but it cannot moderate effectively every single piece of the content. So they have still have to employ, for example, TikTok employ 20,000 people, humans, to do full time moderation in Ireland.

    Right. Let me ask you, Tik Tok moderation is done by a giant machine learning slash A.I. algo running data centres using gigawatts of energy and then 20 humans dealing with the piece of content that the AI cannot deal with? Who is augmenting who? Is AI augmenting humans? Or humans augmenting AI's? So I think the future is humans augmenting A.I.

    So that's how I am positioning SigTech as a company. I want to tag the products and services that we offer and we build today to be the tools used by all these large language models whenever they want to try or implement or do something in a financial market. I want to be seen as the bridge that bridges the A.I. brain to the real markets. So that's where we are.

    Hiten: That's quite the image. If does nothing to stem my constant flipping to being, you know, exhilarated/scared, when anyone asks that question. I was having my own moment, reflecting on, you know, what can we learn from these large language models and how they work and add value and just looking at the role as like a consultant, an advisor, and I was like, I'm just a function of all the meetings that I do and the people that I see and a bit of logic as to what are some of the patterns and what are the bits that resonate.

    And you spit some stuff out and then I was like, oh, with that I'm missing all of that. Like knowledge around written content and books. I probably need to feed a bit more into that as well as suddenly thinking a little bit like of yourself as a large language model. Okay, I got the right number of sources. What can I compute?

    What's my uniqueness? Right? And for many advisors in our profession, whether you're a consultant, banker, lawyer, I think the uniqueness is a lot of the bilateral interactions you benefit from, right? In situations you see. But it's weird that even you're thinking of yourself as a bit of a large language model. Just, you know, how do you keep your edge in that set up?

    Bin: It's fascinating. Maybe, you know, maybe time to, if all of us to keep a good record of our personal, personal data like all your thoughts, diaries, meeting notes, and then you use it to fine tune a large language model version of you Hiten, who does 75% of your job so you can you know have more time with family and all go on holidays in the Bahamas.

    Hiten: But jokes aside. Right. It is interesting. We spend a lot of time in financial data companies. And one of the big things is what's commoditized data versus what's proprietary data. And a lot of these companies are incredibly successful and valuable from the proprietary data they have or mixing that with the commoditized data and data that's out there and readily available commoditized, super quickly.

    And you raise a good point that right on a personal level, have we had that right level of discipline, which seems like we've allowed everything to get commoditized and be given away and what is that? What are those golden pieces that you do want to retain and keep? But anyway, let's park that there. I guess I wanted to pivot a little bit.

    You were getting there on the personal side. One of the things I always like to ask is shed a little bit more light into your non-work life. Like talking a little bit about interest or a hobby or something that you're passionate about and how that's helped fuel what you do day to day in the professional world.

    Bin: Right. Thank you Hiten. I never actually thought growing up, I never thought of finance as a job, as a dream job. You know, my dream job growing up actually was to be a philosopher until my uncle told me, it's not a real job. But the funny thing is, I think if I think about the nature of the work done by humans, as we, you know, adopt large language models, I realized something is very interesting, which is, you know, up to now a big part of our, job is to be analytical, to be diagnostic and to solve problems.

    Right? That's our job, our job is problem solving. So what happens when large language models can solve problems and give answers faster and better and cheaper than we do. So I feel like the nature of the job is changing from giving answers to asking good questions. Because if today I ask a good question, I ask the GPT4 a good question.

    I get a good answer. If I ask a better question, I get the better answer. Does this mean that the nature of the work is less about answers but more about questions? Does this mean that now we can all go back and take a page from the philosophy book, and then maybe what The ancient Greeks did in Athens had some merit.

    You know, they were not, you know, watching TikTok and Instagram, but actually debating ideas and asking the right question. Maybe that's one of those skills that we come to appreciate more.

    Hiten: Bring back the philosopher.

    Bin: Reinvent the look of logic. So in my spare time I do read a lot and do some sports, cycling to decompress. But in my position I realize that in this environment where everything is changing so fast, my job has changed. My job now is make money, save money, raise money all the time, all the time.

    So that's the nature of my job. So that's something constantly on my mind.

    Hiten: But doing that in a philosophical approach, I could see that being if you take your theory and apply it, I can see that being a top ten baby names in the next generation, bringing back Socrates and Plato and the rest of them, but when I think about what you've just raised, I've been thinking about actually should people become more, you know, bring back the polymath, just the way the education system narrows and funnels you like.

    I was a mathematician, I studied maths at university. Probably don't use it that much anymore because you're not at the cutting edge anymore and actually realizing some of these machine models, once you get to quite a narrow specificity of field, that's probably where they may be most advantageous. And what's left to do, as you say, is join the dots and ask the questions.

    But yeah, there's probably a lot for us to wax lyrical as we reflect on some of this going forward. My final point I always like to cover well with guests is, you know, passing on and shining the spotlight elsewhere. But I'd love to invite you to kind of mention an individual or a company.
    Someone who is doing something pretty exciting that's captured your attention that you think the community is not yet fully caught onto yet and share your views there.

    Bin: Yes. Oh, great. A very good friend of mine, his name is Kiet Tran. He used to run the IHS Markit in Asia, very senior guy. And now, that IHS Markit has been merged with S&P and he is actually now focusing on launching his next start-up and I do know the name of the start-up yet because it is in stealth mode but it's coming out in July so I want to give a shout out to Kiet. Lovely, great incredible guy.

    And his start-up is going to focus on how the data licensing models should work in a world where AI will be the licensee. You know how he envisages a world where A.I. models in order to, will use data to train and analyze data. So how does A.I. automatically license data on demand?

    How do you manage the digital rights? How do you unbundle data datasets so that AI can get you to pick and choose what you need? Right. It's a very interesting idea and business model. And I can't wait to see where his company is going to head. So big shout out to Kiet and good luck.

    Hiten: Awesome Thank you Bin. Yeah very timely I think, hearing you describe that is definitely going to be a big wave of interest there. Well, look, I think we're against time. Bin, first of all, thank you very, very much for taking the time out with us and really enjoyed that conversation from democratised access to quant trading, the rise of retail investing, and then philosophising around quite how far some of this generative AI may go, exciting and scaring us in equal measures. Thank you for taking the time out and coming on the show.

    Bin: Hiten, Thank you so much. Pleasure.

     This transcript has been edited for clarity. This episode was recorded in May 2023. 

