In April 2023 I was honored to be able to speak at the LLMs in Production virtual conference put on by the MLOps Community. You can find the talk below:
In the talk I go through a couple of use cases at Anzen where we’ve used large language models to solve business needs. Anzen is still a very small company and we’re quite constrained on engineering resources, so I focus on how large language models allowed us to bootstrap these features with very little time and data. Just a few years ago, implementing an ML-powered feature often meant training your own model from scratch, or something close to it. Today, fine-tuning and off-the-shelf products built on LLMs let businesses ship these features almost as quickly as any other software feature.