01.17.2024 | By: Charles Moldow, Nico Stainfeld
The potential of generative AI to transform financial services is undeniable, especially for startups. But with so many possibilities, founders often struggle to identify and prioritize the most promising use cases for their businesses.
At our Fintech x AI PortCo Summit, Trey Holterman, co-founder and CEO of Tennr, shared a practical framework for thinking through exactly these types of challenges. Drawing on the team at Tennr’s expertise in leveraging AI to automate complex tasks, Trey’s framework empowers founders to evaluate and select high-value AI use cases.
Trey emphasized that finding high-potential AI applications requires focusing on business goals first.
It’s perfectly natural to jump directly into the AI brainstorm and enjoy the exciting rush of blue-sky ideation, but it’s essential to begin with the core problems you need to solve.
Don’t start by thinking, “How can I use AI?”
Instead think, “What are my core problems?”
Once the idea is grounded in accomplishing business objectives, the next decision in the framework is understanding if the use case is internal or external:
Internal workflows are the operations that support the business. These are processes that can be described clearly, with a defined beginning and end. Examples include: ingesting an intake form and updating your Salesforce CRM, replying to a complex query, or filling out forms.
External workflows are embedded, automated experiences between your clients and software. Examples include chatbots and recommendation engines.
With this framing, founders can begin assessing the complexity of their use cases, and whether or not it makes sense to build the function internally.
Trey suggests that founders analyze internal workflows with the following thought process. First, ask whether the process resides within a single software system, like Salesforce or Concur. If so, ask whether the need is so complex or domain-specific that the provider couldn't build the desired AI functionality natively with their own developer tools and resources. In the "yes, no" scenario (one system, and nothing the provider couldn't build), the most common scenario Trey sees at tech companies, you likely want to reach out and request the feature rather than build it from scratch:
“It’s no secret that a lot of SaaS companies are adding AI features […] It’s probably in their roadmap. They’re probably thinking about it already and it’s probably not worth your time building a feature that really should exist in their system.”
However, highly complex, domain-specific needs that go beyond a provider's capabilities, something the provider won't be adding to its roadmap anytime soon, may warrant a custom AI workflow powered by LLMs.
The same applies if the workflow spans multiple systems without clean APIs connecting them (i.e., tools like Zapier or Retool won't work here). It's one thing to work with a single provider and ask for functionality that should exist on their roadmap; it's an entirely different ball game to get multiple systems from multiple providers working together for your use case. In that case, consider solving the problem with an LLM-based workflow.
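The internal-workflow decision tree described above can be sketched as a small function. This is an illustrative rendering, not code from the talk; the question names and recommendation strings are assumptions chosen for clarity:

```python
def internal_workflow_recommendation(
    single_system: bool,
    beyond_provider_roadmap: bool,
    clean_apis_between_systems: bool = False,
) -> str:
    """Hypothetical sketch of the internal-workflow decision tree.

    single_system: does the whole process live in one tool (e.g. Salesforce)?
    beyond_provider_roadmap: is the need so complex or domain-specific that
        the provider is unlikely to ship it?
    clean_apis_between_systems: for multi-system workflows, can off-the-shelf
        glue (e.g. Zapier or Retool) connect the systems?
    """
    if single_system:
        if beyond_provider_roadmap:
            return "build a custom LLM-based workflow"
        # The common "yes, no" case: just ask the provider for the feature.
        return "request the feature from the provider"
    # The workflow spans multiple systems.
    if clean_apis_between_systems:
        return "connect the systems with existing integration tools"
    return "build a custom LLM-based workflow"

print(internal_workflow_recommendation(True, False))
# → request the feature from the provider
```

The point of writing it this way is that "build" is the fallback, reached only after cheaper options (the provider's roadmap, off-the-shelf integrations) are ruled out.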
Trey points out that a very common path through the "external" framework decision tree is "no, no" (i.e., not a chatbot, and not domain-specific), and these ideas often take the form of a data-searching assistant (e.g., defog.ai). Founders who find themselves answering "no, no" should first hunt for an existing solution: there are likely a number of new SaaS products filling this need, and founders should lean on them instead of rolling their own.
Trey also shares that he sees many founders answering “yes, yes” to this framework, stating, “you’d be amazed at how many people don’t want to admit that they could get it to work with ChatGPT and a little bit of engineering.”
The hard truth that many engineers don’t want to admit is that you probably don’t need to train your own model for your use case, and you can likely get it done with ChatGPT. With today’s capabilities, founders are better off experimenting with LLMs like ChatGPT by providing sample inputs and assessing the responses. If it can handle the business domain reasonably well, avoid the temptation to train a custom model from scratch. Trey mentions:
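One lightweight way to run the experiment Trey describes is to feed a handful of representative inputs to an off-the-shelf model and score the responses before committing to anything custom. In the sketch below, `ask_llm` is a stand-in with canned answers; in practice you would replace its body with a real chat-completion API call:

```python
def ask_llm(prompt: str) -> str:
    # Stand-in for a real LLM API call (replace with an actual client).
    canned = {
        "Classify this intake form: new patient referral": "referral",
        "Classify this intake form: insurance inquiry": "insurance",
    }
    return canned.get(prompt, "unknown")

def evaluate(samples: dict[str, str]) -> float:
    """Fraction of sample inputs the model answers acceptably."""
    hits = sum(
        1 for prompt, expected in samples.items()
        if ask_llm(prompt) == expected
    )
    return hits / len(samples)

# Representative inputs drawn from the actual business workflow.
samples = {
    "Classify this intake form: new patient referral": "referral",
    "Classify this intake form: insurance inquiry": "insurance",
}
print(f"{evaluate(samples):.0%} of samples handled acceptably")
```

If the off-the-shelf model scores well on domain samples like these, that is the signal to skip custom model training entirely.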
“And the key with all this stuff […] and one of the hardest things with dealing with these models, is to make it as simple as possible. Don’t jump to the most complicated solution.”
As AI capabilities evolve rapidly, evaluating these emerging areas for potential business use cases can be incredibly exciting, but it can also distract. Trey cautions founders to stay grounded in core business needs and to approach the build-or-buy decision thoughtfully. Keep it simple and focus on what really benefits the business: choosing whether to develop your own AI or use existing tools shouldn't be about chasing the latest trend, but about what makes sense for your startup's success.
Thanks to Trey for providing these valuable insights during our event and to my partner Joanne, who led Foundation’s early investment in Tennr. Check out his full video below for more.