Reasoning Patterns in LLM: CoT & ReAct
A direct prompt works well with an LLM for a simple task, but when a task needs multiple steps to arrive at a solution, a direct prompt is likely to fail because of LLM hallucinations and assumptions.
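As a minimal sketch of the difference (the `call_llm` call is hypothetical, standing in for any chat-completion API), chain-of-thought prompting only changes the instruction: instead of asking for the answer directly, we ask the model to reason step by step first:

```python
# Sketch: direct prompt vs. chain-of-thought (CoT) prompt.
# Sending these to a model would use a hypothetical `call_llm(prompt)` helper;
# here we only build the prompt strings.

def build_direct_prompt(question: str) -> str:
    # Direct prompt: asks for the answer only; multi-step tasks often fail here
    # because the model skips intermediate reasoning.
    return f"Question: {question}\nAnswer:"

def build_cot_prompt(question: str) -> str:
    # CoT prompt: instructs the model to reason step by step before answering,
    # which tends to reduce hallucinated shortcuts on multi-step tasks.
    return (
        f"Question: {question}\n"
        "Let's think step by step. Show your reasoning, then give the final "
        "answer on a line starting with 'Answer:'."
    )

question = "Pens cost $2 each, with a 10% discount on orders over 20 units. What do 25 pens cost?"
print(build_direct_prompt(question))
print(build_cot_prompt(question))
```

The CoT variant trades extra output tokens for intermediate reasoning the model can check itself against.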
Series
A technical series on building AI-powered systems with production-ready strategies.
The moment we interact with LLMs, we get probabilistic output: they'll return "price": "$45.99" one time and "price": 45.99 the next. Sometimes they even forget required fields. This might not look like a problem at first...
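A small normalization layer makes this concrete. The sketch below (field name "price" follows the example above; everything else is an assumption) coerces both shapes of the model's output into one type and fails loudly when a required field is missing:

```python
# Sketch: normalizing an LLM's inconsistent JSON output before it reaches
# downstream code. The "price" field name follows the example in the text.
import json
import re

def normalize_price(value) -> float:
    # Accept both 45.99 (a number) and "$45.99" (a string with a currency symbol).
    if isinstance(value, (int, float)):
        return float(value)
    if isinstance(value, str):
        match = re.search(r"\d+(?:\.\d+)?", value)
        if match:
            return float(match.group())
    raise ValueError(f"unparseable price: {value!r}")

def parse_llm_item(raw: str) -> dict:
    item = json.loads(raw)
    if "price" not in item:
        # Models sometimes drop required fields; fail here, not downstream.
        raise KeyError("missing required field: price")
    item["price"] = normalize_price(item["price"])
    return item

print(parse_llm_item('{"price": "$45.99"}'))  # {'price': 45.99}
print(parse_llm_item('{"price": 45.99}'))     # {'price': 45.99}
```

In practice a schema validator (e.g. a Pydantic model) plays this role, but the idea is the same: never trust the model's JSON shape directly.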
Observability is an essential topic in modern software development, and it becomes particularly important once you move an application to production. Observability in AI applications goes further than in traditional applications because you need to observe t...
Overview: In development, the first way to interact with any LLM is through its API. This is fairly basic and easy to master, but you can miss some critical aspects, which I cover in this post. Part 1: Understand Tokens Before Getting Excit...