Mirascope Integration
Mirascope (GitHub) is a Python toolkit for building LLM applications. Developing LLM applications with Mirascope feels just like writing the Python code you're already used to: it provides an intuitive interface similar to standard Python coding practices.
The Mirascope team recorded a video with a short demo of how to use Mirascope with Langfuse.
How to use Mirascope with Langfuse
Mirascope automatically applies the Langfuse observe() decorator to all relevant functions within Mirascope via its with_langfuse decorator.

from mirascope.langfuse import with_langfuse
Example
Call
This is a basic call example, but the integration works with all of Mirascope's call functions: call, stream, call_async, and stream_async.
import os
from mirascope.langfuse import with_langfuse
from mirascope.anthropic import AnthropicCall
 
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
 
 
@with_langfuse
class BookRecommender(AnthropicCall):
    prompt_template = "Please recommend some {genre} books"
 
    genre: str
 
 
recommender = BookRecommender(genre="fantasy")
response = recommender.call()  # this will automatically get logged with langfuse
print(response.content)
#> Here are some recommendations for great fantasy book series: ...

This will give you:
- A trace around the AnthropicCall.call() that captures the prompt template, input/output attributes, and more
- Human-readable display of the conversation with the agent
- Details of the response, including the number of tokens used

Extract
Mirascope's extraction functionality is built on top of Pydantic and offers a convenient extract method on extractor classes to extract structured information from LLM outputs. This method leverages tools (function calling) to reliably extract the required structured data. (Docs)
import os
from typing import Literal, Type

from pydantic import BaseModel

from mirascope.langfuse import with_langfuse
from mirascope.openai import OpenAIExtractor
 
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
 
class TaskDetails(BaseModel):
    description: str
    due_date: str
    priority: Literal["low", "normal", "high"]
 
 
@with_langfuse
class TaskExtractor(OpenAIExtractor[TaskDetails]):
    extract_schema: Type[TaskDetails] = TaskDetails
    prompt_template = """
    Extract the task details from the following task:
    {task}
    """
 
    task: str
 
 
task = "Submit quarterly report by next Friday. Task is high priority."
task_details = TaskExtractor(
    task=task
).extract()  # this will be logged automatically with langfuse
assert isinstance(task_details, TaskDetails)
print(task_details)
# > description='Submit quarterly report' due_date='next Friday' priority='high'

This will give you the same view as you would get from using the OpenAI integration. We will be adding more extraction support for other providers soon.
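Because extract_schema is a plain Pydantic model, the returned object enforces the declared types, which is why the isinstance assertion above always holds. A minimal standalone sketch of that validation, reusing the TaskDetails model from the example (no LLM call involved):

```python
from typing import Literal

from pydantic import BaseModel, ValidationError


class TaskDetails(BaseModel):
    description: str
    due_date: str
    priority: Literal["low", "normal", "high"]


# A well-formed extraction result validates cleanly.
details = TaskDetails(
    description="Submit quarterly report",
    due_date="next Friday",
    priority="high",
)
print(details.priority)  # high

# A value outside the Literal is rejected; this schema enforcement
# is what makes tool-based extraction reliable.
try:
    TaskDetails(description="x", due_date="y", priority="urgent")
except ValidationError:
    print("rejected")
```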
