
Arize

Arize is a tool built on OpenTelemetry and OpenInference for monitoring and optimizing LLM applications.

To enable Arize tracing, set the required Arize environment variables in your SkillFlaw deployment. Arize then automatically begins monitoring and collecting telemetry data from your LLM applications.

tip

Instructions for integrating SkillFlaw and Arize are also available in the Arize documentation:

  • SkillFlaw tracing with Arize Platform
  • SkillFlaw tracing with Arize Phoenix

Prerequisites

  • An Arize account with access to your Space ID and API Key
  • A running SkillFlaw application

Connect Arize to SkillFlaw

  1. In your Arize dashboard, copy your Space ID and API Key (Ingestion Service Account Key).

  2. In the root of your SkillFlaw application, edit your existing SkillFlaw .env file or create a new one.

  3. Add ARIZE_SPACE_ID and ARIZE_API_KEY environment variables:


    ARIZE_SPACE_ID=SPACE_ID
    ARIZE_API_KEY=API_KEY

    Replace SPACE_ID and API_KEY with the values you copied from the Arize platform.

    You don't need to specify the Arize project name if you're using the standard Arize platform.

  4. Start your SkillFlaw application with your .env file:


    uv run skillflaw run --env-file .env
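Before starting SkillFlaw, you can sanity-check that the Arize credentials are actually present in the environment. The snippet below is an illustrative sketch (the `missing_arize_vars` helper is not part of SkillFlaw or Arize); it only checks the two variable names set in the `.env` file above.

```python
import os

# Variable names match the .env entries added in step 3.
REQUIRED_VARS = ("ARIZE_SPACE_ID", "ARIZE_API_KEY")

def missing_arize_vars(env=os.environ):
    """Return the names of required Arize variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_arize_vars()
    if missing:
        print("Missing Arize settings:", ", ".join(missing))
    else:
        print("Arize credentials found; tracing should be enabled.")
```

If either variable is missing, Arize silently receives no telemetry, so a quick check like this can save a debugging session.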

Run a flow and view metrics in Arize

  1. In SkillFlaw, run a flow that has an LLM-driven component, such as an Agent component or any language model component. You must chat with the flow or trigger the LLM to produce traffic for Arize to trace.

    For example, you can create a flow with the Simple Agent template, add your OpenAI API key to the Agent component, and then click Playground to chat with the flow and generate traffic.

  2. In Arize, open your project dashboard, and then wait for Arize to process the data. This can take a few minutes.

  3. To view metrics for your flows, go to the LLM Tracing tab.

    Each SkillFlaw execution generates two traces in Arize:

    • The AgentExecutor trace records LangChain's AgentExecutor.
    • The UUID trace records the SkillFlaw components.
  4. To view traces, go to the Traces tab.

    A trace is the complete journey of a request, made up of multiple spans.

  5. To view spans, go to the Spans tab.

    A span is a single operation within a trace. For example, a span could be a single API call to OpenAI or a single function call to a custom tool.

    For information about tracing metrics in Arize, see the Arize LLM tracing documentation.

  6. To add a span to a dataset, click Add to Dataset.

    All metrics on the LLM Tracing tab can be added to datasets.

  7. To view a dataset, click the Datasets tab, and then select your dataset.
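The trace-and-span relationship described in steps 4 and 5 can be sketched as a small data model. This is an illustrative simplification, not Arize's internal schema; the class and field names are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Span:
    # A single operation within a trace, e.g. one OpenAI API call
    # or one custom-tool invocation.
    name: str
    parent: Optional["Span"] = None


@dataclass
class Trace:
    # The complete journey of one request, made up of multiple spans.
    trace_id: str
    spans: List[Span] = field(default_factory=list)

    def add_span(self, name: str, parent: Optional[Span] = None) -> Span:
        span = Span(name=name, parent=parent)
        self.spans.append(span)
        return span


# Example: one flow run producing a root span with two child operations.
trace = Trace(trace_id="uuid-of-flow-run")
root = trace.add_span("AgentExecutor")
trace.add_span("openai.chat.completions", parent=root)
trace.add_span("custom_tool.lookup", parent=root)
```

Viewed this way, the Traces tab lists `Trace` objects and the Spans tab lists the individual `Span` entries, with parent links giving the tree structure you see in the Arize UI.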