tracy

Project Url: JetBrains/tracy
Introduction: Tracy: AI Tracing Library for Kotlin and Java

Tracy helps you trace, monitor, and evaluate AI-powered features directly from your Kotlin or Java projects.

It provides a unified API for capturing structured traces and is fully compatible with observability platforms such as Langfuse and Weights & Biases (W&B).

[!NOTE] This project uses Tracy Official YouTrack Project for issue tracking. Please file bug reports and feature requests there. Additional details are available in the Contributing Guidelines.

Standards:
This library implements the OpenTelemetry Generative AI Semantic Conventions for span attributes and event naming, ensuring your traces remain compatible with any OpenTelemetry-compliant backend.

You can use it to:

  • Trace AI client calls automatically or via annotations.
  • Monitor execution details such as duration, inputs, and outputs.
  • Evaluate AI-powered features through backends like Langfuse and W&B.

Installation

Select the build system that matches your setup:

Gradle (Kotlin DSL)

  1. Add dependencies to the build.gradle.kts file:

    plugins {
        id("org.jetbrains.ai.tracy") version "0.1.0"
    }
    
    dependencies {
        // Core Module with Shared Functionalities
        implementation("org.jetbrains.ai.tracy:tracy-core:0.1.0")
        // Client-specific Auto Tracing (select the one you need)
        implementation("org.jetbrains.ai.tracy:tracy-anthropic:0.1.0")
        implementation("org.jetbrains.ai.tracy:tracy-gemini:0.1.0")
        implementation("org.jetbrains.ai.tracy:tracy-ktor:0.1.0")
        implementation("org.jetbrains.ai.tracy:tracy-openai:0.1.0")
    }
    
  2. Make sure that you have mavenCentral() in both your plugin management and project repositories sections.

    build.gradle.kts

    repositories {
        mavenCentral()
    }
    

    settings.gradle.kts

    pluginManagement {
        repositories {
            mavenCentral()
        }
    }
    
Gradle (Groovy)

  1. Add dependencies to the build.gradle file:

    plugins {
        id 'org.jetbrains.ai.tracy' version '0.1.0'
    }
    
    dependencies {
        implementation 'org.jetbrains.ai.tracy:tracy-core:0.1.0'
        // Client-specific Auto Tracing
        implementation 'org.jetbrains.ai.tracy:tracy-anthropic:0.1.0'
        implementation 'org.jetbrains.ai.tracy:tracy-gemini:0.1.0'
        implementation 'org.jetbrains.ai.tracy:tracy-ktor:0.1.0'
        implementation 'org.jetbrains.ai.tracy:tracy-openai:0.1.0'
    }
    
  2. Make sure that you have mavenCentral() in both your plugin management and project repositories sections.

    build.gradle

    repositories {
        mavenCentral()
    }
    

    settings.gradle

    pluginManagement {
        repositories {
            mavenCentral()
        }
    }
    
Maven

Add dependencies to the pom.xml file:


<project>
    <!-- ... other project config (groupId, artifactId, version, etc.) ... -->

    <build>
        <plugins>
            <plugin>
                <groupId>org.jetbrains.kotlin</groupId>
                <artifactId>kotlin-maven-plugin</artifactId>
                <version>${kotlin.version}</version>

                <configuration>
                    <jvmTarget>19</jvmTarget>
                </configuration>

                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>

                <dependencies>
                    <dependency>
                        <groupId>org.jetbrains.ai.tracy</groupId>

                        <!--Use the Kotlin version matching your setup (e.g., 2.0.0 or 2.0.20).
                        Keep the same major and minor numbers but adjust the patch version
                        to the nearest supported value - either 0 or 20. Examples:
                        - 2.1.19 -> 2.1.0
                        - 2.1.23 -> 2.1.20 -->
                        <artifactId>tracy-compiler-plugin-2.1.0-jvm</artifactId>

                        <version>0.1.0</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-core-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>

        <!-- Client-specific Auto Tracing -->

        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-anthropic-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>

        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-gemini-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>

        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-ktor-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>

        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-openai-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
    </dependencies>

</project>

Requirements

Requires Kotlin 2.0.0 or later and Java 17 or later.

If you already have OpenTelemetry set up in your project, versions 1.2 and later are supported.

Supports integration with:

  • OpenAI SDK 1.*–4.*
  • Anthropic SDK 1.*–2.*
  • Gemini SDK 1.8.*–1.38.* (earlier versions unsupported)

Usage

[!TIP] See the examples directory for complete and runnable examples!

Tracy provides tracing integrations for multiple clients—not only OpenAI, but also Gemini, Anthropic, Ktor, and OkHttp. All clients can be instrumented in a similar way using the instrument(...) function.

Client Auto Tracing

Tracing Sensitive Content

According to the OTEL semantic conventions, capturing sensitive data (e.g., user messages or assistant replies) should be disabled by default (read more here). Therefore, Tracy redacts sensitive content by default, replacing it with the placeholder "REDACTED" while preserving the trace structure and recorded attributes.

You may enable tracing of input and output independently in either of the following ways:

  1. Set the environment variables:
TRACY_CAPTURE_INPUT=true|false
TRACY_CAPTURE_OUTPUT=true|false
  2. Override the defaults programmatically in code:
// Call a helper method:
TracingManager.traceSensitiveContent()

// Or equivalently:
TracingManager.withCapturingPolicy(
    ContentCapturePolicy(
        captureInputs = true,
        captureOutputs = true,
    )
)

Tracing Examples

Below is a minimal OpenAI example. For others, check the examples directory:

// Enable tracing via the `IS_TRACY_ENABLED` environment variable
// or programmatically, as shown below:
TracingManager.isTracingEnabled = true
// Initialize tracing and export spans to the console
val sdk: OpenTelemetrySdk = configureOpenTelemetrySdk(
    exporterConfig = ConsoleExporterConfig()
)
TracingManager.setSdk(sdk)

// Permit tracing of sensitive content in requests AND responses
TracingManager.traceSensitiveContent()

// Create an OpenAI client and instrument it with tracing capabilities
val instrumentedClient: OpenAIClient = OpenAIOkHttpClient.builder()
    .baseUrl(url)
    .apiKey(apiKey)
    .timeout(timeout)
    .build()
    .apply { instrument(this) }

// Make request and receive response
val request = ChatCompletionCreateParams.builder()
    .addUserMessage("Generate a polite greeting and introduce yourself.")
    .model(ChatModel.GPT_4O_MINI)
    .temperature(0.0)
    .build()

val response = instrumentedClient.chat().completions().create(request)

println("OpenAI response: ${response.choices().first().message().content().get()}")

// Traces are automatically flushed based on ExporterCommonSettings:
// - Periodically via flushIntervalMs and flushThreshold
// - On JVM shutdown if flushOnShutdown = true (default)
// For manual control, use TracingManager.flushTraces() 

This example uses simple console tracing for demonstration. For information about other tracing backends, see Tracing Backends.

Full example: OpenAI Client Auto Tracing Example

Tracing Configuration

  • Runtime toggle: Spans are emitted only if both conditions are met:

    • A tracing SDK instance has been installed via TracingManager.setSdk(...).
    • Runtime tracing is enabled via TracingManager.isTracingEnabled, which defaults to the value of the IS_TRACY_ENABLED environment variable. If the variable is not set, tracing is disabled by default. You can also change this value programmatically at any time.
  • Compile-time plugin toggle:

    • Annotation-Based Tracing requires the Kotlin Compiler Plugin. You can enable or disable it at compile time through the project property enableTracyPlugin. The property is optional: if you don't define it, the plugin is enabled by default. To turn it off, set enableTracyPlugin=false in gradle.properties or pass -PenableTracyPlugin=false on the command line.
  • Configuration:

    • Provide required backend settings to enable trace collection, either via environment variables or constructor parameters. See Tracing Backends.
  • Behavior:

    • SDK missing: returns a no-op tracer; minimal overhead, no errors.
    • SDK set after instrumentation: instrumented clients start emitting spans immediately.
    • Runtime toggle: tracing can be dynamically enabled or disabled via TracingManager.isTracingEnabled.

Annotation-Based Tracing

You can trace regular functions (not only client calls) using the @Trace annotation.

Make sure to apply the org.jetbrains.ai.tracy plugin in your build.

The Kotlin compiler plugin automatically instruments annotated functions, capturing execution details such as start and end time, duration, inputs, and outputs.

⚠️ Annotation-based tracing is supported only in Kotlin. For Java, use Manual Tracing instead.

Quick Start

@Trace(name = "GreetUser")
fun greetUser(name: String): String {
    println("Hello, $name!")
    return "Greeting sent to $name"
}

fun main() {
    // Enable tracing via the `IS_TRACY_ENABLED` environment variable
    // or programmatically, as shown below:
    TracingManager.isTracingEnabled = true
    TracingManager.setSdk(configureOpenTelemetrySdk(ConsoleExporterConfig()))
    greetUser("Alice")
    // Traces are automatically flushed based on ExporterCommonSettings:
    // - Periodically via flushIntervalMs and flushThreshold
    // - On JVM shutdown if flushOnShutdown = true (default)
    // For manual control, use TracingManager.flushTraces()
}

Hierarchical Tracing

Nested calls between annotated functions are traced automatically.
When one traced function invokes another, a hierarchical trace structure is created: the outer call is recorded as a parent span and the inner call as its child span.
This structure provides a clear visual representation of call relationships and execution flow.

See the NestedSpansExample.kt for a demonstration of hierarchical tracing.
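
As a minimal sketch of that parent/child relationship (function names and the `@Trace` import path here are illustrative, not taken from the library's actual packages):

```kotlin
import org.jetbrains.ai.tracy.Trace // assumed package for the @Trace annotation

@Trace(name = "FetchAnswer")
fun fetchAnswer(question: String): String {
    // This nested call is recorded as a child span of the "FetchAnswer" span
    return formatAnswer("42")
}

@Trace(name = "FormatAnswer")
fun formatAnswer(raw: String): String = "Answer: $raw"
```

With tracing enabled, calling fetchAnswer(...) produces one trace containing the "FetchAnswer" span with "FormatAnswer" nested beneath it.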

Tracing in Inherited Classes and Interfaces

Tracing annotation is automatically propagated through interfaces and class hierarchies.
If a method in an interface or superclass is annotated with @Trace, all overriding or implementing methods inherit tracing behavior automatically, even if the annotation is not explicitly declared again.
This approach ensures consistent and reusable tracing across an entire inheritance chain without code duplication.

Refer to the TracingPropagationExample.kt for a complete example.
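
A sketch of this inheritance behavior (the class and method names below are illustrative):

```kotlin
interface Assistant {
    @Trace(name = "Reply")
    fun reply(prompt: String): String
}

// No @Trace needed here: per the propagation rule above, the override
// inherits tracing behavior from the annotated interface method.
class EchoAssistant : Assistant {
    override fun reply(prompt: String): String = "Echo: $prompt"
}
```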

Customizing Tracing Behavior

The tracing system offers flexible customization through the SpanMetadataCustomizer interface. Implement it as a Kotlin object (classes are not supported) and reference it in the @Trace annotation. With a custom SpanMetadataCustomizer, you can define:

  • How input parameters are serialized.
  • How output results are serialized.
  • How span names are dynamically generated based on runtime data. This provides fine-grained control over how tracing information is represented.

See an example implementation in MetadataCustomizerExample.kt.
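
As a rough illustration of the idea, a customizer might look like the following. The actual member names of SpanMetadataCustomizer are not documented in this README, so every override below is an assumption; consult MetadataCustomizerExample.kt for the real interface:

```kotlin
// Hypothetical customizer: member names are assumptions, not the actual interface.
object GreetingCustomizer : SpanMetadataCustomizer {
    // How input parameters are serialized
    override fun serializeInput(args: Map<String, Any?>): String =
        args.entries.joinToString { "${it.key}=${it.value}" }

    // How the output result is serialized
    override fun serializeOutput(result: Any?): String = result.toString()

    // How the span name is generated from runtime data
    override fun spanName(functionName: String, args: Map<String, Any?>): String =
        "$functionName(${args["name"]})"
}

@Trace(customizer = GreetingCustomizer::class) // annotation parameter name assumed
fun greet(name: String): String = "Hello, $name!"
```

Note that, as stated above, the customizer must be a Kotlin object rather than a class.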

Adding Custom Tags

You can enrich your traces with contextual metadata by adding custom tags.
Tags help categorize and filter traces based on your business logic, for example by user type, feature, or environment. Use the addLangfuseTagsToCurrentTrace function to attach tags dynamically within any traced function.
These tags appear in Langfuse or other tracing tools, making it easier to group and analyze trace data.

An example demonstrating how to provide custom tags can be found in LangfuseTagExample.kt
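
For instance, inside a traced function you might attach tags like this (the tag values are illustrative, and the exact signature of addLangfuseTagsToCurrentTrace is an assumption; see LangfuseTagExample.kt for the authoritative form):

```kotlin
@Trace(name = "HandleCheckout")
fun handleCheckout(userId: String) {
    // Attach contextual tags to the current trace; they become
    // visible as tags on the trace in Langfuse
    addLangfuseTagsToCurrentTrace("feature:checkout", "user-type:premium")
    // ... business logic ...
}
```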

Manual Tracing

In addition to annotation-based tracing, you can manually create and manage spans wherever you need them in your code.
This approach gives you full control over what is traced and what metadata is recorded. Use the withSpan function to define a traceable block of code.
It automatically creates and closes a span around the specified code section, allowing you to attach structured metadata such as inputs, outputs, or custom attributes. Manual tracing is especially useful in Java projects where annotation-based tracing is not available.

An example demonstrating manual tracing can be found in ManualTracingExample.kt.
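
A minimal sketch of withSpan usage (the lambda shape and attribute names below are assumptions; ManualTracingExample.kt shows the authoritative form):

```kotlin
fun summarize(document: String): String =
    withSpan("SummarizeDocument") { span ->
        // Attach structured metadata to the span; attribute names are illustrative
        span.setAttribute("input.length", document.length.toLong())
        val summary = document.take(100)
        span.setAttribute("output.length", summary.length.toLong())
        summary // the block's result is returned from withSpan
    }
```

Because this is a plain function call rather than an annotation, the same pattern works from Java code.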

Tracing Backends

Tracy provides integrations with multiple tracing backends, allowing you to choose the one that best fits your workflow. Currently supported backends include Langfuse, Weave, console, file, and OTLP-compatible collectors.

Create an SDK by providing the desired configuration, then initialize tracing with TracingManager.setSdk(sdk):

// Langfuse
val sdk = configureOpenTelemetrySdk(LangfuseExporterConfig())

// Weave
val sdk = configureOpenTelemetrySdk(WeaveExporterConfig())

// Console-Only
val sdk = configureOpenTelemetrySdk(ConsoleExporterConfig())

// File (plain text and JSON formats supported)
val sdk = configureOpenTelemetrySdk(
    FileExporterConfig(
        filepath = myFile.absolutePathString(),
        append = true,
        format = OutputFormat.JSON, // default is `OutputFormat.PLAIN_TEXT`
    )
)


Langfuse Configuration

| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| langfuseUrl | LANGFUSE_URL | false | https://cloud.langfuse.com |
| langfusePublicKey | LANGFUSE_PUBLIC_KEY | true | - |
| langfuseSecretKey | LANGFUSE_SECRET_KEY | true | - |
| traceToConsole | - | false | false |
| exporterTimeoutSeconds | - | false | 10 |
| maxNumberOfSpanAttributes | MAX_NUMBER_OF_SPAN_ATTRIBUTES | false | 256 |
| maxSpanAttributeValueLength | MAX_SPAN_ATTRIBUTE_VALUE_LENGTH | false | Int.MAX_VALUE |
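
Putting the properties above together, a Langfuse setup might look like this (constructor parameter names are taken from the Property column; the key values are placeholders):

```kotlin
val sdk = configureOpenTelemetrySdk(
    LangfuseExporterConfig(
        langfusePublicKey = "pk-lf-...",  // or set LANGFUSE_PUBLIC_KEY
        langfuseSecretKey = "sk-lf-...",  // or set LANGFUSE_SECRET_KEY
        traceToConsole = true,            // optional: mirror spans to the console
    )
)
TracingManager.setSdk(sdk)
```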

Langfuse Setup Example

Weave Configuration

| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| weaveUrl | WEAVE_URL | false | https://trace.wandb.ai |
| weaveEntity | WEAVE_ENTITY | true | - |
| weaveProjectName | WEAVE_PROJECT_NAME | true | - |
| weaveApiKey | WEAVE_API_KEY | true | - |
| traceToConsole | - | false | false |
| exporterTimeoutSeconds | - | false | 10 |
| maxNumberOfSpanAttributes | MAX_NUMBER_OF_SPAN_ATTRIBUTES | false | 256 |
| maxSpanAttributeValueLength | MAX_SPAN_ATTRIBUTE_VALUE_LENGTH | false | Int.MAX_VALUE |

Weave Setup Example

Console Configuration

Configuration for exporting OpenTelemetry traces to console only. Console Setup Example

File Configuration

Configuration for exporting OpenTelemetry traces to a file in either JSON or plain text format. File Setup Example

OTLP Configuration

Both HTTP and gRPC configurations are available. Spans can be exported to any OTLP-compatible HTTP or gRPC collector (for example, Jaeger). Jaeger Setup Example
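
This README does not name the OTLP config classes, so the following is a sketch under an assumed name (OtlpHttpExporterConfig); adapt it to the actual API shown in the Jaeger Setup Example:

```kotlin
// Hypothetical config class name. The endpoint shown is the standard
// OTLP/HTTP port (4318); OTLP/gRPC collectors conventionally use 4317.
val sdk = configureOpenTelemetrySdk(
    OtlpHttpExporterConfig(endpoint = "http://localhost:4318")
)
TracingManager.setSdk(sdk)
```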

Project Structure

  • plugin — contains the Kotlin compiler plugins for annotation-based tracing.
    It includes multiple Kotlin Compiler Plugin (KCP) implementations for different Kotlin versions.
    The gradle-tracy-plugin module automatically selects the appropriate KCP version for your Kotlin compiler and applies it to your Gradle project.
  • publishing — provides a Gradle plugin used for modules that need to be published.
    It defines the tracy version and includes Kotlin DSL logic for publishing artifacts to Space Maven repositories.
  • examples — contains runnable examples demonstrating how to use various tracy features.
  • tracing
    • core — the foundational module providing tracing abstractions, annotations, OpenTelemetry integration, manual tracing, and the tracing integration for the OkHttp client.
    • openai — tracing integration for the OpenAI client.
    • gemini — tracing integration for the Gemini client.
    • anthropic — tracing integration for the Anthropic client.
    • ktor — tracing integration for the Ktor HTTP client.
    • test-utils — shared utilities for testing tracing functionality across modules.

Limitations

Span Context Propagation

There are a few known limitations of the library in terms of context propagation. To see complete examples of the recommended usage patterns, refer to the ContextPropagationExample.kt file.

Kotlin Coroutines

Context propagation works automatically in structured coroutines (e.g., withContext, launch). However, some concurrency models such as runBlocking and raw threads create new execution boundaries and require manual propagation of the OpenTelemetry context.

  • runBlocking inside suspend functions:
    Use currentSpanContextElement(...) to ensure child spans remain linked to their parent. Otherwise, spans become detached and appear as separate traces.

    @Trace
    suspend fun handleRequestInCoroutine(requestId: String) {
        println("Running on thread: ${Thread.currentThread().name}")
    
        // runBlocking without context propagation would detach the trace
        runBlocking(currentSpanContextElement(currentCoroutineContext())) {
            println("Running inside nested coroutine on thread: ${Thread.currentThread().name}")
            processUserRequest(requestId)
        }
    }
    

Multi-Threading

  • Custom threads (via thread { ... }):
    Threads do NOT inherit the OpenTelemetry context automatically.
    Capture and propagate it manually:
    val context = currentSpanContext(currentCoroutineContext())
    thread {
        context.makeCurrent().use { processUserRequest("REQ-303") }
    }
    

Local functions

Avoid using @Trace on local (nested) functions. This is a known Kotlin limitation: references to local functions do NOT implement the KCallable interface correctly. For more details, see the related issue: KT-64873.

Publishing

This project uses Gradle composite builds, so running plain publish or publishToMavenLocal is NOT enough: included builds are not published automatically.

Use this task instead:

  • Publish to Maven Local
    ./gradlew publishAllToMavenLocal
    

This task publishes JARs of all modules, all versions of the Kotlin compiler plugin, and the Gradle plugin.

Versioning

Tracy follows semantic versioning:

<major>.<minor>.<patch>
  • <major>: stays 0 until the library exits beta.
  • <minor>: increases for public releases (first public release: 0.1.0).
  • <patch>: used for internal development iterations and fixes.

Contributing

Read the Contributing Guidelines.

Code of Conduct

This project and the corresponding community are governed by the JetBrains Open Source and Community Code of Conduct. Please make sure you read it.

License

Tracy is licensed under the Apache 2.0 License.
