Sarah Shader
6 months ago

Log Streams: Common uses


With Convex, you can see information about each function executed by Convex, such as whether it succeeded and how long it took to execute, as well as any log lines from console.log calls within your functions. These are useful for understanding what your Convex deployment is doing, as well as for debugging unexpected issues. Recent events are visible in the dashboard and from the CLI with npx convex logs or with the --tail-logs argument to npx convex dev.

However, you can also set up Log Streams to send these events to Axiom or Datadog.

Log streams give you more control over your logs and errors:

  • Retain historical logs as long as you want (Convex itself only keeps logs for the last 1000 function executions)
  • Add more powerful filtering + data visualizations based on logs
  • Integrate your log streaming platform with other tools (e.g. PagerDuty, Slack)

This article will go over a few common ways to use log streams and how to set them up with either Axiom or Datadog:

  • Replicating the Convex dashboard logs page
  • Filtering to relevant logs by request ID
  • Searching for logs containing a particular string
  • Emitting + filtering namespaced logs with structured metadata
  • Visualizing Convex usage
  • Alerting on approaching Convex limits

How to set up a log stream

Follow our docs to set up a log stream. You’ll need to set up an account for whichever tool you’re using. I’ve personally liked using Axiom for logs and Sentry for exception reporting.

Common ways to use log streams

The full schema of the Convex log events is documented here, and the log stream provider of your choosing will have its own docs on how to filter and visualize data. In this section, we’ll go through a few common scenarios.

Recreating the dashboard logs page

The dashboard logs page shows console log lines + function executions sorted by time.

To do this with a log stream, we can filter to logs where topic is either console or function_execution.

Some useful columns to display:

  • function.path, function.type, function.request_id
  • For function executions: function.cached, status, error_message
  • For console events: log_level, message

Since there are different columns for console log events vs. function execution log events, you might set up two different views for them. Once you have these set up how you want, save the queries or add them to a dashboard for easy use later on.
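For a single combined view in Axiom’s “Explore” tab, a query along these lines will show both kinds of events together (your_dataset is a placeholder for your dataset name):

```
your_dataset
| where ['data.topic'] in ("console", "function_execution")
| project _time, ['data.function.path'], ['data.topic'], ['data.message'], ['data.status']
```

This is a sketch; you’ll likely want to project different columns depending on which view you’re building.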

Below are examples showing console logs in Axiom and function executions in Datadog.

Console logs in Axiom

Function executions in Datadog

Filtering to a request ID

In the dashboard, clicking on an entry in the logs page will open up a view filtered to that request using the Request ID. You can also do this in Axiom or Datadog by filtering your events further on function.request_id. The request ID shows up in error messages and in Sentry, so this can be useful for investigating an error found in Sentry or reported by a user.

Request ID filtering in the dashboard
Request ID in Sentry

Axiom: In the Axiom “Explore” tab, use something like this:

your_dataset
| where ['data.function.request_id'] == "your request ID here"

Datadog: In the Datadog logs page:

@function.request_id:"your request ID here"

Filtering to console events with a particular message

Axiom:

your_dataset
| where ['data.topic'] == "console"
| where ['data.message'] contains "hello"

Datadog:

@message:hello

Namespaced logs + structured metadata

As an example, if I have an app where users play games against each other, I might want to log information about each game with some specific attached metadata (like the game ID).

In my Convex functions, I’ll do something like this:

console.log(JSON.stringify({
	topic: "GAME",
	metadata: { gameId: "my game ID here" },
	message: "Started"
}))

Then I can parse these logs in Axiom or Datadog and be able to filter to all events with topic “GAME” with a particular ID.

To make this a little easier, we can make this a helper function:

function logEvent(topic, metadata, message) {
	console.log(JSON.stringify({ topic, metadata, message }))
}

Going further, we could use customFunctions to wrap console.log and handle logging these structured events. A usage of this might look something like:

ctx.logger.log(LOG_TOPICS.Game, { gameId }, "Started")

An example implementation of ctx.logger and some examples of its usage can be found here.
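As a rough sketch of what could back ctx.logger (not the actual implementation linked above — the LOG_TOPICS values and Logger class here are illustrative names), a minimal structured logger in TypeScript might look like:

```typescript
// Illustrative topics — adapt to your app's domains; not a Convex API.
const LOG_TOPICS = {
  Game: "GAME",
} as const;

type LogTopic = (typeof LOG_TOPICS)[keyof typeof LOG_TOPICS];

// Minimal structured logger: serializes one JSON object per log line so a
// log stream (Axiom, Datadog) can parse topic/metadata/message downstream.
class Logger {
  log(topic: LogTopic, metadata: Record<string, unknown>, message: string): string {
    const line = JSON.stringify({ topic, metadata, message });
    console.log(line);
    return line; // returned for convenience in tests
  }
}

const logger = new Logger();
logger.log(LOG_TOPICS.Game, { gameId: "game123" }, "Started");
```

Wrapping this into ctx via customFunctions keeps the topic names and the JSON shape consistent across all your functions.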

Axiom:

(optional) Add a virtual field parsed_message so we can use this field in filters. This saves us from having to repeat the parsing logic in our query.

['your_dataset']
| extend parsed_message = iff(
    isnotnull(parse_json(trim("'", tostring(["data.message"])))),
    parse_json(trim("'", tostring(["data.message"]))),
    parse_json('{}')
)

Adding a virtual field in Axiom

In the “Explore” page:

your_dataset
| where ['data.topic'] == "console"
| where parsed_message["topic"] == "GAME"
| where parsed_message["metadata"]["gameId"] == <your id>
| project ['data.timestamp'], ['data.log_level'], parsed_message["message"]

Filtering to logs for a game in Axiom

Datadog:

Add a pipeline with a Grok parser to parse the message field as JSON on all events where the topic is console. I used:

rule '%{data:structured_message:json}'

Adding a Grok parser in Datadog

Filter logs as follows:

@structured_message.topic:GAME @structured_message.metadata.gameId:<specific ID>

Filtering to logs for a game in Datadog

Note: message is formatted using object-inspect, so printing a string requires removing the outer single quotes.
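In code, the same unwrapping might look like this (parseStructuredMessage is a hypothetical helper mirroring the trim("'", ...) step used in the Axiom virtual field above):

```typescript
// Hypothetical helper: Convex formats console.log output with object-inspect,
// so a logged string arrives wrapped in single quotes, e.g. '{"topic":"GAME"}'.
// Strip the outer quotes, then attempt to parse the payload as JSON.
function parseStructuredMessage(raw: string): unknown | null {
  const unquoted =
    raw.startsWith("'") && raw.endsWith("'") ? raw.slice(1, -1) : raw;
  try {
    return JSON.parse(unquoted);
  } catch {
    return null; // not a structured (JSON) log line
  }
}
```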

Visualizing usage

Function executions contain the usage field, which can be used to track usage stats like database bandwidth and storage per function.

Axiom:

your_dataset
| where ['data.topic'] == "function_execution"
| extend databaseBandwidthKb = (todouble(['data.usage.database_read_bytes']) + todouble(['data.usage.database_write_bytes'])) / 1024
| summarize sum(databaseBandwidthKb) by ['data.function.path'], bin_auto(_time)

Datadog:

You will want to make a “measure” for the usage fields you care about and might want to make a “facet” for function.path. Below is an example of making a measure for database_write_bytes.

Defining a measure in Datadog

Making a pie chart in Datadog

Convex system warnings

Convex automatically adds warning messages when a function is nearing limits (e.g. total bytes read, execution time). These have the system_code field which is a short string summarizing the limit. Adding an alert for events with system_code is a good way of automatically detecting functions that are approaching limits before they exceed the limits and break.
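In Axiom, for example, a monitor could be backed by a query along these lines (your_dataset is a placeholder; the field names follow the log event schema linked above):

```
your_dataset
| where isnotnull(['data.system_code'])
| summarize count() by ['data.system_code'], ['data.function.path'], bin_auto(_time)
```

Grouping by function.path makes it easy to see which specific function is approaching a limit.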

An alert in Datadog for Convex system warnings

Summary

Log streams like Axiom and Datadog provide powerful querying and alerting on logs and errors from your Convex functions, helping you debug issues when they come up and surfacing small problems before they become big ones.

This article covers how to do the following common things with either Axiom or Datadog hooked up as a Convex log stream:

  • Replicating the Convex dashboard logs page, but with more history
  • Filtering to relevant logs by request ID
  • Searching for logs containing a particular string
  • Emitting + filtering namespaced logs with structured metadata
  • Visualizing Convex usage
  • Alerting on approaching Convex limits