With Webhooks and ETL Pipelines, Vital pushes any new data or updates to your endpoint or your ETL destination as soon as they are discovered.

Connection Created stage

When a user successfully connects a provider through Junction Link, Vital sends a Provider Connection Created event to acknowledge it:

Event Type

provider.connection.created

Example

Check out the Provider Connection Created event in the Event Catalog.
{
  "event_type": "provider.connection.created",
  "data": {
    "user_id": "4eca10f9-...",
    "provider": {
      "name": "Strava",
      "slug": "strava",
      "logo": "https://storage.googleapis.com/vital-assets/strava.png"
    },
    "resource_availability": { ... }
  }
}
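
As a sketch of how you might receive this event, here is a minimal webhook endpoint. Flask and the route path are assumptions for illustration; any HTTP framework works, and the field names match the example payload above.

# Minimal webhook receiver sketch. Flask and the route path are
# assumptions; any HTTP framework works the same way.
from flask import Flask, request

app = Flask(__name__)

@app.route("/vital/webhooks", methods=["POST"])
def vital_webhook():
    event = request.get_json()
    if event["event_type"] == "provider.connection.created":
        data = event["data"]
        # e.g. mark the connection as live in your own records
        print(f"user {data['user_id']} connected {data['provider']['slug']}")
    # Acknowledge quickly with a 2xx so the delivery is not retried.
    return "", 204

In production you would verify the webhook signature before trusting the payload (see Authenticating Webhooks below).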

Historical Data Backfill stage

Vital also schedules jobs to backfill historical data when a connection is established. Historical data is data that existed before the connection was established. Because this can amount to months of data, and the backfill may run into temporary provider-side API rate limits, these jobs can take minutes to hours to complete.
You can inspect the status of historical data backfill jobs for each individual user through the Junction Dashboard or the Junction API.
Once the historical data backfill job of a specific resource has fully completed, Vital sends a Historical Pull Completion event for the resource:

Event Type

historical.data.{RESOURCE}.created

Usage

This event is a data-less notification. Use the Vital Data Access API to fetch the resource you are interested in, constraining your API call with the information in the Historical Pull Completion event. For example, the event includes the provider slug as well as the full datetime range that Vital has looked through.

Example

Check out the Historical Workout Data Pull Completion event in the Event Catalog.
{
  "event_type": "historical.data.workouts.created",
  "data": {
    "user_id": "e9e072e8-...",
    "start_date": "2020-06-21T08:23:01+00:00",
    "end_date": "1996-11-02T14:39:28+00:00",
    "provider": "zwift"
  }
}
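
For example, a handler for this event could pull exactly the backfilled range. This is a sketch only: the base URL, endpoint path, and header name below are assumptions modelled on the Vital Data Access API, so defer to the API reference for the exact shapes.

# Sketch: fetch the workouts that the completed backfill covered.
# URL, path, and header name are assumptions; see the API reference.
import requests

API_KEY = "sk_..."  # your Vital API key

def on_historical_workouts(event: dict) -> list:
    data = event["data"]
    resp = requests.get(
        f"https://api.tryvital.io/v2/summary/workouts/{data['user_id']}",
        headers={"x-vital-api-key": API_KEY},
        params={
            # Constrain the call to what the event says Vital looked through.
            "start_date": data["start_date"],
            "end_date": data["end_date"],
            "provider": data["provider"],
        },
    )
    resp.raise_for_status()
    return resp.json().get("workouts", [])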
ETL Pipelines can opt in to receive historical data as Vital Data Events (as outlined below in the Incremental Data stage section). Note that Vital still sends the Historical Pull Completion event to signal the completion of the historical data backfill stage.

Incremental Data stage

Once the historical data backfill stage completes, Vital monitors for new data and changes. Whenever new data or changes are discovered, Vital sends them out as Data Events as soon as possible.
You can inspect the state of incremental data updates for each individual user through the Junction Dashboard or the Junction API.

Event Types

  • Initial discovery — daily.data.{RESOURCE}.created
  • Subsequent updates — daily.data.{RESOURCE}.updated
Note that the daily.data prefix is a misnomer for Incremental Data: it does not imply a daily delivery schedule.
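
To make the naming concrete, here is a hypothetical dispatcher that splits an incoming event type into its family, resource, and action; all handler names are placeholders.

# Hypothetical event-type dispatcher. "daily" here means incremental,
# not scheduled; historical.data.{RESOURCE}.created shares the shape.
import re

EVENT_TYPE = re.compile(
    r"^(?P<family>daily|historical)\.data\.(?P<resource>\w+)\.(?P<action>created|updated)$"
)

def dispatch(event: dict) -> None:
    match = EVENT_TYPE.match(event["event_type"])
    if match is None:
        return  # e.g. provider.connection.created, handled elsewhere
    family, resource, action = match.group("family", "resource", "action")
    print(f"{family}/{resource}/{action} for user {event['data']['user_id']}")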

Usage

Vital sends out Data Events as soon as it discovers new or updated entries reported by a provider. This also implies:
  1. Vital does not prioritize or aggregate Data Events across providers:
    • You may receive multiple events of the same type from a single provider, because the provider supports multiple sources, devices or apps (e.g., iPhone sleeps and Apple Watch sleeps from Apple HealthKit);
    • You may receive multiple events of the same type from multiple providers, because the user has connected to multiple providers.
  2. There is no standard Data Event delivery frequency or schedule across providers and resources:
    • Some may be sent only once per day;
    • Some may be sent multiple times at irregular times;
    • Some may appear to be sent on a regular schedule throughout the day.
Your system would typically ingest relevant Vital Data Events into a database. You can then query the database with any business rules and query constraints that suit your product experience, including your own data prioritization rules based on, e.g., Source Type.
Vital also offers the Horizon AI Aggregation API for your data aggregation and consolidation needs.
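
As an ingestion sketch, the handler below upserts each workout event on its stable id, so a later daily.data.workouts.updated event overwrites the row its .created counterpart inserted. SQLite and the table layout are illustrative choices.

# Illustrative idempotent ingestion: upsert on Vital's stable summary id.
import json
import sqlite3

conn = sqlite3.connect("vital.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS workouts (
        id          TEXT PRIMARY KEY,   -- stable id from the event payload
        user_id     TEXT NOT NULL,
        provider    TEXT NOT NULL,
        source_type TEXT,
        payload     TEXT NOT NULL       -- full event data, for later queries
    )
""")

def ingest_workout_event(event: dict) -> None:
    d = event["data"]
    conn.execute(
        """INSERT INTO workouts (id, user_id, provider, source_type, payload)
           VALUES (?, ?, ?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
               provider    = excluded.provider,
               source_type = excluded.source_type,
               payload     = excluded.payload""",
        (d["id"], d["user_id"], d["source"]["provider"],
         d["source"]["type"], json.dumps(d)),
    )
    conn.commit()

Your prioritization rules (for example, preferring one Source Type over another) then live in the queries you run against this table rather than in the ingestion path.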

Example

Check out the Workout Created event in the Event Catalog.
The event structure and deduplication semantics of summary types and timeseries types have a few key differences. Refer to Event Structure for a more in-depth discussion.
{
  "event_type": "daily.data.workouts.created",
  "data": {
    "id": "c90bd6fb-...",
    "average_hr": 198,
    "distance": 8544,
    "time_start": "2012-11-24T22:57:01+00:00",
    "time_end": "2017-04-22T04:57:31+00:00",
    "calories": 4470,
    "hr_zones": [8279],
    "user_id": "ab3247dc-...",
    ...,
    "provider_id": "dolor ipsum reiciendis Lorem veniam elit. esse",
    "source": {
      "provider": "zwift",
      "type": "unknown"
    }
  }
}

Deep Dive into Vital Webhooks

Managing Webhooks

You can manage your team’s webhooks programmatically using our Webhooks API. This API allows you to:
  • CRUD (create/read/update/delete) your webhooks
  • Manage webhook headers
  • Update webhook secrets
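
As a purely illustrative sketch, registering an endpoint might look like the snippet below; the path and request body here are hypothetical stand-ins, so consult the Webhooks API reference for the real shapes.

# Hypothetical sketch only: endpoint path and body fields are stand-ins.
import requests

API_KEY = "sk_..."

resp = requests.post(
    "https://api.tryvital.io/v2/webhooks",  # hypothetical path
    headers={"x-vital-api-key": API_KEY},
    json={
        "url": "https://example.com/vital/webhooks",
        "event_types": [
            "provider.connection.created",
            "daily.data.workouts.created",
        ],
    },
)
resp.raise_for_status()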

Authenticating Webhooks

You can verify the webhook signature. Each webhook request we send includes these three headers:
  • svix-id
  • svix-timestamp
  • svix-signature
You’ll also need your webhook secret, which you can find in your Junction Dashboard under Webhooks > Endpoint tabs, labelled “Signing Secret”.

Option 1: Use a Svix SDK

If you’re using a compatible language, the easiest way is to use the official Svix SDKs, which handle signature verification for you.

Option 2: Manual Verification

If the Svix SDKs are not an option, you can verify the signature manually by following the instructions here.
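
As a sketch of both options in Python: Option 1 uses the official svix package, and Option 2 implements Svix’s published scheme (HMAC-SHA256 over {svix-id}.{svix-timestamp}.{body}, keyed with the base64-decoded secret) by hand. Treat the details as a sketch and defer to the Svix documentation.

# Webhook verification sketch: both options side by side.
import base64
import hashlib
import hmac

# Option 1: the official Svix SDK (pip install svix).
from svix.webhooks import Webhook, WebhookVerificationError

def verify_with_sdk(secret: str, payload: bytes, headers: dict) -> dict:
    # `headers` must contain svix-id, svix-timestamp and svix-signature.
    # Raises WebhookVerificationError on failure; reject with a 4xx.
    return Webhook(secret).verify(payload, headers)

# Option 2: manual verification, per Svix's published scheme.
def verify_manually(secret: str, payload: bytes, headers: dict) -> bool:
    # The signing key is the base64-decoded secret, minus its "whsec_" prefix.
    key = base64.b64decode(secret.removeprefix("whsec_"))
    signed_content = (
        f"{headers['svix-id']}.{headers['svix-timestamp']}.".encode() + payload
    )
    expected = base64.b64encode(
        hmac.new(key, signed_content, hashlib.sha256).digest()
    ).decode()
    # svix-signature may hold several space-delimited "v1,<sig>" entries.
    # Production code should also reject stale svix-timestamp values.
    return any(
        version == "v1" and hmac.compare_digest(sig, expected)
        for version, _, sig in
        (entry.partition(",") for entry in headers["svix-signature"].split())
    )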