Integration options

Connecting your commerce ecosystem: An overview of commercetools integration options.

After completing this page, you should be able to:

  • Identify the primary integration methods available within commercetools.
  • Understand the typical use cases and key characteristics of HTTP/GraphQL APIs, the Import API, Subscriptions, and API Extensions.

The core strength of Composable Commerce lies in its flexibility and composability. However, real-world enterprise commerce solutions rarely rely on a single platform. A robust commerce architecture requires seamless integration with a surrounding ecosystem of specialized services.

You will often need to:

  • Import data from external systems like a Product Information Management (PIM) or Enterprise Resource Planning (ERP).
  • Export data to search engines, order management systems (OMS), or marketing platforms.
  • React to events within the platform, such as customer registration or Order creation.
  • Extend platform behavior with custom business logic, like dynamic validation rules.

Choosing the right integration method is crucial for building a scalable, reliable, and maintainable solution. commercetools provides a suite of options, each designed to solve a specific challenge:

  1. HTTP & GraphQL APIs
    • Purpose: Synchronous, real-time data management (create, read, update, delete operations).
    • Typical Use Case: Powering storefronts, mobile apps, and backend integrations.
  2. Import API
    • Purpose: Asynchronous bulk data ingestion.
    • Typical Use Case: Initial data loads and bulk updates of Products, Customers, inventory, and more.
  3. Subscriptions
    • Purpose: Asynchronous, event-driven notifications to react to changes in the platform.
    • Typical Use Case: Triggering downstream workflows such as sending order confirmations or updating an ERP.
  4. API Extensions
    • Purpose: Synchronous, blocking modifications or validations of API requests.
    • Typical Use Case: Enforcing complex, custom data validation rules before a resource is created or updated.

To simplify the decision-making process, the following decision tree outlines a streamlined approach to selecting the best integration option for your needs.

Let's explore each option in greater detail.

HTTP and GraphQL APIs

The HTTP and GraphQL APIs are your primary interfaces for real-time interaction with the platform. Use them to build dynamic, responsive experiences by accessing and manipulating your data synchronously. Common use cases include powering Product detail pages, managing shopping Carts, exporting Product data, importing inventory, and creating Orders.

  • HTTP API: A traditional RESTful API that follows standard principles, making it a familiar and powerful tool for any server-side application performing create, read, update, and delete (CRUD) operations on platform resources.
  • GraphQL API: Offers a flexible query language to fetch precisely the data you require in a single request. This efficiency minimizes payload size and reduces the number of API calls, making it ideal for performance-sensitive applications like a Backend-for-Frontend (BFF). It also fully supports mutations for creating, updating, and deleting data.
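
For example, a single GraphQL query can return exactly the fields a Product listing needs. The sketch below is illustrative: the region, Project key, and access token are placeholders, and the selected fields depend on your own data model.

// Minimal sketch: fetch only the fields a listing page needs in one GraphQL call.
// Region, Project key, and access token are placeholders.
const graphqlEndpoint =
  "https://api.europe-west1.gcp.commercetools.com/my-project-key/graphql";

const listingQuery = `
  query ProductListing($limit: Int!) {
    products(limit: $limit) {
      results {
        id
        key
        masterData {
          current {
            name(locale: "en")
            masterVariant {
              sku
              images { url }
            }
          }
        }
      }
    }
  }
`;

async function fetchProductListing(accessToken: string) {
  const response = await fetch(graphqlEndpoint, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query: listingQuery, variables: { limit: 20 } }),
  });
  if (!response.ok) throw new Error(`GraphQL request failed: ${response.status}`);
  return response.json();
}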

API best practices

To build a robust, scalable, and performant client for our APIs, we recommend following these best practices. While especially critical for bulk operations, these principles benefit any application built on our platform. A short sketch after the list combines cursor-based pagination with a backoff strategy.

  • Prefer cursor-based pagination: For iterating through large result sets, use cursor-based pagination instead of offset-based pagination. Cursors provide consistent, high performance regardless of data volume, whereas offset performance degrades as the offset value increases. For this reason, queries are limited to a maximum of 10,000 records when using the offset parameter.
  • Write efficient predicates: To ensure fast query performance, design your predicates to use indexes efficiently. A best practice is to place the most restrictive condition (the one that filters the most items) first in your where clause and use efficient operators. See our performance considerations guide for more details.
  • Query for changes using timestamps: When you only need to retrieve resources that have changed recently, filter on the lastModifiedAt field with a specific time window. This is significantly more efficient than fetching an entire dataset to check for updates. Timestamps are also excellent candidates for cursor-based pagination due to their high cardinality.
  • Implement traffic ramp-up and backoff strategies: Your application must gracefully handle 502 Bad Gateway and 503 Service Unavailable errors. Implement a backoff strategy, progressively increasing the delay between retries. Avoid sudden, large spikes in traffic; when initiating a high-volume process, gradually increase the number of requests per second. For planned high-traffic events, please contact our support team in advance.
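
The following sketch pages through Products with a cursor on id (a sort plus a where predicate instead of offset) and retries 502 and 503 responses with exponential backoff. The host, Project key, page size, and retry limits are illustrative assumptions.

// Minimal sketch: cursor-based pagination over Products plus exponential backoff.
// Host, Project key, and limits are placeholders.
const apiUrl = "https://api.europe-west1.gcp.commercetools.com/my-project-key";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function getWithBackoff(url: string, accessToken: string, maxRetries = 5) {
  for (let attempt = 0; ; attempt++) {
    const response = await fetch(url, {
      headers: { Authorization: `Bearer ${accessToken}` },
    });
    // Back off and retry on 502/503 instead of failing immediately.
    if ((response.status === 502 || response.status === 503) && attempt < maxRetries) {
      await sleep(2 ** attempt * 1000); // 1 s, 2 s, 4 s, ...
      continue;
    }
    if (!response.ok) throw new Error(`Request failed: ${response.status}`);
    return response.json();
  }
}

async function* allProducts(accessToken: string, pageSize = 500) {
  let lastId = ""; // Cursor: id of the last Product in the previous page.
  while (true) {
    const params = new URLSearchParams({
      limit: String(pageSize),
      sort: "id asc",
      withTotal: "false", // Skip the expensive total count.
    });
    if (lastId) params.set("where", `id > "${lastId}"`);
    const page = await getWithBackoff(`${apiUrl}/products?${params}`, accessToken);
    if (page.results.length === 0) return;
    yield* page.results;
    lastId = page.results[page.results.length - 1].id;
  }
}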

Import API

The Import API is purpose-built for ingesting bulk datasets such as Product catalogs, Customer lists, and inventory updates. It is the optimal choice for bulk operations, especially when data originates from multiple systems.

  • Asynchronous: Import jobs are processed in the background, making it ideal for large volumes of data that do not require an immediate response.
  • Dependency-Aware: This is the Import API's key differentiator. It allows you to import related data out of order, and the API will automatically resolve the dependencies and link the resources together.

How It Works: Imagine your Product data resides in an ERP, while its corresponding images are in a separate digital asset management (DAM) system. You can submit the Products for import before their images exist in the Platform. Later, when you import the images, the API will automatically find and link them to the correct Products. This eliminates the need for complex, sequential orchestration in your client application.

The high-level workflow is as follows:

  1. Create an Import Container: Define a logical container to group related import operations.
  2. Submit Import Operations: Send your data (for example, Products, Prices) as import operations to the container.
  3. Asynchronous Processing: commercetools processes the data in the background, resolving dependencies as data becomes available.
  4. Check Import Status: Query the API to check the status of each import operation (for example, Success, ValidationFailed) via an import summary.
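
A minimal sketch of this workflow for Product drafts follows. The Import API host, endpoint paths, and payload shapes shown are indicative assumptions; consult the Import API reference for the precise contract.

// Minimal sketch of the Import API workflow for product drafts.
// Host, paths, and payload shapes are indicative; check the Import API reference.
const importUrl = "https://import.europe-west1.gcp.commercetools.com/my-project-key";

async function importApi(path: string, accessToken: string, body?: unknown) {
  const response = await fetch(`${importUrl}${path}`, {
    method: body ? "POST" : "GET",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: body ? JSON.stringify(body) : undefined,
  });
  if (!response.ok) throw new Error(`Import API call failed: ${response.status}`);
  return response.json();
}

async function runProductImport(accessToken: string) {
  // 1. Create an import container to group related operations.
  await importApi("/import-containers", accessToken, { key: "erp-products" });

  // 2. Submit product drafts as import operations (batch-size limits apply).
  await importApi("/product-drafts/import-containers/erp-products", accessToken, {
    type: "product-draft",
    resources: [
      {
        key: "sku-1001", // Stable key so re-imports update instead of duplicating.
        productType: { typeId: "product-type", key: "shirt" },
        name: { en: "Classic T-Shirt" },
        slug: { en: "classic-t-shirt" },
      },
    ],
  });

  // 3. and 4. Processing is asynchronous: poll the container's import summary.
  const summary = await importApi(
    "/import-containers/erp-products/import-summaries",
    accessToken
  );
  console.log(summary.states); // For example, counts per state such as validationFailed.
}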

This makes the Import API the ideal choice when:

  • Performing an initial Product catalog load.
  • Migrating customers from another platform.
  • Importing dependent resources (for example, Categories > Products > Product Variants > Prices) without enforcing a strict order.

Limitations and considerations

  • Asynchronous nature: Data changes are not immediate. For real-time updates or immediate removal of incorrect data, use synchronous CRUD operations via the HTTP or GraphQL API.
  • 48-hour operation lifetime: Import operations are held for up to 48 hours while awaiting unresolved dependencies. While beneficial, this means temporary data inconsistencies are possible during this window. Ensure all referenced data is made available within this timeframe.

The importance of keys

Using the key attribute on your resources is critical for effective data management with the Import API. A short product-draft fragment after the list below shows keys in practice.

  • Stable Identification for Updates: The Import API uses the key attribute to identify existing resources for updates. If you import data with a key that matches an existing resource, the API updates that resource instead of creating a new one. Without keys, you risk creating duplicate entries.
  • Idempotency: Keys make your import operations idempotent. If an import is interrupted and retried, the system will correctly update existing resources without creating duplicates, ensuring the desired state is eventually reached.
  • Dependency Resolution: Keys are vital for resolving dependencies. For example, when importing Products, you can reference their Categories by key (using a KeyReference). The Import API will then correctly link the resources.
  • Merchant Center Compatibility: The Merchant Center's import functionality also relies on keys to identify and update existing resources. Without keys, the Merchant Center import process will create new resources instead of modifying existing ones.
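
As an illustration, this hypothetical product-draft fragment uses its own key for idempotent re-imports and references a Category by key, so either resource can be imported first.

// Hypothetical product-draft fragment. The draft's own key identifies it for
// idempotent re-imports; the Category is referenced by key (a KeyReference),
// so the Category can be imported before or after this Product.
const productDraft = {
  key: "sku-1001",
  productType: { typeId: "product-type", key: "shirt" },
  name: { en: "Classic T-Shirt" },
  slug: { en: "classic-t-shirt" },
  categories: [{ typeId: "category", key: "summer-collection" }],
};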

Subscriptions

Subscriptions are a core feature for building event-driven architectures. They notify your external systems of real-time events, such as an Order being placed or a Cart being updated. When a subscribed event occurs, commercetools sends a message to a configured destination, typically a message queue like AWS SQS, Google Cloud Pub/Sub, or Azure Service Bus.

This asynchronous, event-driven pattern allows your applications to react to changes dynamically without the need for constant polling, creating efficient, decoupled, and scalable integrations.

The high-level process involves these key steps:

  1. Select Messages: Choose the specific events you want to be notified about (for example, OrderCreated, ProductPublished).
  2. Configure a Destination: Specify the message queue where commercetools should send event messages. See the full list of supported destinations.
  3. Create the Subscription: Create the Subscription resource in your Project, linking your selected messages to your destination (see the example draft after this list).
  4. Process the Messages: Build a service that consumes messages from the queue and executes your business logic.
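
For example, step 3 could use a draft along these lines, posted to your Project's Subscriptions endpoint. The key, queue URL, and region are placeholders; the destination fields shown assume an AWS SQS queue with IAM authentication.

// Sketch of a Subscription draft (posted to /{projectKey}/subscriptions).
// Queue URL, region, and key are placeholders for your own infrastructure.
const subscriptionDraft = {
  key: "order-events",
  destination: {
    type: "SQS",
    queueUrl: "https://sqs.eu-central-1.amazonaws.com/123456789012/order-events",
    region: "eu-central-1",
    authenticationMode: "IAM",
  },
  messages: [
    {
      resourceTypeId: "order",
      types: ["OrderCreated", "OrderStateChanged"], // Only the events you need.
    },
  ],
};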

Important architectural considerations

To build a resilient system using Subscriptions, keep these principles in mind. A minimal, idempotent consumer sketch follows the list.

  • Delivery Guarantees and Idempotency
    • The platform guarantees at-least-once delivery. This means a message could, in rare failure scenarios, be delivered more than once.
    • Your message consumer logic must be idempotent. Processing the same message multiple times should not result in duplicate actions or errors (for example, sending two emails for one OrderCreated message).
  • Message Ordering
    • Strict chronological ordering of messages is not guaranteed across the entire system.
    • If the order of events for a specific resource is critical, use the sequenceNumber field on the message. Your application can use this number to process messages in the correct sequence.
    • Be aware of the ordering guarantees of your chosen queue technology (for example, standard AWS SQS does not guarantee order, whereas SQS FIFO Queues do).
  • Error Handling and Dead-Letter Queues (DLQ)
    • Your consumer will inevitably fail to process some messages. The best practice is to configure a Dead-Letter Queue (DLQ) on your message queue.
    • After a message fails processing a set number of times, the queueing service will move it to the DLQ. This prevents a single problematic message (a "poison pill") from blocking the entire queue, allowing you to inspect and retry failed messages later.
  • Scalability and Performance
    • Design your consumer to handle sudden bursts of messages, such as during a flash sale.
    • Establish and monitor a Service Level Objective (SLO) for your processing time (for example, "99% of OrderCreated messages must be processed within 60 seconds").
  • Subscription Limits and the Fan-Out Pattern
    • A Project is limited to a maximum of 50 Subscriptions.
    • To avoid hitting this limit, use the fan-out pattern. Create fewer, broader Subscriptions (for example, one for all Order related messages). Use a service like AWS SNS to distribute a single message to multiple downstream consumers, each performing a different task.
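
The consumer sketch below illustrates the idempotency and dead-letter-queue principles above. The message shape is simplified, and the in-memory deduplication store is only for illustration; a production consumer would use a durable store.

// Minimal idempotent consumer sketch. The message shape is simplified and the
// in-memory Set stands in for a durable deduplication store.
interface PlatformMessage {
  id: string;
  type?: string; // For example, "OrderCreated".
  sequenceNumber: number; // Use this if per-resource ordering matters.
  resource: { typeId: string; id: string };
}

const processedMessageIds = new Set<string>();

async function handleMessage(message: PlatformMessage): Promise<void> {
  // Idempotency: at-least-once delivery means the same message can arrive twice.
  if (processedMessageIds.has(message.id)) return;

  // If this throws, let the queue redeliver; after repeated failures the
  // queueing service should move the message to the configured dead-letter queue.
  await runBusinessLogic(message);

  processedMessageIds.add(message.id);
}

async function runBusinessLogic(message: PlatformMessage): Promise<void> {
  // Placeholder for the downstream workflow (for example, send a confirmation email).
  console.log(`Processing ${message.type ?? "message"} ${message.id}`);
}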

Message payload and truncation

By default, Subscriptions use the PlatformFormat, which includes the full resource payload (for example, the entire Order object) in the message. This is convenient, as your consumer can act on the event without making an API call back to commercetools.

However, all messaging systems have size limits (for example, 256 KB for AWS SQS). To prevent message loss for large resources, the Platform has a built-in safety mechanism: automatic message truncation. If a message payload is too large, the platform will truncate it, preserving the metadata but removing the resource body.

A truncated message will look like this:

{
  "notificationType": "ResourceUpdated",
  "projectKey": "my-project-key",
  "resource": {
    "typeId": "order",
    "id": "a1b2c3d4-e5f6-4a7b-8c9d-0e1f2a3b4c5d"
  },
  "sequenceNumber": 42,
  "resourceVersion": 5
}

Best practice: Always support fetching the full resource

Because any message could be truncated, you must design your consumer to handle this case. For maximum resilience, adopt a "fetch-on-receive" pattern:

  1. Check for the Body: When your consumer receives a message, check if the full resource body is present.
  2. Fetch if Missing: If the body is missing, the message was truncated. Use the resource.typeId and resource.id to make a GET request to the corresponding API endpoint.
  3. Process the Fetched Data: Use the full, up-to-date resource returned by the API for your business logic.

This pattern prevents data loss.
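
A minimal sketch of this pattern for Order messages follows. The order field name and the endpoint reflect Order messages in the PlatformFormat; other resource types need their own checks and endpoints, and the host and access token are placeholders.

// Fetch-on-receive sketch for Order messages. Host and token are placeholders.
const apiBase = "https://api.europe-west1.gcp.commercetools.com/my-project-key";

interface OrderMessage {
  resource: { typeId: string; id: string };
  order?: unknown; // Full Order payload; absent when the message was truncated.
}

async function resolveOrder(message: OrderMessage, accessToken: string) {
  // 1. If the full resource body is present, use it directly.
  if (message.order) return message.order;

  // 2. Otherwise the message was truncated: fetch the current resource by id.
  const response = await fetch(`${apiBase}/orders/${message.resource.id}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!response.ok) throw new Error(`Failed to fetch Order: ${response.status}`);

  // 3. Process using the full, up-to-date resource from the API.
  return response.json();
}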

API Extensions

API Extensions allow you to inject custom business logic directly into the API processing pipeline. Unlike Subscriptions, API Extensions are synchronous. They act as blocking webhooks that can validate, modify, or even reject an API request before it is finalized.

This makes them a powerful tool for enforcing complex business rules and ensuring data integrity at the point of entry, regardless of which client application (for example, Merchant Center, mobile app, PIM) is making the API call.

The API Extension flow

The diagram below illustrates the synchronous, blocking nature of an API Extension. The original API call is paused while commercetools waits for a response from your external service.
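
To make this exchange concrete, here is a minimal sketch of such an external service, written with Express. It assumes the standard Extension request shape (the triggering action plus the resource, with the object under resource.obj) and the response convention of HTTP 200 with update actions to accept or modify, or HTTP 400 with an errors array to reject.

// Minimal API Extension service sketch (Express). It rejects Cart updates that
// exceed a hypothetical line-item limit and lets everything else pass through.
import express from "express";

const app = express();
app.use(express.json());

// Called synchronously by commercetools while the original API call is paused.
app.post("/extensions/validate-cart", (req, res) => {
  const { action, resource } = req.body; // For example, action: "Create" | "Update".
  const cart = resource?.obj;

  if (action === "Update" && cart?.lineItems?.length > 50) {
    // Responding with 400 and an errors array rejects the original API call.
    return res.status(400).json({
      errors: [
        { code: "InvalidInput", message: "A Cart cannot contain more than 50 line items." },
      ],
    });
  }

  // Responding with 200 lets the call proceed; update actions could be returned
  // here to modify the resource before it is persisted.
  return res.status(200).json({ actions: [] });
});

app.listen(8080);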

Choosing the right use case: A risk-based approach

The decision to use an API Extension should be made carefully. An unreliable or slow Extension application can directly harm your user experience and system availability.

  • Low-Risk / Recommended Use Cases: These scenarios typically occur in the back office or involve processes that are not on the critical customer path. For example, enforcing complex data quality rules (for example, image count, attribute completeness) on Products before they can be published from the Merchant Center.
  • High-Risk / Advanced Use Cases: These scenarios directly impact the real-time customer experience (for example, cart modifications, checkout validation). Any performance degradation will immediately affect conversion rates. Implementing an Extension for these critical paths requires a commitment to operational excellence.

Technical prerequisites for critical-path Extensions

To safely use an API Extension in a customer-facing flow, your service must meet the following standards:

  • Guaranteed Low Latency: Your service's response time is added directly to the API call. Aim for P99 response times consistently under 200 ms.
  • Geographic Co-location: Deploy your Extension in the same cloud provider and region as your Project to minimize network latency. Consider using Private Service Connect.
  • Elimination of Cold Starts: If using serverless functions, ensure they are always "warm" to prevent provisioning delays that can cause timeouts.
  • High-Performance Runtime: Use a language and framework known for low-latency startup and execution. Avoid heavyweight frameworks.
  • Autonomous Scalability: Your service must automatically scale to handle peak traffic without performance degradation.
  • Minimal External Calls: Minimize calling other external services from within an Extension. Each outbound call adds latency and a point of failure.

Alternative: Application-side logic (BFF pattern)

For many real-time scenarios, a more flexible and resilient approach is to place the business logic in your application's backend or BFF before calling the Composable Commerce API. In this pattern, your frontend calls your BFF, which executes the logic (for example, calls a fraud service) and then constructs the final, valid API call to Composable Commerce.
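
A simplified sketch of this pattern follows. The fraud-screening service, token handling, and order-from-cart payload are hypothetical placeholders; the point is that failures are handled inside your own backend before any platform call is made.

// Simplified BFF sketch (Express): custom logic runs before the platform call.
// The fraud service URL, token handling, and payloads are hypothetical.
import express from "express";

const app = express();
app.use(express.json());

const apiBase = "https://api.europe-west1.gcp.commercetools.com/my-project-key";
const accessToken = process.env.CTP_ACCESS_TOKEN ?? "";

app.post("/checkout", async (req, res) => {
  // 1. Execute the business logic first (for example, a fraud-screening call).
  const screening = await fetch("https://fraud.example.com/score", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ cartId: req.body.cartId }),
  });
  if (!screening.ok || (await screening.json()).risk === "high") {
    // 2. Failures are handled here with a UX-specific response, not a generic API error.
    return res.status(422).json({ message: "This order could not be placed." });
  }

  // 3. Only a valid, fully prepared request reaches Composable Commerce.
  const orderResponse = await fetch(`${apiBase}/orders`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      cart: { typeId: "cart", id: req.body.cartId },
      version: req.body.cartVersion,
    }),
  });

  return res.status(orderResponse.status).json(await orderResponse.json());
});

app.listen(3000);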

Benefits of application-side logic over API Extensions

  • Greater control over UX: Handle failures gracefully in your application (for example, show a specific error message) instead of relying on a generic API error.
  • No hard timeout: You are not constrained by the strict 2-second timeout of an API Extension.
  • Improved resilience: An outage in a third-party service can be managed within your application without blocking the entire checkout flow.
  • Simplified architecture: The logic lives within your existing application stack, making it easier to develop, test, and debug.
  • More flexibility: Conditionally apply logic based on the full context of the user session, A/B tests, or other application-level state.

Decision framework: Choosing the right pattern

commercetools Composable Commerce offers several powerful integration options to connect with external systems:

  • HTTP and GraphQL APIs provide efficient, synchronous data management for real-time CRUD operations.
  • The Import API handles asynchronous, high-volume data ingestion for bulk create or update operations.
  • Subscriptions enable near real-time, event-driven integrations for reactive workflows.
  • API Extensions allow for synchronous, custom request modification and validation across all API clients.

Use this comparison to help decide which integration pattern best fits your needs.

  • Execution
    • Application-side logic (BFF): Synchronous (in your app).
    • API Extension: Synchronous (in Composable Commerce).
    • Subscription: Asynchronous.
  • When to use
    • Application-side logic (BFF): Default choice for most real-time logic. Validation, enrichment, or calculations needed before an API call.
    • API Extension: Specialized cases. When you must enforce a rule for all API clients (for example, Merchant Center, mobile app) and cannot place logic in each one.
    • Subscription: Post-processing. Reacting to an event after it has successfully completed.
  • Impact on API call
    • Application-side logic (BFF): Logic is applied before the call is made.
    • API Extension: Can modify or reject the request before it completes.
    • Subscription: No impact. Reacts to a successful change.
  • Performance impact
    • Application-side logic (BFF): Latency is managed within your application's control.
    • API Extension: Directly adds latency to the commercetools API call.
    • Subscription: No impact on the original API call's response time.
  • Failure mode
    • Application-side logic (BFF): Handled within your application; can be designed for high resilience.
    • API Extension: Can block the API call and cause outages if not built to high standards.
    • Subscription: Handled via a dead-letter queue; does not affect the user.

To help manage the security, maintenance, and cost associated with building and hosting services for Subscriptions or API Extensions, consider utilizing commercetools Connect.
