Data-fetching strategy for Mercari Global Marketplace Web App

Building a robust data-fetching architecture is crucial for modern web applications, especially in a global marketplace web app where performance, type safety, and reliability are paramount.

Hello. I’m @vb, a Web developer from Cross Border (XB) Engineering. In this article, I will share how we implemented our data-fetching strategy for our web application using the Connect Protocol to make RPCs over HTTP, in keeping with the principles stated above.

The Foundation – Why We Chose Connect Protocol

Connect Protocol

For those who are not in the know, Connect Protocol is an HTTP-based RPC (Remote Procedure Call) protocol designed to make API communication more human-readable and debuggable while maintaining compatibility with gRPC.

The protocol maintains semantic compatibility with gRPC and abstracts the transport layer, so we can choose between the gRPC, Connect, or gRPC-Web protocols without having to account for the specific behaviors of each.
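This human-readability is easy to see on the wire: a unary Connect call is a plain HTTP POST that can carry JSON as well as binary protobuf, so it can be inspected with curl or browser dev tools. A minimal illustration (the service and method names here are hypothetical):

POST /mercari.item.v1.ItemService/GetItem HTTP/1.1
Content-Type: application/json

{"itemId": "m1234567890"}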

Please refer to Connect Protocol for more details.

Strategic Drivers for Adoption

Our decision to adopt Connect wasn’t made in isolation—it was driven by several strategic considerations:

  • Backend Alignment (gRPC Consistency): We chose Connect because our Backend-for-Frontend (BFF) service (the API service that our web app primarily talks to) is built entirely on the gRPC protocol. Utilizing Connect-node (the Connect Protocol library for Node.js) on our Next.js server allows us to make RPCs over HTTP. This consistency across the stack reduces cognitive overhead. (The strategic function and implementation of the BFF is detailed further in subsequent sections.)

Overview of the typical data flow for our application

  • Type Safety from Protocol Buffers: One of the most compelling advantages is automatic type generation from Protocol Buffers definitions. This eliminates the manual work of maintaining TypeScript interfaces and ensures that our frontend types are always in sync with backend contracts. We used the Buf CLI to compile Protocol Buffers definitions and generate TypeScript types and other glue code (a small illustration follows this list).

  • Reduced Boilerplate: Connect handles service definition, serialization, and deserialization automatically. This means our developers can focus on business logic rather than writing repetitive data transformation code.
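To make the type-safety point concrete, assume a hypothetical Item message with id and name fields. The generated SDK exports a matching TypeScript type, so a contract change in the .proto file surfaces as a compile-time error rather than a runtime surprise:

import type { Item } from "@sdk/gen/item_pb"; // hypothetical generated module

// Compiles only while `id` and `name` exist on the proto contract;
// renaming a field in the .proto breaks this at build time.
function formatTitle(item: Item): string {
  return `${item.name} (#${item.id})`;
}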

The Architecture – Building the Data Access Layer

System Overview and Modular Architecture

Our codebase is structured as a modular monorepo using pnpm workspaces and Turbo. This modular architecture provides clear boundaries between different layers of the application while maintaining a single source of truth. Please refer to my colleague @gary’s article, where he shares more details about this modular approach.

Our global web application runs on a Next.js server and utilizes this modular architecture. This structure is divided into two major sections to enforce a strict separation of concerns and avoid code duplication:

  • Feature Layer: This layer is responsible for all of the application’s user-facing code, consisting mostly of React Server Components.

  • Data Access Layer (DAL): The DAL is the centralized module responsible for abstracting and handling all communication with the BFF service. Our feature modules consume the DAL to fetch the necessary data for rendering components. Please refer to the deep-dive section for more information.

The DAL interacts directly with the Backend-for-Frontend (BFF) service. The BFF acts as a crucial intermediary wrapper for the numerous underlying backend microservices. This abstraction layer is strategic: it enables our web app to optimize data fetching by making just one API call per screen to gather all the data required for that specific view.

Overview of major components and their relationships in our data-fetching architecture

Deep Dive into DAL Implementation

The Data Access Layer (DAL) module serves as our centralized entry point for all data access. Let’s take a deeper dive into how it is structured.

The DAL provides four main components that streamline data operations:

  • Transport Configuration: It houses the centralized transport mechanism, which is pre-configured with a series of crucial interceptors handling logging, authentication, localization, and platform identification.

  • Data Services: The layer is organized into individual service modules, which are logically grouped by their respective business domains (cart, item-detail, account, etc.).

  • Type Exports: The DAL consumes the TypeScript SDK generated by the Proto-compilation pipeline and re-exports the relevant TypeScript types, ensuring the feature layer remains type-safe. More details of this pipeline can be found in the next section.

  • Higher-Order Functions (HOFs): The layer includes various utility functions used to wrap API calls, standardizing common cross-cutting patterns such as authentication failure handling and error handling.

Transport Configuration and Interceptor Pipeline

The connect-transport is configured centrally within the DAL using createConnectTransport. This configuration specifies the target URL and defines the pipeline of interceptors that every request must pass through:

// Transport configuration with interceptors
import { createConnectTransport } from "@connectrpc/connect-node";

export const transport = createConnectTransport({
  baseUrl: process.env.BFF_API_URL,
  httpVersion: '2',
  interceptors: [logger, authInterceptor, ...otherInterceptors],
});

To manage cross-cutting concerns (those aspects of a request that apply universally across all service calls), we rely on a robust interceptor pipeline. These functions automatically execute specialized logic before or after a request is processed, ensuring consistency without repeating code in every service module. Some of the interceptors we use (a sketch follows the list):

  • Logger Interceptor: generates a unique identifier using crypto.randomUUID() and sets it on the request header as X-Request-Id for request tracing across the microservice architecture.

  • Platform Interceptor: identifies the source of the request by setting the X-Platform header to web, required because the same BFF is used by iOS and Android clients too.

  • Auth Interceptor: reads the authentication token from a cookie and sets the Authorization header as a Bearer token.

  • Locale Interceptor: determines the region and locale from the host and pathname, setting the X-Region-Code and Accept-Language headers for proper internationalization.
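As a minimal sketch of what two of these look like using the Connect-ES Interceptor type (the exact implementations in our codebase differ slightly):

import type { Interceptor } from "@connectrpc/connect";

// Logger interceptor: tags every outgoing request with a unique id
// so it can be traced across the microservice architecture.
export const logger: Interceptor = (next) => async (req) => {
  req.header.set("X-Request-Id", crypto.randomUUID());
  return await next(req);
};

// Platform interceptor: identifies the web client to the shared BFF.
export const platformInterceptor: Interceptor = (next) => async (req) => {
  req.header.set("X-Platform", "web");
  return await next(req);
};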

How Feature Modules Import the DAL

Feature modules (React Server Components) import specific data services from the DAL; every screen makes one call to the BFF service to fetch its data. The following is an example from our item-detail feature module (responsible for the item-detail screen):

// In a React Server Component
import { getItemDetailScreen } from "@dal/data/item-detail.ts";

export default async function ItemDetailPage({
  params,
}: {
  params: { id: string };
}) {
  const itemData = await getItemDetailScreen({ itemId: params.id });
  return <ItemDetailComponent data={itemData} />;
}
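On the other side of that import, the data service inside the DAL is a thin, typed wrapper around a Connect client. A simplified sketch, assuming a generated ItemDetailService; the module paths and method name are illustrative, and createClient was named createPromiseClient in older Connect-ES versions:

import { createClient } from "@connectrpc/connect";
import { ItemDetailService } from "@sdk/gen/item_detail_pb"; // hypothetical generated module
import { transport } from "../transport";
import { withNotFoundRedirect } from "../utils";

// One typed client per service, created from the shared transport
const client = createClient(ItemDetailService, transport);

// The exported data service consumed by the feature layer
export const getItemDetailScreen = withNotFoundRedirect(
  (req: { itemId: string }) => client.getItemDetailScreen(req)
);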

The Proto-Compilation Pipeline

Maintaining type consistency and contract integrity is automated through our Proto-compilation pipeline. This pipeline is implemented as a GitHub Actions workflow that is automatically triggered whenever our gRPC Protocol Buffers (.proto files) are updated in the repository. The automated pipeline performs two critical jobs:

  1. It compiles the updated .proto files into a comprehensive TypeScript SDK.

  2. It publishes this generated TypeScript SDK as an internal npm package.

This ensures that the TypeScript SDK, ready to be consumed by the DAL, always reflects the most recent backend service contracts.
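For reference, a minimal buf.gen.yaml driving the compilation step could look like the following; this is an illustrative sketch, and our actual configuration carries additional plugins and options:

# buf.gen.yaml (illustrative)
version: v2
plugins:
  - local: protoc-gen-es
    out: src/gen
    opt: target=ts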

Data Flow: From Client to Server to BFF and Back (Demonstration)

Let’s trace through a complete request cycle for an item detail screen. Please refer to the following sequence diagram as I go through each step:


  1. User Requests a Page: User navigates to a URL like https://tw.mercari.com/items/<item-id>

  2. Next.js Server Receives Request: The ItemDetailPage component, running on the Next.js server, executes and calls the DAL to fetch the required data.

  3. DAL Fetches Data from gRPC BFF: The DAL invokes the necessary data service, wrapped in higher-order functions.

  4. Request Processing Pipeline: The request goes through our interceptor pipeline: Logger, Platform, Auth, and Locale. The Transport then converts the request to HTTP/2 with a binary protobuf payload.

  5. gRPC BFF Processing: The BFF receives the request with all the necessary context headers:

    POST <bff-service-url>/<service-name>/<method-name>
    Content-Type: application/proto
    X-Request-Id: <uuid>
    X-Platform: web
    Authorization: Bearer <token>
    Accept-Language: <locale>
    X-Region-Code: <region>
  6. Binary Protobuf Response: The BFF returns a binary protobuf response containing the item data.

  7. Deserialization to JavaScript Object: Connect automatically deserializes the binary response back into a readable JavaScript object.

  8. gRPC Status Check and Error Handling: Our transport checks the gRPC status in the response and handles errors appropriately via our higher-order utility functions.
    For example, the following wrapper function gracefully handles a gRPC NotFound error by calling Next.js’s notFound(), which forces Next.js to render the not-found.tsx page.

    import { Code, ConnectError } from "@connectrpc/connect";
    import { notFound } from "next/navigation";

    // For 404 errors in item-detail
    const withNotFoundRedirect = <TArgs extends unknown[], TReturn>(
      apiCall: (...args: TArgs) => Promise<TReturn>
    ) => {
      return async (...args: TArgs): Promise<TReturn> => {
        try {
          return await apiCall(...args);
        } catch (error) {
          if (error instanceof ConnectError && error.code === Code.NotFound) {
            notFound(); // Next.js 404 handling
          }
          // Re-throw other errors down the chain to be handled by
          // other utility functions or feature components
          throw error;
        }
      };
    };
  9. Render Component: The feature component uses the data to render the page, which is then streamed to the browser.

Benefits of Our Approach

Our data-fetching architecture, built upon gRPC with the Connect Protocol and centralized through the Data Access Layer (DAL), delivers significant advantages across developer workflow, application performance, system maintainability, and reliability.

  • Developer Experience: By leveraging the automated compilation pipeline, we achieve robust Type Safety, preventing runtime errors through compile-time checks. This also leads to better Auto-completion in IDEs, and the structure enforced by the DAL ensures Consistent Patterns across the application.

  • Performance: This RPC-based architecture delivers tangible performance gains. Since the application runs in a Next.js server environment, integrating this strategy directly into the Server Component flow enables effective request caching using React’s built-in cache() function (see the sketch after this list).

  • Maintainability: Structuring our system around the DAL significantly enhances Maintainability. We achieve Centralized Logic by placing all data access code in one dedicated module, which enforces a strong Separation of Concerns. The resulting modular design means that each layer can be tested independently, simplifying debugging and future updates.

  • Reliability: Finally, reliability is fundamentally improved through predictable architectural components. Our use of interceptors and higher-order functions establishes clear Error Boundaries and Predictable error handling patterns. The system is designed to support Graceful Degradation under adverse circumstances by using proper fallbacks for various error conditions.
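To illustrate the caching point above, wrapping a DAL call in React’s cache() deduplicates identical requests within a single server render pass. A minimal sketch, where fetchItemDetailScreen stands in for the underlying service call:

import { cache } from "react";
import { fetchItemDetailScreen } from "./item-detail-service"; // hypothetical underlying call

// Memoized per server request: repeated calls with the same itemId
// during one render pass hit the BFF only once. We key on a primitive
// string because cache() compares object arguments by reference.
export const getItemDetailScreen = cache((itemId: string) =>
  fetchItemDetailScreen({ itemId })
);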

Conclusion

Our gRPC with Connect architecture provides a robust foundation for data fetching in our Global marketplace web application. By centralizing data access in the DAL, implementing comprehensive interceptors, and using higher-order functions for common patterns, we’ve created a system that is both powerful and developer-friendly.

The combination of type safety, performance benefits, and consistent error handling makes this architecture well-suited for a complex marketplace application where reliability and user experience are critical.

As we continue to evolve our platform, this architecture provides the flexibility to add new features while maintaining the quality and consistency that our users expect.

If you enjoyed reading this article, please check out the other articles by my team in our series here.
