# 01: Creating Your First Listener

Source: https://flatfile.com/docs/coding-tutorial/101-your-first-listener/101.01-first-listener

Learn to set up a basic Listener with Space configuration to define your data structure and workspace layout.

If you aren't interested in a code-forward approach, we recommend starting with [AutoBuild](/getting-started/quickstart/autobuild.mdx), which uses AI to analyze your template or documentation and then automatically creates and deploys a [Blueprint](/core-concepts/blueprints) (for schema definition) and a [Listener](/core-concepts/listeners) (for validations and transformations) to your [Flatfile App](/core-concepts/apps). Once you've started with AutoBuild, you can always download your Listener code and continue building with code from there!

## What We're Building

In this tutorial, we'll build a foundational Listener that handles Space configuration, the essential first step for any Flatfile implementation. Our Listener will:

* **Respond to Space creation**: When a user creates a new [Space](/core-concepts/spaces), our Listener will automatically configure it
* **Define the Blueprint**: Set up a single [Workbook](/core-concepts/workbooks) containing a single [Sheet](/core-concepts/sheets) with [Field](/core-concepts/fields) definitions for names and emails, establishing the data schema for the Space
* **Handle the complete Job lifecycle**: Acknowledge, update progress, and complete the configuration [Job](/core-concepts/jobs) with proper error handling
* **Provide user feedback**: Give real-time updates during the configuration process

This forms the foundation that you'll build upon in the next parts of this series, where we'll add user Actions and data validation. By the end of this tutorial, you'll have a working Listener that creates a fully configured workspace ready for data import.

## Prerequisites

Before we start coding, you'll need a Flatfile account and a fresh project directory:

1. **Create a new project directory**: Start in a fresh directory for this tutorial (e.g., `mkdir my-flatfile-listener && cd my-flatfile-listener`)
2. **Sign up for Flatfile**: Visit [platform.flatfile.com](https://platform.flatfile.com) and create your free account
3. **Get your credentials**: You'll need your Secret Key and Environment ID from the [Keys & Secrets](https://platform.flatfile.com/dashboard/keys-and-secrets) section later in this tutorial

**New to Flatfile?** If you'd like to understand the broader data structure and concepts before diving into code, we recommend reading through the [Core Concepts](/core-concepts/overview) section first. This covers the foundational elements like [Environments](/core-concepts/environments), [Apps](/core-concepts/apps), and [Spaces](/core-concepts/spaces), as well as our data structure like [Workbooks](/core-concepts/workbooks) and [Sheets](/core-concepts/sheets), and how they all work together.

Each Listener is deployed to a specific Environment, allowing you to set up separate Environments for development, staging, and production so you can safely test code changes before deploying to production.
## Install Dependencies

Choose your preferred language and follow the setup steps:

```bash JavaScript
# Initialize project (skip if you already have package.json)
npm init -y

# Install required Flatfile packages
npm install @flatfile/listener @flatfile/api

# Note: Feel free to use your preferred JavaScript project setup method instead
```

```bash TypeScript
# Initialize project (skip if you already have package.json)
npm init -y

# Install required Flatfile packages
npm install @flatfile/listener @flatfile/api

# Install TypeScript dev dependency
npm install --save-dev typescript

# Initialize TypeScript config (skip if you already have tsconfig.json)
npx tsc --init

# Note: Feel free to use your preferred TypeScript project setup method instead
```

### Authentication Setup

For this step, you'll need to get your Secret Key and Environment ID from your [Flatfile Dashboard](https://platform.flatfile.com/dashboard/keys-and-secrets). Then create a new file called `.env` and add the following (populated with your own values):

```bash
# .env
FLATFILE_API_KEY="your_secret_key"
FLATFILE_ENVIRONMENT_ID="us_env_your_environment_id"
```

## Create Your Listener File

Create a new file called `index.js` for JavaScript or `index.ts` for TypeScript:

```javascript JavaScript
import api from "@flatfile/api";

export default function (listener) {
  // Configure the Space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, {
        info: "Setting up your workspace...",
        progress: 10,
      });

      // Create the Workbook with Sheets, creating the Blueprint for the space
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Workbook created successfully",
        progress: 75,
      });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Workspace configured successfully!",
          acknowledge: true,
        },
      });
    } catch (error) {
      console.error("Error configuring Space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error.message}`,
          acknowledge: true,
        },
      });
    }
  });
}
```

```typescript TypeScript
import type { FlatfileListener } from "@flatfile/listener";
import api from "@flatfile/api";

export default function (listener: FlatfileListener) {
  // Configure the Space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, { info: "Setting up your workspace...", progress: 10 });

      // Create the Workbook with Sheets, creating the Blueprint for the space
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, { info: "Workbook created successfully", progress: 75 });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: { message: "Workspace configured successfully!", acknowledge: true },
      });
    } catch (error) {
      console.error("Error configuring Space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error instanceof Error ? error.message : 'Unknown error'}`,
          acknowledge: true,
        },
      });
    }
  });
}
```

**Complete Example**: The full working code for this tutorial step is available in our Getting Started repository: [JavaScript](https://github.com/FlatFilers/getting-started/tree/main/101.01-first-listener/javascript) | [TypeScript](https://github.com/FlatFilers/getting-started/tree/main/101.01-first-listener/typescript)

## Project Structure

After creating your Listener file, your project directory should look like this:

```text JavaScript
my-flatfile-listener/
├── .env                // Environment variables
├── index.js            // Listener code
|
|   /* Node-specific files below */
|
├── package.json
├── package-lock.json
└── node_modules/
```

```text TypeScript
my-flatfile-listener/
├── .env                // Environment variables
├── index.ts            // Listener code
|
|   /* Node and TypeScript-specific files below */
|
├── package.json
├── package-lock.json
├── tsconfig.json
└── node_modules/
```

## Testing Your Listener

### Local Development

To test your Listener locally, you can use the `flatfile develop` command. This will start a local server that runs your custom Listener code, and will also watch for changes to your code and automatically reload the server.

```bash
# Run locally with file watching
npx flatfile develop
```

### Step-by-Step Testing

After running your listener locally:

1. Create a new Space in your Flatfile Environment
2. Observe as the new Space is configured with a Workbook and Sheet

## What Just Happened?

Your Listener is now ready to respond to Space configuration Events! Here's how the Space configuration works step by step:

### 1. Exporting your Listener function

This is the base structure of your Listener. At its core, it's just a function that takes a `listener` object as an argument, and then uses that listener to respond to Events.

```javascript JavaScript
export default function (listener) {
  // . . . code
}
```

```typescript TypeScript
export default function (listener: FlatfileListener) {
  // . . . code
}
```

### 2. Listen for Space Configuration

When a new Space is created, Flatfile automatically triggers a `space:configure` job that your Listener can handle. This code listens for that job using the `job:ready` Event, filtered by the job name `space:configure`.

```javascript JavaScript
listener.on("job:ready", { job: "space:configure" }, async (event) => {
  // . . . code
});
```

```typescript TypeScript
listener.on("job:ready", { job: "space:configure" }, async (event) => {
  // . . . code
});
```

### 3. Acknowledge the Job

The first step is always to acknowledge that you've received the job and provide initial feedback to users. From this point on, we're responsible for the rest of the job lifecycle, and we'll be handling it all in this Listener. For more information on Jobs, see the [Jobs](/core-concepts/jobs) concept.

```javascript JavaScript
await api.jobs.ack(jobId, {
  info: "Setting up your workspace...",
  progress: 10,
});
```

```typescript TypeScript
await api.jobs.ack(jobId, { info: "Setting up your workspace...", progress: 10 });
```

### 4. Define the Blueprint

Next, we create the workbook with sheets and field definitions. This **is** your [Blueprint](/core-concepts/blueprints) definition, establishing the data schema that will govern all data within this Space.
```javascript JavaScript
await api.workbooks.create({
  spaceId,
  name: "My Workbook",
  sheets: [
    {
      name: "contacts",
      slug: "contacts",
      fields: [
        { key: "name", type: "string", label: "Full Name" },
        { key: "email", type: "string", label: "Email" },
      ],
    },
  ],
});
```

```typescript TypeScript
await api.workbooks.create({
  spaceId,
  name: "My Workbook",
  sheets: [
    {
      name: "contacts",
      slug: "contacts",
      fields: [
        { key: "name", type: "string", label: "Full Name" },
        { key: "email", type: "string", label: "Email" },
      ],
    },
  ],
});
```

### 5. Update Progress

Keep users informed about what's happening during the configuration process.

```javascript JavaScript
await api.jobs.update(jobId, {
  info: "Workbook created successfully",
  progress: 75,
});
```

```typescript TypeScript
await api.jobs.update(jobId, { info: "Workbook created successfully", progress: 75 });
```

### 6. Complete the Job

Finally, mark the job as complete with a success message, or fail it if something went wrong.

```javascript JavaScript
// Success case
await api.jobs.complete(jobId, {
  outcome: {
    message: "Workspace configured successfully!",
    acknowledge: true,
  },
});

// Failure case
await api.jobs.fail(jobId, {
  outcome: {
    message: `Failed to configure workspace: ${error.message}`,
    acknowledge: true,
  },
});
```

```typescript TypeScript
// Success case
await api.jobs.complete(jobId, {
  outcome: { message: "Workspace configured successfully!", acknowledge: true },
});

// Failure case
await api.jobs.fail(jobId, {
  outcome: {
    message: `Failed to configure workspace: ${error instanceof Error ? error.message : 'Unknown error'}`,
    acknowledge: true,
  },
});
```

This follows the standard Job pattern: **acknowledge → update progress → complete** (or fail on error). This provides users with real-time feedback and ensures robust error handling throughout the configuration process.

## Next Steps

Ready to enhance data quality?
Continue to [Adding Validation](/coding-tutorial/101-your-first-listener/101.02-adding-validation) to learn how to validate Fields and provide real-time feedback to users.

For more detailed information:

* Understand Job lifecycle patterns in [Jobs](/core-concepts/jobs) and [Spaces](/core-concepts/spaces)
* Learn more about [Events](/reference/events)
* Organize your Listeners with [Namespaces](/guides/namespaces-and-filters)

# 02: Adding Validation to Your Listener

Source: https://flatfile.com/docs/coding-tutorial/101-your-first-listener/101.02-adding-validation

Enhance your listener with data validation capabilities to ensure data quality and provide real-time feedback to users.

In the [previous guide](/coding-tutorial/101-your-first-listener/101.01-first-listener), we created a Listener that configures Spaces and sets up the data structure. Now we'll add data validation to ensure data quality and provide helpful feedback to users as they work with their data.

**Following along?** Download the starting code from our [Getting Started repository](https://github.com/FlatFilers/getting-started/tree/main/101.01-first-listener) and refactor it as we go, or jump directly to the [final version with validation](https://github.com/FlatFilers/getting-started/tree/main/101.02-adding-validation).

## What Is Data Validation?

Data validation in Flatfile allows you to:

* Check data formats and business rules
* Provide warnings and errors to guide users
* Ensure data quality before processing
* Give real-time feedback during data entry

Validation can happen at different levels:

* **Field-level**: Validate individual [Field](/core-concepts/fields) values (email format, date ranges, etc.)
* **Record-level**: Validate relationships between [Fields](/core-concepts/fields) in a single [Record](/core-concepts/records)
* **Sheet-level**: Validate across all [Records](/core-concepts/records) (duplicates, unique constraints, etc.)
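To make the field-level vs. record-level distinction concrete, here's a minimal sketch of a record-level rule for the contacts Sheet used throughout this tutorial. The `recordLevelErrors` helper is our own illustration (not part of the Flatfile SDK), written as a pure function so the rule can be reasoned about and tested apart from the API; a commit listener could run it per record and push the resulting messages with `api.records.update`, as shown later in this guide.

```javascript
// Hypothetical record-level rule for the tutorial's "contacts" Sheet:
// a record that has a name but no email gets an error on the email Field.
// Pure function (no API calls), so the rule itself is easy to unit test.
function recordLevelErrors(values) {
  const messages = [];
  const name = values.name?.value;
  const email = values.email?.value;

  // Record-level: the check relates two Fields within one Record
  if (name && !email) {
    messages.push({
      field: "email",
      type: "error",
      message: "Email is required when a name is provided",
    });
  }
  return messages;
}

// A record with a name but no email yields one error message
console.log(recordLevelErrors({ name: { value: "Ada" }, email: {} }).length); // 1
```

By contrast, a field-level check (like the email-format regex below) looks at one value in isolation, and a sheet-level check (like duplicate detection) needs all records at once.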
## Email Validation Example

This example shows how to perform email format validation directly when Records are committed. When users commit their changes, we validate that email addresses have a proper format and provide helpful feedback for any invalid emails.

This approach validates Records as they're committed, providing immediate feedback to users. For more complex validations or when you need an object-oriented approach, we recommend using the [Record Hook](/plugins/record-hook) plugin.

If you use both [Record Hooks](/plugins/record-hook) and regular listener validators (like this one) on the same sheet, you may encounter race conditions. Record Hooks will clear all existing messages before applying new ones, which can interfere with any messages set elsewhere. We have ways to work around this, but it's a good idea to avoid using both at the same time.

## What Changes We're Making

To add validation to our basic Listener, we'll add a listener that triggers when users commit their changes and performs validation directly:

```javascript
listener.on("commit:created", async (event) => {
  const { sheetId } = event.context;

  // Get committed records and validate email format
  const response = await api.records.get(sheetId);
  const records = response.data.records;

  // Email validation logic here...
});
```

## Complete Example with Validation

Here's how to add email validation to your existing Listener:

```javascript JavaScript
import api from "@flatfile/api";

export default function (listener) {
  // Configure the space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, {
        info: "Setting up your workspace...",
        progress: 10,
      });

      // Create the workbook with sheets
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Workbook created successfully",
        progress: 75,
      });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Workspace configured successfully!",
          acknowledge: true,
        },
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error.message}`,
          acknowledge: true,
        },
      });
    }
  });

  // Listen for commits and validate email format
  listener.on("commit:created", async (event) => {
    const { sheetId } = event.context;

    try {
      // Get records from the sheet
      const response = await api.records.get(sheetId);
      const records = response.data.records;

      // Simple email validation regex
      const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

      // Prepare updates for records with invalid emails
      const updates = [];

      for (const record of records) {
        const emailValue = record.values.email?.value;
        if (emailValue) {
          const email = emailValue.toLowerCase();
          if (!emailRegex.test(email)) {
            updates.push({
              id: record.id,
              values: {
                email: {
                  value: email,
                  messages: [
                    {
                      type: "error",
                      message: "Please enter a valid email address (e.g., user@example.com)",
                    },
                  ],
                },
              },
            });
          }
        }
      }

      // Update records with validation messages
      if (updates.length > 0) {
        await api.records.update(sheetId, updates);
      }
    } catch (error) {
      console.error("Error during validation:", error);
    }
  });
}
```

```typescript TypeScript
import type { FlatfileListener } from "@flatfile/listener";
import api, { Flatfile } from "@flatfile/api";

export default function (listener: FlatfileListener) {
  // Configure the space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, { info: "Setting up your workspace...", progress: 10 });

      // Create the workbook with sheets
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, { info: "Workbook created successfully", progress: 75 });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: { message: "Workspace configured successfully!", acknowledge: true },
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error instanceof Error ? error.message : 'Unknown error'}`,
          acknowledge: true,
        },
      });
    }
  });

  // Listen for commits and validate email format
  listener.on("commit:created", async (event) => {
    const { sheetId } = event.context;

    try {
      // Get records from the sheet
      const response = await api.records.get(sheetId);
      const records = response.data.records;

      // Simple email validation regex
      const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

      // Prepare updates for records with invalid emails
      const updates: Flatfile.RecordWithLinks[] = [];

      for (const record of records) {
        const emailValue = record.values.email?.value as string;
        if (emailValue) {
          const email = emailValue.toLowerCase();
          if (!emailRegex.test(email)) {
            updates.push({
              id: record.id,
              values: {
                email: {
                  value: email,
                  messages: [
                    {
                      type: "error",
                      message: "Please enter a valid email address (e.g., user@example.com)",
                    },
                  ],
                },
              },
            });
          }
        }
      }

      // Update records with validation messages
      if (updates.length > 0) {
        await api.records.update(sheetId, updates);
      }
    } catch (error) {
      console.error("Error during validation:", error);
    }
  });
}
```

**Complete Example**: The full working code for this tutorial step is available in our Getting Started repository: [JavaScript](https://github.com/FlatFilers/getting-started/tree/main/101.02-adding-validation/javascript) | [TypeScript](https://github.com/FlatFilers/getting-started/tree/main/101.02-adding-validation/typescript)

## Testing Your Validation

### Local Development

To test your Listener locally, you can use the `flatfile develop` command. This will start a local server that will listen for Events and respond to them, and will also watch for changes to your Listener code and automatically reload the server.

```bash
# Run locally with file watching
npx flatfile develop
```

### Step-by-Step Testing

After running your listener locally:

1. Create a new Space in your Flatfile Environment
2. Enter an invalid email address in the Email Field
3. See error messages appear on invalid email Fields
4. Fix the emails and see the error messages disappear

## What Just Happened?

Your Listener now handles two key Events:

1. **`space:configure`** - Sets up the data structure
2. **`commit:created`** - Validates email format when users commit changes

Here's how the email validation works step by step:

### 1. Listen for Commits

This listener triggers whenever users save their changes to any sheet in the workbook.

```javascript JavaScript
listener.on("commit:created", async (event) => {
  const { sheetId } = event.context;
```

```typescript TypeScript
listener.on("commit:created", async (event) => {
  const { sheetId } = event.context;
```

### 2. Get the Records

We retrieve all records from the sheet to validate them.

```javascript JavaScript
const response = await api.records.get(sheetId);
const records = response.data.records;
```

```typescript TypeScript
const response = await api.records.get(sheetId);
const records = response.data.records;
```

### 3. Validate Email Format

We use a simple regex pattern to check whether each email follows the basic `user@domain.com` format.

```javascript JavaScript
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

for (const record of records) {
  const emailValue = record.values.email?.value;
  if (emailValue && !emailRegex.test(emailValue.toLowerCase())) {
    // Add validation error
  }
}
```

```typescript TypeScript
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

for (const record of records) {
  const emailValue = record.values.email?.value as string;
  if (emailValue && !emailRegex.test(emailValue.toLowerCase())) {
    // Add validation error
  }
}
```

### 4. Add Error Messages

For invalid emails, we create an update that adds an error message to that specific field.
```javascript JavaScript
updates.push({
  id: record.id,
  values: {
    email: {
      value: email,
      messages: [
        {
          type: "error",
          message: "Please enter a valid email address (e.g., user@example.com)",
        },
      ],
    },
  },
});
```

```typescript TypeScript
updates.push({
  id: record.id,
  values: {
    email: {
      value: email,
      messages: [
        {
          type: "error",
          message: "Please enter a valid email address (e.g., user@example.com)",
        },
      ],
    },
  },
});
```

You can apply different types of validation messages:

* **`info`**: Informational messages (mouseover tooltip)
* **`warn`**: Warnings that don't block processing (yellow)
* **`error`**: Errors that should be fixed; blocks [Actions](/core-concepts/actions) with the `hasAllValid` constraint (red)

### 5. Update Records

Finally, we send all validation messages back to the sheet so users can see the errors.

```javascript JavaScript
if (updates.length > 0) {
  await api.records.update(sheetId, updates);
}
```

```typescript TypeScript
if (updates.length > 0) {
  await api.records.update(sheetId, updates);
}
```

## Next Steps

Ready to make your Listener interactive? Continue to [Adding Actions](/coding-tutorial/101-your-first-listener/101.03-adding-actions) to learn how to handle user submissions and create custom workflows.

For more detailed information:

* Understand Job lifecycle patterns in [Jobs](/core-concepts/jobs) and [Spaces](/core-concepts/spaces)
* Learn more about [Events](/reference/events)
* Organize your Listeners with [Namespaces](/guides/namespaces-and-filters)
* Explore [plugins](/core-concepts/plugins): [Job Handler](/plugins/job-handler) and [Space Configure](/plugins/space-configure)
* Check out [Record Hook](/plugins/record-hook) for simpler Field-level validations

# 03: Adding Actions to Your Listener

Source: https://flatfile.com/docs/coding-tutorial/101-your-first-listener/101.03-adding-actions

Build on your basic Listener by adding user Actions to create interactive data processing workflows.
In the [previous guides](/coding-tutorial/101-your-first-listener/101.02-adding-validation), we created a Listener with Space configuration and data validation. Now we'll extend that Listener to handle user Actions, allowing users to submit and process their data.

**Following along?** Download the starting code from our [Getting Started repository](https://github.com/FlatFilers/getting-started/tree/main/101.02-adding-validation) and refactor it as we go, or jump directly to the [final version with actions](https://github.com/FlatFilers/getting-started/tree/main/101.03-adding-actions).

## What Are Actions?

[Actions](/core-concepts/actions) are interactive buttons that appear in the Flatfile interface, allowing users to trigger custom operations on their data. Common Actions include:

* **Submit**: Process your data and POST it to your system via API
* **Validate**: Run custom validation rules
* **Transform**: Apply data transformations
* **Export**: Generate reports or exports

For more detail on using Actions, see our [Actions](/guides/using-actions) guide.

## What Changes We're Making

To add Actions to our Listener with validation, we need to make two specific changes:

### 1. Add Actions Array to Blueprint Definition

In the `space:configure` Listener, we'll add an `actions` array to our Workbook creation. This enhances our [Blueprint](/core-concepts/blueprints) to include interactive elements:

```javascript
actions: [
  {
    label: "Submit",
    description: "Send data to destination system",
    operation: "submitActionForeground",
    mode: "foreground",
  },
]
```

### 2. Add Action Handler Listener

We'll add a new Listener to handle when users click the Submit button:

```javascript
listener.on(
  "job:ready",
  { job: "workbook:submitActionForeground" },
  async (event) => {
    // Handle the action...
  }
);
```

## Complete Example with Actions

This example builds on the Listener we created in the [previous tutorials](/coding-tutorial/101-your-first-listener/101.02-adding-validation).
It includes the complete functionality: Space configuration, email validation, and Actions.

```javascript JavaScript
import api from "@flatfile/api";

export default function (listener) {
  // Configure the space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, {
        info: "Setting up your workspace...",
        progress: 10,
      });

      // Create the Workbook with Sheets and Actions
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
        actions: [
          {
            label: "Submit",
            description: "Send data to destination system",
            operation: "submitActionForeground",
            mode: "foreground",
            primary: true,
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Workbook created successfully",
        progress: 75,
      });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Workspace configured successfully!",
          acknowledge: true,
        },
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error.message}`,
          acknowledge: true,
        },
      });
    }
  });

  // Handle when someone clicks Submit
  listener.on(
    "job:ready",
    { job: "workbook:submitActionForeground" },
    async (event) => {
      const { jobId, workbookId } = event.context;

      try {
        // Acknowledge the job
        await api.jobs.ack(jobId, {
          info: "Starting data processing...",
          progress: 10,
        });

        // Get the data
        const job = await api.jobs.get(jobId);

        // Update progress
        await api.jobs.update(jobId, {
          info: "Retrieving records...",
          progress: 30,
        });

        // Get the sheets
        const { data: sheets } = await api.sheets.list({ workbookId });

        // Get and count the records
        const records = {};
        let recordsCount = 0;
        for (const sheet of sheets) {
          const {
            data: { records: sheetRecords },
          } = await api.records.get(sheet.id);
          records[sheet.name] = sheetRecords;
          recordsCount += sheetRecords.length;
        }

        // Update progress
        await api.jobs.update(jobId, {
          info: `Processing ${sheets.length} sheets with ${recordsCount} records...`,
          progress: 60,
        });

        // Process the data (log to console for now)
        console.log("Processing records:", JSON.stringify(records, null, 2));

        // Complete the job
        await api.jobs.complete(jobId, {
          outcome: {
            message: `Successfully processed ${sheets.length} sheets with ${recordsCount} records!`,
            acknowledge: true,
          },
        });
      } catch (error) {
        console.error("Error processing data:", error);

        // Fail the job if something goes wrong
        await api.jobs.fail(jobId, {
          outcome: {
            message: `Data processing failed: ${error.message}`,
            acknowledge: true,
          },
        });
      }
    },
  );

  // Listen for commits and validate email format
  listener.on("commit:created", async (event) => {
    const { sheetId } = event.context;

    try {
      // Get records from the sheet
      const response = await api.records.get(sheetId);
      const records = response.data.records;

      // Simple email validation regex
      const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

      // Prepare updates for records with invalid emails
      const updates = [];

      for (const record of records) {
        const emailValue = record.values.email?.value;
        if (emailValue) {
          const email = emailValue.toLowerCase();
          if (!emailRegex.test(email)) {
            updates.push({
              id: record.id,
              values: {
                email: {
                  value: email,
                  messages: [
                    {
                      type: "error",
                      message: "Please enter a valid email address (e.g., user@example.com)",
                    },
                  ],
                },
              },
            });
          }
        }
      }

      // Update records with validation messages
      if (updates.length > 0) {
        await api.records.update(sheetId, updates);
      }
    } catch (error) {
      console.error("Error during validation:", error);
    }
  });
}
```

```typescript TypeScript
import type { FlatfileListener } from "@flatfile/listener";
import api, { Flatfile } from "@flatfile/api";

export default function (listener: FlatfileListener) {
  // Configure the space when it's created
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, { info: "Setting up your workspace...", progress: 10 });

      // Create the Workbook with Sheets and Actions
      await api.workbooks.create({
        spaceId,
        name: "My Workbook",
        sheets: [
          {
            name: "contacts",
            slug: "contacts",
            fields: [
              { key: "name", type: "string", label: "Full Name" },
              { key: "email", type: "string", label: "Email" },
            ],
          },
        ],
        actions: [
          {
            label: "Submit",
            description: "Send data to destination system",
            operation: "submitActionForeground",
            mode: "foreground",
            primary: true,
          },
        ],
      });

      // Update progress
      await api.jobs.update(jobId, { info: "Workbook created successfully", progress: 75 });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: { message: "Workspace configured successfully!", acknowledge: true },
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure workspace: ${error instanceof Error ? error.message : 'Unknown error'}`,
          acknowledge: true,
        },
      });
    }
  });

  // Handle when someone clicks Submit
  listener.on(
    "job:ready",
    { job: "workbook:submitActionForeground" },
    async (event) => {
      const { jobId, workbookId } = event.context;

      try {
        // Acknowledge the job
        await api.jobs.ack(jobId, { info: "Starting data processing...", progress: 10 });

        // Get the data
        const job = await api.jobs.get(jobId);

        // Update progress
        await api.jobs.update(jobId, { info: "Retrieving records...", progress: 30 });

        // Get the sheets
        const { data: sheets } = await api.sheets.list({ workbookId });

        // Get and count the records
        const records: { [name: string]: any[] } = {};
        let recordsCount = 0;
        for (const sheet of sheets) {
          const {
            data: { records: sheetRecords },
          } = await api.records.get(sheet.id);
          records[sheet.name] = sheetRecords;
          recordsCount += sheetRecords.length;
        }

        // Update progress
        await api.jobs.update(jobId, {
          info: `Processing ${sheets.length} sheets with ${recordsCount} records...`,
          progress: 60,
        });

        // Process the data (log to console for now)
        console.log("Processing records:", JSON.stringify(records, null, 2));

        // Complete the job
        await api.jobs.complete(jobId, {
          outcome: {
            message: `Successfully processed ${sheets.length} sheets with ${recordsCount} records!`,
            acknowledge: true,
          },
        });
      } catch (error) {
        console.error("Error processing data:", error);

        // Fail the job if something goes wrong
        await api.jobs.fail(jobId, {
          outcome: {
            message: `Data processing failed: ${error instanceof Error ? error.message : 'Unknown error'}`,
            acknowledge: true,
          },
        });
      }
    }
  );

  // Listen for commits and validate email format
  listener.on("commit:created", async (event) => {
    const { sheetId } = event.context;

    try {
      // Get records from the sheet
      const response = await api.records.get(sheetId);
      const records = response.data.records;

      // Simple email validation regex
      const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

      // Prepare updates for records with invalid emails
      const updates: Flatfile.RecordWithLinks[] = [];

      for (const record of records) {
        const emailValue = record.values.email?.value as string;
        if (emailValue) {
          const email = emailValue.toLowerCase();
          if (!emailRegex.test(email)) {
            updates.push({
              id: record.id,
              values: {
                email: {
                  value: email,
                  messages: [
                    {
                      type: "error",
                      message: "Please enter a valid email address (e.g., user@example.com)",
                    },
                  ],
                },
              },
            });
          }
        }
      }

      // Update records with validation messages
      if (updates.length > 0) {
        await api.records.update(sheetId, updates);
      }
    } catch (error) {
      console.error("Error during validation:", error);
    }
  });
}
```

**Complete Example**: The full working code for this tutorial step is available in our Getting Started repository: [JavaScript](https://github.com/FlatFilers/getting-started/tree/main/101.03-adding-actions/javascript) | [TypeScript](https://github.com/FlatFilers/getting-started/tree/main/101.03-adding-actions/typescript)

## Understanding Action Modes

Actions can run in different modes:

* **`foreground`**: Runs immediately with real-time progress updates (good for quick operations)
* **`background`**: Runs as a background job (good for longer operations)

The Action operation name (`submitActionForeground`) determines which Listener will handle the Action.

## Testing Your Action

### Local Development

To test your Listener locally, you can use the `flatfile develop` command.
This will start a local server that will listen for Events and respond to them, and will also watch for changes to your Listener code and automatically reload the server.

```bash
# Run locally with file watching
npx flatfile develop
```

### Step-by-Step Testing

After running your listener locally:

1. Create a new Space in your Flatfile Environment
2. Upload (or manually enter) some data to the contacts Sheet with both valid and invalid email addresses
3. See validation errors appear on invalid email Fields
4. Click the "Submit" button
5. Watch the logs in the terminal as your data is processed and the job is completed

## What Just Happened?

Your Listener now handles three key Events and provides a complete data import workflow. Here's how the new action handling works:

### 1. Blueprint Definition with Actions

We enhanced the [Blueprint](/core-concepts/blueprints) definition to include action buttons that users can interact with. Adding actions to your workbook configuration is part of defining your Blueprint.

```javascript JavaScript
actions: [
  {
    label: "Submit",
    description: "Send data to destination system",
    operation: "submitActionForeground",
    mode: "foreground",
    primary: true,
  },
]
```

```typescript TypeScript
actions: [
  {
    label: "Submit",
    description: "Send data to destination system",
    operation: "submitActionForeground",
    mode: "foreground",
    primary: true,
  },
]
```

### 2. Listen for Action Events

When users click the Submit button, Flatfile triggers a [Job](/core-concepts/jobs) that your listener can handle using the same approach we used for the `space:configure` job in [101.01](/coding-tutorial/101-your-first-listener/101.01-first-listener#2-listen-for-space-configuration). Jobs are named with the pattern `domain:operation`. In this case, the domain is `workbook` since we've mounted the Action to the Workbook blueprint, and the operation is `submitActionForeground` as defined in the Action definition.
```javascript JavaScript
listener.on(
  "job:ready",
  { job: "workbook:submitActionForeground" },
  async (event) => {
    const { jobId, workbookId } = event.context;
```

```typescript TypeScript
listener.on(
  "job:ready",
  { job: "workbook:submitActionForeground" },
  async (event) => {
    const { jobId, workbookId } = event.context;
```

### 3. Retrieve and Process Data

Get all the data from the workbook and process it according to your business logic.

```javascript JavaScript
// Get the sheets
const { data: sheets } = await api.sheets.list({ workbookId });

// Get and count the records
const records = {};
let recordsCount = 0;

for (const sheet of sheets) {
  const { data: { records: sheetRecords } } = await api.records.get(sheet.id);
  records[sheet.name] = sheetRecords;
  recordsCount += sheetRecords.length;
}
```

```typescript TypeScript
// Get the sheets
const { data: sheets } = await api.sheets.list({ workbookId });

// Get and count the records
const records: { [name: string]: any[] } = {};
let recordsCount = 0;

for (const sheet of sheets) {
  const { data: { records: sheetRecords } } = await api.records.get(sheet.id);
  records[sheet.name] = sheetRecords;
  recordsCount += sheetRecords.length;
}
```

### 4. Provide User Feedback

Keep users informed about the processing with progress updates and final results.
```javascript JavaScript
// Update progress during processing
await api.jobs.update(jobId, {
  info: `Processing ${sheets.length} sheets with ${recordsCount} records...`,
  progress: 60,
});

// Complete with success message
await api.jobs.complete(jobId, {
  outcome: {
    message: `Successfully processed ${sheets.length} sheets with ${recordsCount} records!`,
    acknowledge: true,
  },
});
```

```typescript TypeScript
// Update progress during processing
await api.jobs.update(jobId, {
  info: `Processing ${sheets.length} sheets with ${recordsCount} records...`,
  progress: 60
});

// Complete with success message
await api.jobs.complete(jobId, {
  outcome: {
    message: `Successfully processed ${sheets.length} sheets with ${recordsCount} records!`,
    acknowledge: true
  }
});
```

Your complete Listener now handles:

* **`space:configure`** - Defines the Blueprint with interactive actions
* **`commit:created`** - Validates email format when users commit changes
* **`workbook:submitActionForeground`** - Processes data when users click Submit

The Action follows the same Job lifecycle pattern: **acknowledge → update progress → complete** (or fail on error). This provides users with real-time feedback during data processing, while validation ensures data quality throughout the import process.

## Next Steps

Congratulations! You now have a complete Listener that handles Space configuration, data validation, and user Actions.
For more detailed information: * Learn more about [Actions](/guides/using-actions) * Understand Job lifecycle patterns in [Jobs](/core-concepts/jobs) * Learn more about [Events](/reference/events) * Organize your Listeners with [Namespaces](/guides/namespaces-and-filters) * Explore [plugins](/core-concepts/plugins): [Job Handler](/plugins/job-handler) and [Space Configure](/plugins/space-configure) # Coding Tutorial Source: https://flatfile.com/docs/coding-tutorial/overview Learn to build Event Listeners that configure and customize Flatfile through hands-on coding tutorials ## Events Drive Everything in Flatfile **When you configure Flatfile with code, you're building part of an event-driven system.** Every time a user interacts with their data—editing a Record, uploading a file, clicking an [Action](/core-concepts/actions) button, etc.—the Flatfile Platform emits an **Event**. Your custom **Listeners**, defined in your code and generally deployed to the Flatfile Platform as an [Agent](/core-concepts/listeners#agents), respond to these Events by executing whatever logic you define. **Out of the box, Flatfile is essentially an empty slate.** Everything you see—[Workbooks](/core-concepts/workbooks), [Sheets](/core-concepts/sheets), [Fields](/core-concepts/fields), validation logic—is created by your Listeners responding to Events. This logic includes configuring your [Blueprint](/core-concepts/blueprints) in a `space:configure` job, and may be as simple as logging an Event to the console, or as complex as processing records against existing data from your own API. But given the Event/Listener pattern as basic building blocks, you can build pretty much anything your imagination can conjure. 
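The simple end of that spectrum, a Listener that just logs every Event it receives, can be sketched in a few lines (the `"**"` wildcard filter matches every Event topic; in a real project this function would be your Listener's default export):

```javascript
// Minimal sketch: a catch-all Listener that logs every Event it receives.
// In a real project: export default flatfileEventListener
function flatfileEventListener(listener) {
  // "**" is a wildcard filter that matches every Event topic
  listener.on("**", (event) => {
    console.log(`Received event: ${event.topic}`);
  });
}
```

From here, replacing the wildcard with a specific topic (and, for Jobs, a `{ job: "domain:operation" }` filter) is all it takes to target one kind of Event.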
```mermaid
sequenceDiagram
    participant User
    participant Platform as Flatfile Platform
    participant Listener as Listener Logic
    User->>Platform: Edits data
    Platform->>Platform: Internal processing
    Platform->>Listener: Emits "commit:created"
    Listener->>Listener: Executes your code
```

This event-driven paradigm is fundamentally different from traditional "configuration", where you pre-define static options. Instead, you write reactive code that responds to what users actually do. Events carry structured information about what happened, including the context and any relevant data, giving your Listeners everything they need to respond intelligently.

Common Events include:

* `commit:created` - When a commit is created (after records are added or modified)
* `job:ready` - When [Jobs](/core-concepts/jobs) are ready for execution
* `file:created` - When a file is uploaded

This is just a very small sample of the Events emitted by Flatfile. For a complete list, see the [Events Reference](/reference/events).

Your Listeners define how Flatfile behaves. Want to validate data when records are modified? Listen for `commit:created` Events and perform your validation in the callback. Need to process uploaded files? Listen for `file:created` Events. Want to trigger discrete units of work? Listen for `job:ready` Events (generally filtered by `{job: "domain:operation"}`, but we'll cover that later).

For a more comprehensive understanding of this topic, see [Events and Listeners](/core-concepts/listeners).

## Learning Path: Building Custom Listeners

This tutorial series teaches you to build Event Listeners through hands-on coding. You'll start with basic Space configuration and progressively add more sophisticated functionality.
**Prerequisites**:

* Basic JavaScript or TypeScript knowledge
* Node.js and npm installed (latest LTS version recommended)

You may also find it helpful to review the [Core Concepts](/core-concepts/overview) documentation to get a better understanding of the structures and terminology used in this tutorial, or just jump right into things and keep the Core Concepts documentation in mind in case you come across something you don't understand. You might even keep it open in another tab for handy reference later on.

**Getting Started**: This tutorial is designed to be completed from scratch in an empty folder. You'll build everything step-by-step as you follow along.

**Reference Code**: If you'd like to skip ahead, or if you'd like a complete reference, you can find complete working examples in our [Getting Started repository](https://github.com/FlatFilers/getting-started/) with both JavaScript and TypeScript versions.

### Listeners 101: Your First Listener

A three-part series that builds a complete Listener from scratch. By the end of this series, you'll have a single-file Listener that configures Spaces, creates Workbooks and Sheets, validates a single field, and adds a custom Action. This is a great way to learn the basics of Listener development and get a feel for how to build your own Listeners, but it's meant to be a starting point; in the next series, we'll show you how to make your Listeners more concise and structured to prepare for more complex use cases.

Set up your development environment and build a Listener that configures Spaces, creates Workbooks and Sheets, and manages the Job lifecycle. You'll learn the foundational patterns for all Listener development.

Add data validation to ensure Record quality. You'll learn to validate individual Fields, provide helpful error messages, and guide users toward clean data.

Extend your Listener with custom Actions that let users trigger your code on demand.
You'll learn to create interactive workflows and provide user feedback. # Actions Source: https://flatfile.com/docs/core-concepts/actions User-triggered operations in Flatfile An Action is a code-based operation that runs when a user clicks a button or menu item in Flatfile. Actions can be mounted on [Sheets](/core-concepts/sheets), [Workbooks](/core-concepts/workbooks), [Documents](/core-concepts/documents), or Files to trigger custom operations. Defining a custom Action is a two-step process: 1. Define an Action in your Flatfile blueprint or in your code 2. Create a [Listener](/core-concepts/listeners) to handle the Action When an Action is triggered, it creates a [Job](/core-concepts/jobs) that your application can listen for and respond to. Given that Actions are powered by Jobs, the [Jobs Lifecycle](/core-concepts/jobs#jobs-lifecycle) pertains to Actions as well. This means that you can [update progress values/messages](/core-concepts/jobs#updating-job-progress) while an Action is processing, and when it's done you can provide an [Outcome](/core-concepts/jobs#job-outcomes), which allows you to show a success message, automatically [download a generated file](/core-concepts/jobs#file-downloads), or [forward the user](/core-concepts/jobs#internal-navigation) to a generated Document. For complete implementation details, see our [Using Actions guide](/guides/using-actions). 
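As a sketch of that two-step process, here is an illustrative `export-to-crm` Action alongside the Listener filter that would pick up the Job it creates. The operation name and the `handleExport` parameter are placeholders, not part of any Flatfile API:

```javascript
// Step 1: define the Action in your Blueprint (mounted on a Workbook here).
const exportAction = {
  operation: "export-to-crm", // unique identifier within this resource
  label: "Export to CRM",     // button text shown to the user
  mode: "foreground",
  primary: true,
};

// Step 2: handle the Job the Action creates. Jobs from Workbook-mounted
// Actions arrive with the name "workbook:<operation>".
function exportListener(listener, handleExport) {
  listener.on("job:ready", { job: "workbook:export-to-crm" }, async (event) => {
    // A real handler would ack, update, and complete the Job via @flatfile/api
    await handleExport(event.context);
  });
}
```

The two halves are linked only by the operation name, which is why keeping it unique within the resource matters.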
## Types of Actions

### Built-in Actions

Resources in Flatfile come with several default built-in actions like:

* Export/download data
* Delete data or files
* Find and replace (Sheets)

### Developer-Created Actions

You can create custom Actions to handle operations specific to your workflow, such as:

* Sending data to your API when data is ready
* Downloading your data in a specific format
* Validating data against external systems
* Moving data between different resources
* Custom data validations and transformations

## Where Actions Appear

Actions appear in different parts of the UI depending on where they're mounted:

* **Workbook Actions**: Buttons in the top-right corner of Workbooks
* **Sheet Actions**: Dropdown menu in the Sheet toolbar (or top-level button if marked as `primary`)
* **Document Actions**: Buttons in the top-right corner of Documents
* **File Actions**: Dropdown menu for each file in the Files list

## Example Action Configuration

Every Action requires an `operation` (unique identifier) and `label` (display text):

```javascript
{
  operation: "submitActionBg",
  mode: "background",
  label: "Submit",
  type: "string",
  description: "Submit this data to a webhook.",
  primary: true,
},
```

Actions support additional options like `primary` status, confirmation dialogs, constraints, and input forms. See the [Using Actions guide](/guides/using-actions) for more details.

# Apps

Source: https://flatfile.com/docs/core-concepts/apps

The anatomy of an App

## Apps

Apps are an organizational unit in Flatfile, designed to manage and coordinate data import workflows across different environments. They serve as containers for organizing related Spaces and provide a consistent configuration that can be deployed across your development pipeline.

Apps can be given [namespaces](/guides/namespaces-and-filters#app-namespaces) to isolate different parts of your application and control which [listeners](/core-concepts/listeners) receive events from which spaces.
Apps are available across Development-level environments by default, and optionally available across Production environments with a configuration option.

# Blueprints

Source: https://flatfile.com/docs/core-concepts/blueprints

Define your data schema to structure exactly how data should look, behave, and connect

## What is a Blueprint?

Blueprints enable you to create repeatable, reliable data import experiences that scale with your needs while maintaining data quality and user experience standards.

A Blueprint is your complete data definition in Flatfile. It controls how your data should look, behave, and connect—from simple field validations (like `unique` and `required`) to complex [relationships](/core-concepts/fields#reference) between [sheets](/core-concepts/sheets). You can even create [filtered reference fields](/core-concepts/fields#reference-field-filtering) that dynamically control available dropdown options based on other field values. Think of it as an intelligent template that ensures you collect the right data in the right format, every time.

**Terminology Note**: "Blueprint" is Flatfile's term for what might be called a "schema" in other systems. Throughout Flatfile's documentation and API, we use "Blueprint" as the standard term for data structure definitions to distinguish Flatfile's comprehensive data modeling approach from generic schema concepts.

## How Blueprints Work

Every [Space](/core-concepts/spaces) has exactly one Blueprint that defines its data structure. Whenever a new space is created, the Flatfile Platform automatically triggers a `space:configure` [Job](/core-concepts/jobs), and you can configure a [Listener](/core-concepts/listeners) to pick up that job and configure the new space by defining its Blueprint. Creating workbooks, sheets, and actions **is** your Blueprint definition, establishing the data schema that will govern all data within that Space.
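Sketched minimally, that flow looks like this. It is a simplified sketch: the `createWorkbook` parameter stands in for the `api.workbooks.create` call from `@flatfile/api`, and full Job-lifecycle handling is omitted:

```javascript
// Minimal sketch: respond to the automatic space:configure Job by
// creating a Workbook. That creation *is* the Blueprint definition.
// `createWorkbook` is a stand-in for api.workbooks.create.
function spaceConfigureListener(listener, createWorkbook) {
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    const { spaceId } = event.context;

    // Define the Space's Blueprint: one Workbook, one Sheet, one Field
    await createWorkbook({
      spaceId,
      name: "My Workbook",
      sheets: [
        {
          name: "Users",
          slug: "users",
          fields: [{ key: "email", type: "string", label: "Email" }],
        },
      ],
    });
  });
}
```

In a real Listener you would also `ack`, `update`, and `complete` (or `fail`) the Job, as shown in the coding tutorial.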
To make that part easier, we have provided the [Space Configure Plugin](/plugins/space-configure) to abstract away the Job/Listener code, allowing you to focus on what matters: preparing your space for data.

## Basic Blueprint Structure

* A [Blueprint](/core-concepts/blueprints) defines the data structure for any number of [Spaces](/core-concepts/spaces)
* A [Space](/core-concepts/spaces) may contain many [Workbooks](/core-concepts/workbooks) and many [Documents](/core-concepts/documents)
* A [Document](/core-concepts/documents) contains static documentation and may contain many [Document-level Actions](/guides/using-actions#document-actions)
* A [Workbook](/core-concepts/workbooks) may contain many [Sheets](/core-concepts/sheets) and many [Workbook-level Actions](/guides/using-actions#workbook-actions)
* A [Sheet](/core-concepts/sheets) may contain many [Fields](/core-concepts/fields) and many [Sheet-level Actions](/guides/using-actions#sheet-actions)
* A [Field](/core-concepts/fields) defines a single column of data, and may contain many [Field-level Actions](/guides/using-actions#field-actions)

**A note about Actions:** Actions also require a listener to respond to the event published by clicking on them. For more, see [Using Actions](/guides/using-actions)

## Example Blueprint Configuration

**Recommendation:** Although throughout the documentation we'll be explicitly defining each level of a blueprint, it's important to note that you can split each of your **Workbooks**, **Sheets**, **Documents**, and **Actions** definitions into separate files and import them. Then your Workbook blueprint can be as simple as:

```javascript
const companyWorkbook = {
  name: "Company Workbook",
  documents: [dataProcessingSteps],
  sheets: [usersSheet],
  actions: [exportToCRM],
};
```

This leads to a more maintainable codebase, and the modularity opens the door for code reuse.
For instance, you'll be able to use `usersSheet.slug` in your listener code to filter or differentiate between sheets, or re-use `exportToCRM` in any other workbook that needs to export data to a CRM.

This example shows a Blueprint definition for [Space configuration](/core-concepts/spaces#space-configuration). It creates a single [Workbook](/core-concepts/workbooks) with a single [Document](/core-concepts/documents) and a single [Sheet](/core-concepts/sheets) containing two [Fields](/core-concepts/fields) and one [Action](/core-concepts/actions).

```javascript
const workbooks = [{
  name: "Company Workbook",
  documents: [
    {
      title: "Data Processing Walkthrough",
      body: "1. Add Data\n2. Process Data\n3. Export Data",
      actions: [
        {
          operation: "confirm",
          label: "Confirm",
          type: "string",
          primary: true,
        },
      ],
    },
  ],
  sheets: [
    {
      name: "Users",
      slug: "users",
      fields: [
        {
          key: "fname",
          type: "string",
          label: "First Name",
        },
        {
          key: "lname",
          type: "string",
          label: "Last Name",
        },
      ],
      actions: [
        {
          operation: "validate-inventory",
          mode: "background",
          label: "Validate Inventory",
          description: "Check product availability against inventory system",
        },
      ],
    },
  ],
  actions: [
    {
      operation: "export-to-crm",
      mode: "foreground",
      label: "Export to CRM",
      description: "Send validated customers to Salesforce",
    },
  ],
}];
```

## Workbook Folders and Sheet Collections

Although they have no impact on your data itself or its structure, [Workbook Folders](/core-concepts/workbooks#folders) and [Sheet Collections](/core-concepts/sheets#collections) are a powerful way to organize your data in the Flatfile UI. They are essentially named labels that you assign to your Workbooks and Sheets, which the Flatfile UI interprets to group them together (and apart from others).
You can define them directly in your [Blueprint](/core-concepts/blueprints) when [configuring your Space](/core-concepts/spaces#space-configuration) or when otherwise creating or updating a Workbook or Sheet via the [API](https://reference.flatfile.com).

You can think of **Folders** and **Collections** like a filing system:

* [Folders](/core-concepts/workbooks#folders) help you organize your Workbooks within a Space (like organizing binders on a shelf).
* [Collections](/core-concepts/sheets#collections) help you organize Sheets within each Workbook (like organizing tabs within a binder).

This is a great way to declutter your Sidebar and keep your data organized and easy to find in the Flatfile UI.

In the following example, we have several Workbooks grouped into two Folders:

* **Analytics** (folded)
* **Business Operations** (unfolded)

The **Business Operations** Workbooks each contain several Sheets grouped into Collections:

* **Compensation** and **Personnel**
* **Stock Management** and **Vendor Management**

```javascript
const salesReportWorkbook = {
  name: "Sales Analytics",
  folder: "Analytics",
  sheets: [
    // Source Data collection (2 sheets)
    salesDataSheet,
    revenueSheet,
    // Analytics collection (2 sheets)
    campaignMetricsSheet,
    leadSourcesSheet
  ]
};

const humanResourcesWorkbook = {
  name: "Human Resources Management",
  folder: "Business Operations",
  sheets: [
    // Personnel collection (2 sheets)
    employeesSheet,
    departmentsSheet,
    // Compensation collection (2 sheets)
    payrollSheet,
    benefitsSheet
  ]
};

const operationsWorkbook = {
  name: "Operations Management",
  folder: "Business Operations",
  sheets: [
    // Stock Management collection (2 sheets)
    inventorySheet,
    warehousesSheet,
    // Vendor Management collection (2 sheets)
    suppliersSheet,
    purchaseOrdersSheet
  ]
};
```

```javascript
const salesDataSheet = { name: "Sales Data", collection: "Source Data", fields: [ { key: "name", type: "string", label: "Customer Name" }, { key: "email", type: "string", label: "Email Address" } ] };
const revenueSheet = { name: "Revenue", collection: "Analytics", fields: [ { key: "revenue", type: "number", label: "Revenue" } ] }; const campaignMetricsSheet = { name: "Campaign Metrics", collection: "Analytics", fields: [ { key: "impressions", type: "number", label: "Impressions" }, { key: "clicks", type: "number", label: "Clicks" } ] }; const leadSourcesSheet = { name: "Lead Sources", collection: "Analytics", fields: [ { key: "source", type: "string", label: "Source" }, { key: "conversion_rate", type: "number", label: "Conversion Rate" } ] }; const employeesSheet = { name: "Employees", collection: "Personnel", fields: [ { key: "employee_id", type: "string", label: "Employee ID" }, { key: "name", type: "string", label: "Full Name" }, { key: "department", type: "string", label: "Department" }, { key: "hire_date", type: "date", label: "Hire Date" } ] }; const departmentsSheet = { name: "Departments", collection: "Personnel", fields: [ { key: "dept_code", type: "string", label: "Department Code" }, { key: "dept_name", type: "string", label: "Department Name" }, { key: "manager", type: "string", label: "Manager" } ] }; const positionsSheet = { name: "Job Positions", collection: "Personnel", fields: [ { key: "position_id", type: "string", label: "Position ID" }, { key: "title", type: "string", label: "Job Title" }, { key: "level", type: "string", label: "Job Level" }, { key: "department", type: "string", label: "Department" } ] }; const payrollSheet = { name: "Payroll", collection: "Compensation", fields: [ { key: "employee_id", type: "string", label: "Employee ID" }, { key: "salary", type: "number", label: "Annual Salary" }, { key: "bonus", type: "number", label: "Bonus" } ] }; const benefitsSheet = { name: "Benefits", collection: "Compensation", fields: [ { key: "benefit_type", type: "string", label: "Benefit Type" }, { key: "cost", type: "number", label: "Monthly Cost" }, { key: "coverage", type: "string", label: "Coverage Level" } ] }; const bonusesSheet = { 
name: "Performance Bonuses", collection: "Compensation", fields: [ { key: "employee_id", type: "string", label: "Employee ID" }, { key: "performance_rating", type: "string", label: "Performance Rating" }, { key: "bonus_amount", type: "number", label: "Bonus Amount" }, { key: "quarter", type: "string", label: "Quarter" } ] }; const attendanceSheet = { name: "Attendance", collection: "Time Tracking", fields: [ { key: "employee_id", type: "string", label: "Employee ID" }, { key: "date", type: "date", label: "Date" }, { key: "hours_worked", type: "number", label: "Hours Worked" }, { key: "overtime", type: "number", label: "Overtime Hours" } ] }; const leaveRequestsSheet = { name: "Leave Requests", collection: "Time Tracking", fields: [ { key: "request_id", type: "string", label: "Request ID" }, { key: "employee_id", type: "string", label: "Employee ID" }, { key: "leave_type", type: "string", label: "Leave Type" }, { key: "start_date", type: "date", label: "Start Date" }, { key: "end_date", type: "date", label: "End Date" } ] }; const inventorySheet = { name: "Inventory", collection: "Stock Management", fields: [ { key: "sku", type: "string", label: "SKU" }, { key: "product_name", type: "string", label: "Product Name" }, { key: "quantity", type: "number", label: "Quantity in Stock" }, { key: "reorder_level", type: "number", label: "Reorder Level" } ] }; const warehousesSheet = { name: "Warehouses", collection: "Stock Management", fields: [ { key: "warehouse_id", type: "string", label: "Warehouse ID" }, { key: "location", type: "string", label: "Location" }, { key: "capacity", type: "number", label: "Storage Capacity" }, { key: "manager", type: "string", label: "Warehouse Manager" } ] }; const stockMovementsSheet = { name: "Stock Movements", collection: "Stock Management", fields: [ { key: "movement_id", type: "string", label: "Movement ID" }, { key: "sku", type: "string", label: "SKU" }, { key: "quantity", type: "number", label: "Quantity" }, { key: "movement_type", 
type: "string", label: "Movement Type" }, { key: "date", type: "date", label: "Date" } ] }; const suppliersSheet = { name: "Suppliers", collection: "Vendor Management", fields: [ { key: "supplier_id", type: "string", label: "Supplier ID" }, { key: "company_name", type: "string", label: "Company Name" }, { key: "contact_person", type: "string", label: "Contact Person" }, { key: "email", type: "string", label: "Email" } ] }; const purchaseOrdersSheet = { name: "Purchase Orders", collection: "Vendor Management", fields: [ { key: "order_id", type: "string", label: "Order ID" }, { key: "supplier_id", type: "string", label: "Supplier ID" }, { key: "order_date", type: "date", label: "Order Date" }, { key: "total_amount", type: "number", label: "Total Amount" } ] }; const vendorPerformanceSheet = { name: "Vendor Performance", collection: "Vendor Management", fields: [ { key: "supplier_id", type: "string", label: "Supplier ID" }, { key: "on_time_delivery", type: "number", label: "On-Time Delivery %" }, { key: "quality_rating", type: "number", label: "Quality Rating" }, { key: "cost_competitiveness", type: "number", label: "Cost Rating" } ] }; ``` # Documents Source: https://flatfile.com/docs/core-concepts/documents Standalone webpages within Flatfile Spaces for guidance and dynamic content Documents are standalone webpages for your Flatfile [Spaces](/core-concepts/spaces). They can be rendered from [Markdown syntax](https://www.markdownguide.org/basic-syntax/). Often used for getting started guides, Documents become extremely powerful with dynamically generated content that stays updated as Events occur. Flatfile also allows you to use HTML tags in your Markdown-formatted text. This is helpful if you prefer certain HTML tags rather than Markdown syntax. Links in documents (both Markdown and HTML) automatically open in a new tab to ensure users don't navigate away from the Flatfile interface. 
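For instance, a Document body can freely mix the two. A small sketch (the final `api.documents.create` call is commented out because `spaceId` and `api` are only available inside a Listener):

```javascript
// Sketch: a Document body mixing Markdown with inline HTML.
const body =
  "# Getting Started\n" +
  "Follow these steps:\n" +
  "1. Upload your file\n" +
  "2. Review your data\n" +
  "<hr />\n" + // HTML tags work alongside Markdown syntax
  '<a href="https://example.com/help">Need help?</a>'; // opens in a new tab

// Inside a Listener, you would then create the Document:
// await api.documents.create(spaceId, { title: "Getting Started", body });
```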
## Key Features **A note on Documents:** While Documents themselves can be created and updated [dynamically](/core-concepts/documents#dynamic-content), the content inside of a document should be considered to be *static* - that is, you cannot use documents to host interactive elements or single-page webforms. For that sort of functionality, we recommend using [Actions](/core-concepts/actions) to trigger a [Listener](/core-concepts/listeners) to perform the desired functionality. ### Markdown-Based Content Documents support GitHub-flavored Markdown, allowing you to create rich, formatted content with headers, lists, code blocks, and more. You can also use HTML tags within your Markdown for additional formatting flexibility. ### Dynamic Content Documents can be created and updated programmatically in response to Events, enabling dynamic content that reflects the current state of your Space or data processing workflow. ### Document Actions Add interactive buttons to your Documents that trigger custom operations. [Actions](/core-concepts/actions) appear in the top right corner and can be configured with different modes, confirmations, and tooltips. ### Embedded Blocks Documents support embedding interactive data blocks (Workbooks, Sheets, and Diffs) directly within the content. See the [Adding Blocks to Documents](#adding-blocks-to-documents) section for detailed implementation. 
## Create a Document

You can create Documents upon Space creation using the [Space Configure Plugin](/plugins/space-configure), or dynamically in a [Listener](/core-concepts/listeners) using the API:

```javascript
import api from "@flatfile/api";

export default function flatfileEventListener(listener) {
  listener.on("file:created", async ({ context: { spaceId, fileId } }) => {
    const fileName = (await api.files.get(fileId)).data.name;

    const bodyText =
      "# Welcome\n" +
      "### Say hello to your first customer Space in the new Flatfile!\n" +
      "Let's begin by first getting acquainted with what you're seeing in your Space initially.\n" +
      "---\n" +
      `Your uploaded file, ${fileName}, is located in the Files area.`;

    const doc = await api.documents.create(spaceId, {
      title: "Getting Started",
      body: bodyText,
    });
  });
}
```

This Document will now appear in the sidebar of your Space. Learn how to [customize the guest sidebar](/guides/customize-guest-sidebar) for different user types.

In this example, we create a Document when a file is uploaded, but you can also create Documents in response to any other Event. [Read more](/reference/events) about the different Events you can respond to.

## Document Actions

Actions are optional and allow you to run custom operations in response to a user-triggered event from within a Document.
Define Actions on a Document using the `actions` parameter when a document is created:

```javascript
import api from "@flatfile/api";

export default function flatfileEventListener(listener) {
  listener.on("file:created", async ({ context: { spaceId, fileId } }) => {
    const fileName = (await api.files.get(fileId)).data.name;

    const bodyText =
      "# Welcome\n" +
      "### Say hello to your first customer Space in the new Flatfile!\n" +
      "Let's begin by first getting acquainted with what you're seeing in your Space initially.\n" +
      "---\n" +
      `Your uploaded file, ${fileName}, is located in the Files area.`;

    const doc = await api.documents.create(spaceId, {
      title: "Getting Started",
      body: bodyText,
      actions: [
        {
          label: "Submit",
          operation: "contacts:submit",
          description: "Would you like to submit the contact data?",
          tooltip: "Submit the contact data",
          mode: "foreground",
          primary: true,
          confirm: true,
        },
      ],
    });
  });
}
```

Then configure your listener to handle this Action, and define what should happen in response. Read more about Actions and how to handle them in our [Using Actions guide](/guides/using-actions).

Actions appear as buttons in the top right corner of your Document.

## Document treatments

Documents have an optional `treatments` parameter which takes an array of treatments for your Document. Treatments can be used to categorize your Document. Certain treatments will cause your Document to look or behave differently.

### Ephemeral documents

Giving your Document a treatment of `"ephemeral"` will cause the Document to appear as a full-screen takeover, and it will not appear in the sidebar of your Space like other Documents. You can use ephemeral Documents to create a more focused experience for your end users.

```javascript
const ephemeralDoc = await api.documents.create(spaceId, {
  title: "Getting started",
  body: "# Welcome ...",
  treatments: ["ephemeral"],
});
```

Currently, `"ephemeral"` is the only treatment that will change the behavior of your Document.
## Adding Blocks to Documents

Blocks are dynamic, embedded entities that you can use to display data inside a Document. You can add a Block to a Document using the `<embed>` HTML entity in your markdown and specifying which Block type you want to show using the `type` attribute on the entity.

Three Block types are currently supported: Embedded Workbook, Embedded Sheet, and Embedded Diff.

### Embedded Workbook

Use this Block to render an entire Workbook with all its Sheets inside a Document, providing users with tabbed navigation between sheets. You can embed a Workbook by passing a workbook ID and optional name. You can also control whether the embedded Workbook is expanded when the document loads and whether to show the header.

```javascript
const doc = await api.documents.create(spaceId, {
  title: "Getting started",
  body:
    "# Welcome\n" +
    "\n" +
    "Here is an embedded Workbook:\n" +
    "\n" +
    "<embed type='embedded-workbook' workbookId='us_wb_123' name='My Workbook' defaultExpanded='true'>\n" +
    "\n" +
    "Here is another embedded Workbook without header:\n" +
    "\n" +
    "<embed type='embedded-workbook' workbookId='us_wb_456' showHeader='false'>",
});
```

**Properties:**

* `workbookId` (required): The ID of the workbook to embed
* `name` (optional): Display name for the embedded workbook
* `defaultExpanded` (optional): Whether the workbook is expanded when the document loads (defaults to false)
* `showHeader` (optional): Whether to show the workbook header (defaults to true). When false, the workbook is automatically expanded

### Embedded Sheet

Use this Block to render a Sheet along with all its data inside of a Document. You can embed a Sheet into your Document by passing a sheet ID, workbook ID, and name. You can also specify whether the embedded Sheet is expanded or collapsed when the document is loaded, and whether to show the header.

You can include as many embedded Sheets in your Document as you like, but end users will only be able to expand a maximum of 10 embedded Sheets at once.
```javascript
const doc = await api.documents.create(spaceId, {
  title: "Getting started",
  body:
    "# Welcome\n" +
    "\n" +
    "Here is an embedded Sheet:\n" +
    "\n" +
    "<embed type='embedded-sheet' sheetId='us_sh_123' workbookId='us_wb_123' name='Contacts' defaultExpanded='true'>\n" +
    "\n" +
    "Here is another embedded Sheet without header:\n" +
    "\n" +
    "<embed type='embedded-sheet' sheetId='us_sh_456' workbookId='us_wb_123' showHeader='false'>",
});
```

**Properties:**

* `sheetId` (required): The ID of the sheet to embed
* `workbookId` (required): The ID of the workbook containing the sheet
* `name` (optional): Display name for the embedded sheet
* `defaultExpanded` (optional): Whether the sheet is expanded when the document loads (defaults to false)
* `showHeader` (optional): Whether to show the sheet header (defaults to true). When false, the sheet is automatically expanded

### Embedded Diff

Use this Block to show a side-by-side comparison of the data in a Sheet now versus at a previous point in time as captured by a Snapshot. Pass a Sheet ID, Workbook ID, and Snapshot ID.

You can optionally pass a `direction` attribute which specifies whether the changes are displayed with the Snapshot as the end state (`sheet_to_snapshot`) or the Sheet as the end state (`snapshot_to_sheet`). The default value for direction is `sheet_to_snapshot`.

Use `direction="sheet_to_snapshot"` if you want to show changes that have been made since the time the Snapshot was taken, i.e. to review past changes. Use `direction="snapshot_to_sheet"` if you want to preview the changes that would occur if you were to revert your Sheet back to the state it was in when the Snapshot was taken.

```javascript
const doc = await api.documents.create(spaceId, {
  title: "Getting started",
  body:
    "# Welcome\n" +
    "\n" +
    "Here is an embedded Diff:\n" +
    "\n" +
    "<embed type='embedded-diff' sheetId='us_sh_123' workbookId='us_wb_123' snapshotId='us_ss_123'>",
});
```

**Dynamic Enums**

This feature, along with the `ENUM_REFERENCE` [sheet treatment](/core-concepts/sheets#reference-sheets), may be collectively referred to as **Dynamic Enums**.
By combining these two features, you can create a drop-down list for any cell in your sheet that's dynamically controlled by the value of another field in the same record – and to the end-user, it will just work like a dynamically-configured `enum` field.

**Filter Configuration**

The `filter` property accepts a `ReferenceFilter` object with two required properties:

| Property      | Type     | Description                                            |
| ------------- | -------- | ------------------------------------------------------ |
| `refField`    | `string` | The field key in the referenced sheet to filter with   |
| `recordField` | `string` | The field key in the current record used for filtering |

**How Filtering Works**

When a filter is applied:

1. The system looks at the value in the `recordField` of the current record
2. It then filters the referenced sheet to only show records where `refField` matches that value
3. Only the filtered records become available as options in the reference field

**Example: Country and State Cascading Dropdown**

Consider a scenario where you want state options to be filtered based on the selected country. This example also demonstrates the use of the `ENUM_REFERENCE` [sheet treatment](/core-concepts/sheets#reference-sheets) to hide the reference sheet from the UI. You may wish to disable this treatment for testing purposes.
```json
{
  "sheets": [
    {
      "name": "Reference Data",
      "slug": "ref-data",
      "treatments": ["ENUM_REFERENCE"],
      "fields": [
        { "key": "country-name", "label": "Country", "type": "string" },
        { "key": "state-name", "label": "State/Province", "type": "string" }
      ]
    },
    {
      "name": "Addresses",
      "slug": "addresses",
      "fields": [
        {
          "key": "country",
          "label": "Country",
          "type": "reference",
          "config": { "ref": "ref-data", "key": "country-name" }
        },
        {
          "key": "state",
          "label": "State/Province",
          "type": "reference",
          "config": {
            "ref": "ref-data",
            "key": "state-name",
            "filter": {
              "refField": "country-name",
              "recordField": "country"
            }
          }
        }
      ]
    }
  ]
}
```

**Reference Data Sheet**:

| country-name | state-name |
| ------------ | ---------- |
| USA          | California |
| USA          | New York   |
| USA          | Texas      |
| Canada       | Ontario    |
| Canada       | Quebec     |

**Behavior**

* When **USA** is selected in the country field, the state dropdown will only show **California**, **New York**, and **Texas**
* When **Canada** is selected in the country field, the state dropdown will only show **Ontario** and **Quebec**

The following diagram illustrates how reference field filtering works:

```mermaid
graph TD
  A[User selects **USA**] --> B[System filters references]
  B --> C[Show only States where **country-name** = **USA**]
  C --> D[**California**, **New York**, and **Texas** are available]
  E[User selects **Canada**] --> F[System filters references]
  F --> G[Show only States where **country-name** = **Canada**]
  G --> H[**Ontario** and **Quebec** are available]
  style A stroke:#4CD95E, stroke-width:2px
  style E stroke:#4CD95E, stroke-width:2px
  style D stroke:#D9804E, stroke-width:2px
  style H stroke:#D9804E, stroke-width:2px
```

And this is how it looks in the UI:

![Reference Field Filtering](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/core-concepts/assets/usa-filter.png)

![Reference Field Filtering](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/core-concepts/assets/canada-filter.png)

![Reference Field
Filtering](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/core-concepts/assets/invalid-filter.png)

### `reference-list`

Defines multiple references to records in another sheet within the same workbook.

**config.ref** The sheet slug of the referenced field. Must be in the same workbook.

**config.key** The key of the property to use as the reference key.

**config.filter** Optional filter to narrow the set of records in the reference sheet used as valid values.

```json
{
  "key": "authors",
  "type": "reference-list",
  "label": "Book Authors",
  "config": { "ref": "authors", "key": "name" }
}
```

#### Reference List Filtering

The `filter` property on `reference-list` fields works identically to `reference` [field filters](#reference-field-filtering), but allows for multiple selections.

```json
{
  "key": "categories",
  "label": "Product Categories",
  "type": "reference-list",
  "config": {
    "ref": "category-data",
    "key": "category-name",
    "filter": {
      "refField": "department",
      "recordField": "product-department"
    }
  }
}
```

**Multi-Level Cascading Example**

You can create complex, multi-level cascading dropdowns by chaining filtered reference fields together.
This example shows a product taxonomy where department selection filters available categories, and category selection filters available subcategories:

```json
{
  "sheets": [
    {
      "name": "Product Taxonomy",
      "slug": "taxonomy",
      "fields": [
        { "key": "department", "label": "Department", "type": "string" },
        { "key": "category", "label": "Category", "type": "string" },
        { "key": "subcategory", "label": "Subcategory", "type": "string" }
      ]
    },
    {
      "name": "Products",
      "slug": "products",
      "fields": [
        {
          "key": "department",
          "label": "Department",
          "type": "reference",
          "config": { "ref": "taxonomy", "key": "department" }
        },
        {
          "key": "category",
          "label": "Category",
          "type": "reference",
          "config": {
            "ref": "taxonomy",
            "key": "category",
            "filter": { "refField": "department", "recordField": "department" }
          }
        },
        {
          "key": "subcategory",
          "label": "Subcategory",
          "type": "reference-list",
          "config": {
            "ref": "taxonomy",
            "key": "subcategory",
            "filter": { "refField": "category", "recordField": "category" }
          }
        }
      ]
    }
  ]
}
```

**Product Taxonomy Sheet**:

| department  | category    | subcategory   |
| ----------- | ----------- | ------------- |
| Electronics | Computers   | Laptops       |
| Electronics | Computers   | Desktops      |
| Electronics | Computers   | Tablets       |
| Electronics | Audio       | Headphones    |
| Electronics | Audio       | Speakers      |
| Clothing    | Men's       | Shirts        |
| Clothing    | Men's       | Pants         |
| Clothing    | Women's     | Dresses       |
| Clothing    | Women's     | Shoes         |
| Books       | Fiction     | Novels        |
| Books       | Fiction     | Short Stories |
| Books       | Non-Fiction | Biography     |
| Books       | Non-Fiction | History       |

**Behavior**

This creates a three-level cascade: Department → Category → Subcategory, where users can select multiple subcategories from the filtered options.

* When **Electronics** is selected in the department field, the category dropdown will only show **Computers** and **Audio**
* When **Computers** is selected, the subcategory dropdown will show **Laptops**, **Desktops**, and **Tablets**.
Users may then select *multiple* subcategories from the filtered options
* When **Clothing** is selected, the category dropdown will only show **Men's** and **Women's**
* When **Men's** is selected, the subcategory dropdown will show **Shirts** and **Pants**. Users may then select *multiple* subcategories from the filtered options

## Field Constraints

Field constraints are system-level validation rules that enforce data integrity and business logic on data in individual fields.

### required

Ensures that a field must have a non-null value. Empty cells are considered null values and will fail this constraint.

```json
{
  "key": "email",
  "type": "string",
  "label": "Email Address",
  "constraints": [{ "type": "required" }]
}
```

By default, if a required constraint fails, an error will be added to the field with the message "\[Field Label] is required". You can override the message and/or the error level of the message by supplying a `config` object with the constraint. For example:

```json
{
  "key": "email",
  "type": "string",
  "label": "Email Address",
  "constraints": [
    {
      "type": "required",
      "config": {
        "message": "This record is missing an email address",
        "level": "warn"
      }
    }
  ]
}
```

### unique

Ensures that field values appear only once across all records in the sheet. Note that null values can appear multiple times as they don't count toward uniqueness validation.

```json
{
  "key": "employeeId",
  "type": "string",
  "label": "Employee ID",
  "constraints": [{ "type": "unique" }]
}
```

By default, if a unique constraint fails, an error will be added to the field with the message "Value is not unique". You can override the message and/or the error level of the message by supplying a `config` object with the constraint -- see the example with the required constraint above.

### computed

Marks a field as computed, hiding it from the mapping process. Users will not be able to map imported data to fields with this constraint.
```json
{
  "key": "calculatedField",
  "type": "string",
  "label": "Calculated Value",
  "constraints": [{ "type": "computed" }]
}
```

Sheet-level constraints that apply to multiple fields are covered in [Sheet Constraints](/core-concepts/sheets#sheet-constraints).

### Field-level access

**`readonly`**

On a field level you can restrict a user's interaction with the data to `readonly`. This feature is useful if you're inviting others to view uploaded data, but do not want to allow them to edit that field.

```json
{
  "fields": [
    {
      "key": "salary",
      "type": "number",
      "readonly": true
    }
  ]
}
```

## Field Options

Configurable properties for a Field that define its structure and behavior:

| Option               | Type    | Required | Default | Description |
| -------------------- | ------- | -------- | ------- | ----------- |
| **key**              | string  | ✓        |         | The system name of this field. Primarily informs JSON and egress structures |
| **type**             | string  | ✓        |         | One of `string`, `number`, `boolean`, `date`, `enum`, `reference`, `string-list`, `enum-list`, `reference-list`. Defines the handling of this property |
| **label**            | string  |          | key     | A user-facing descriptive label designed to be displayed in the UI such as a table header |
| **description**      | string  |          |         | A long form description of the property intended to be displayed to an end user (supports Markdown) |
| **constraints**      | array   |          | \[]     | An array of system level validation rules (max 10). [Learn more about field constraints](#field-constraints) |
| **config**           | object  |          | {}      | Configuration relevant to the type of column. See type documentation below |
| **readonly**         | boolean |          | false   | Prevents user input on this field |
| **appearance**       | object  |          | {}      | UI appearance settings. Currently supports `size` property with values: `"xs"`, `"s"`, `"m"`, `"l"`, `"xl"` |
| **actions**          | array   |          | \[]     | User actions available for this field. See [Field Actions](/guides/using-actions#field-actions) for detailed configuration |
| **alternativeNames** | array   |          | \[]     | Alternative field names for mapping assistance |
| **metadata**         | object  |          | {}      | Arbitrary object of values to pass through to hooks and egress |

# Jobs

Source: https://flatfile.com/docs/core-concepts/jobs

Discrete tasks executed asynchronously in response to events

In Flatfile, a Job represents a large unit of work performed asynchronously on a resource such as a file, [Workbook](/core-concepts/workbooks), or [Sheet](/core-concepts/sheets). The Jobs workflow provides visibility into the status and progress of your Jobs, allowing you to monitor and troubleshoot the data processing pipeline. For handling large datasets, see our [multi-part jobs guide](/guides/multi-part-jobs).

In these examples, we'll show the full Job Listener lifecycle implementation, complete with `ack` to acknowledge the job, `update` to update the job's progress, and `complete` or `fail` to complete or fail the job.

To make this simpler, we provide a plugin called [Job Handler](/plugins/job-handler) that handles the job lifecycle for you. This plugin works by listening to the `job:ready` event and executing the handler callback. There is also an optional `tick` function which allows you to update the Job's progress. For example, this listener implementation contains all the code necessary to handle a job:

```typescript
listener.use(
  jobHandler("workbook:export", async ({ context: { fileId, jobId } }, tick) => {
    const file = await api.files.get(fileId);
    tick(10, "Getting started.");
    // export logic here
    tick(90, "Exported.");
    return {
      outcome: {
        message: "Exporting file contents is complete.",
      },
    };
  })
);
```

## Job Lifecycle

### Lifecycle Events

Jobs fire the following Events during their lifecycle.
In chronological order, the Job Events are:

| Event                           | Description |
| ------------------------------- | ----------- |
| `job:created`                   | Fires when a Job is created, but before it does anything. |
| `job:ready`                     | Fires when a Job is ready to move into the execution stage, but before it does anything. |
| `job:updated`                   | Fires when there is an update to a Job while it is executing. |
| `job:completed` OR `job:failed` | `job:completed` fires when a Job is completed successfully, `job:failed` fires if a Job is completed but fails. One of these events will fire upon Job completion, but never both. |
| `job:outcome-acknowledged`      | Fires when a user acknowledges the completion of a Job through a UI popup. |

You can listen on any of these events in your [Listener](/core-concepts/listeners), but the most common event to listen for is `job:ready`.

## Types of Jobs

Jobs can be triggered in a number of ways, most commonly in response to user activity. Jobs are then managed via [Listeners](/core-concepts/listeners), which receive [Events](/core-concepts/listeners#events) published by Flatfile in response to activity.

There are three types of Jobs on the Flatfile Platform:

### Action-Based Jobs

Actions are developer-defined operations that can be mounted on a number of domains (including Sheets, Workbooks, Documents, and Files). Mounting an Action means attaching a custom operation to that domain. That operation can then be triggered by a user event (clicking a button or selecting a menu item).

When an Action is triggered, a `job:ready` Event for a Job named `[domain]:[operation]` is published. Your [Listener](/core-concepts/listeners) can then be configured to respond to that Action via its Event.

To run an Action-based job, two configurations are necessary.
First, create an [Action](/core-concepts/actions) on a domain. Here's an example of a Workbook containing an Action:

```typescript
api.workbooks.create({
  name: "October Report Workbook",
  actions: [
    {
      label: "Export Data",
      description: "Send data to destination system",
      operation: "export",
      type: "file",
    },
  ],
});
```

Then, create a [Listener](/core-concepts/listeners) to respond to the Action. This listener should listen for the `job:ready` event, filtered by the `domain:operation` Job - where, in this case, `workbook` is the domain and `export` is the operation.

```typescript
listener.on(
  "job:ready",
  { job: "workbook:export" },
  async ({ context: { jobId } }) => {
    try {
      await api.jobs.ack(jobId, {
        info: "Starting submit job...",
        progress: 10,
      });

      // Custom code here

      await api.jobs.complete(jobId, {
        outcome: {
          message: "Submit Job was completed successfully.",
        },
      });
    } catch (error) {
      await api.jobs.fail(jobId, {
        outcome: {
          message: "This Job failed.",
        },
      });
    }
  }
);
```

### Custom Jobs

Another trigger option is to create a Custom Job via SDK/API. In the SDK, Jobs are created by calling the `api.jobs.create()` method. Creating a custom Job in your Listener enables any Event to trigger a Job.

Here's an example of creating a custom Job in a Listener:

```typescript
listener.on(
  "commit:created",
  { sheet: "contacts" },
  async ({ context: { workbookId, sheetId } }) => {
    const { data } = await api.jobs.create({
      type: "workbook",
      operation: "myCustomOperation",
      trigger: "immediate",
      source: workbookId,
    });
  }
);
```

Note that the trigger for this Listener is set to immediate, which means that the Job will be created and executed immediately upon the Event firing. Therefore, we should have our Listener ready to respond to this Job:

```typescript
listener.on(
  "job:ready",
  { job: "workbook:myCustomOperation" },
  async ({ context: { jobId, workbookId } }) => {
    try {
      await api.jobs.ack(jobId, {
        info: "Starting my custom operation.",
        progress: 10,
      });

      // Custom code here.
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Successfully completed my custom operation.",
        },
      });
    } catch {
      await api.jobs.fail(jobId, {
        outcome: {
          message: "Custom operation failed.",
        },
      });
    }
  }
);
```

### System Jobs

Internally, Flatfile uses Jobs to power many of the features of the Flatfile Platform, such as extraction, record mutation, and AI Assist. Here are some examples of Jobs that the Flatfile Platform creates and manages on your behalf:

| Job Name        | Description |
| --------------- | ----------- |
| `Extract`       | Extracts data from the specified source. |
| `Map`           | Maps data from its ingress format to Blueprint fields. |
| `DeleteRecords` | Deletes records from a dataset based on specified criteria. |
| `Export`        | Exports data to a specified format or destination. |
| `MutateRecords` | Alters records in a dataset according to defined rules. |
| `Configure`     | Sets up or modifies the configuration of a Space. |
| `AiAssist`      | Utilizes AI to assist with tasks such as data categorization. |
| `FindReplace`   | Searches for specific values and replaces them. |

## Job Parameters

### Required Parameters

When creating a job, the following parameters are required:

* **type** (string) - Workbook, File, Sheet, Space
* **operation** (string) - `export`, `extract`, `map`, `delete`, etc
* **source** (string) - The id of the data source (FileId, WorkbookId, or SheetId)

### Optional Parameters

* **trigger** (string) - `manual` or `immediate`
* **destination** (string) - The id of the data target (if any)
* **status** (string) - `created`, `planning`, `scheduled`, `ready`, `executing`, `complete`, `failed`, `cancelled`
* **progress** (number) - A numerical or percentage value indicating the completion status of the Job
* **estimatedCompletionAt** (date) - An estimated completion time.
The UI will display the estimated processing time in the foreground Job overlay
* **info** (string) - Additional information regarding the Job's current status
* **managed** (string) - Indicates whether the Job is managed by the Flatfile platform or not
* **mode** (string) - `foreground`, `background`, `toolbarBlocking`
* **metadata** (object) - Additional metadata for the Job. You can store any additional information here, such as the IDs of Documents or Sheets created during the execution of Job

Please see our [API Reference](https://reference.flatfile.com/api-reference/jobs/) for details on all possible values.

## Working with Jobs

Jobs can be managed via SDK/API. Commonly, Jobs are acknowledged, progressed, and then completed or failed. Here's a look at those steps.

### Acknowledging Jobs

First, acknowledge a Job. This will update the Job's status to `executing`.

```typescript
await api.jobs.ack(jobId, {
  info: "Starting submit job...",
  progress: 10,
});
```

### Updating Job Progress

Once a Job is acknowledged, you can begin running your custom operation. Jobs were designed to handle large processing loads, but you can easily update your user by updating the Job with a progress value.

```typescript
await api.jobs.update(jobId, {
  progress: 50,
  estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
});
```

`Progress` is a numerical or percentage value indicating the completion status of the work. You may also provide an `estimatedCompletionAt` value which will display your estimate of the remaining processing time in the foreground Job overlay. Additionally, the Jobs Panel will share visibility into the estimated remaining time for acknowledged jobs.

### Completing Jobs

Once a job is complete, you can display an alert to the end user using `outcome`.

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed successfully.
${myData.length} records were processed.`,
    acknowledge: true,
  },
});
```

### Job Outcomes

You can enhance job completion with various outcome options to guide users to their next action:

#### Internal Navigation

Add a button to the dialog that will redirect the user somewhere within a Space using next > Id. In this code below, we will create a button that says "See all downloads" with this path: space/us\_sp\_1234/files?mode=export

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed successfully. ${myData.length} records were processed.`,
    acknowledge: true,
    next: {
      type: "id",
      id: "us_sp_1234",
      path: "files",
      query: "mode=export",
      label: "See all downloads",
    },
  },
});
```

#### External Links

Add a button to the dialog that will redirect the user to an external link using next > Url. In this code below, we will create a button that says "Go to Google". It will open in a new tab.

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed successfully. ${myData.length} records were processed.`,
    acknowledge: true,
    next: {
      type: "url",
      url: "http://www.google.com",
      label: "Go to Google",
    },
  },
});
```

#### File Downloads

Add a button to the dialog that will let the user download a file using next > Download. In this code below, we will create a button that says "Download this file".

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed successfully. ${myData.length} records were processed.`,
    acknowledge: true,
    next: {
      type: "download",
      fileName: "DownloadedFromFlatfile.csv",
      url: "source_of_file.csv",
      label: "Download this file",
    },
  },
});
```

#### Multiple File Downloads

Download files hosted on Flatfile by using next > files. In this code below, we will create a button that says "Download files".
```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `The files should download automatically`,
    next: {
      type: "files",
      label: "Download files",
      files: [{ fileId: "us_fl_123" }],
    },
  },
});
```

#### Snapshots

Add a button to the dialog that will redirect the user to a particular snapshot using next > snapshot. In this code below, we will create a button that says "Go to Snapshot".

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed successfully.`,
    acknowledge: true,
    next: {
      type: "snapshot",
      label: "Go to Snapshot",
      snapshotId: snapshot.id,
      sheetId: sheet.id,
    },
  },
});
```

#### View Management

Dynamically update visible columns in a sheet using next > view. In this code below, we will create a button that says "Hide columns":

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was completed on sheet ${sheet.name}`,
    acknowledge: true,
    next: {
      type: "view",
      label: "Hide columns",
      sheetId: sheet.id,
      hiddenColumns: ["age", "phone", "middleName"], // field keys to hide
    },
  },
});
```

#### Retry Actions

In the event of a failure, you may want to add a button to the dialog that will retry the Job using next > Retry. Provide retry functionality for failed jobs:

```typescript
await api.jobs.complete(jobId, {
  outcome: {
    message: `Operation was not completed successfully. No records were processed.`,
    acknowledge: true,
    next: {
      type: "retry",
      label: "Try again",
    },
  },
});
```

### Failing Jobs

When a job encounters an error, use the fail method:

```typescript
await api.jobs.fail(jobId, {
  outcome: {
    message: "This Job failed due to an error.",
    acknowledge: true,
  },
});
```

# Events and Listeners

Source: https://flatfile.com/docs/core-concepts/listeners

The anatomy of Events and Listeners - core components of the Flatfile Platform

## Events

Flatfile publishes an [Event](/core-concepts/listeners#events) whenever something happens - editing a Record, uploading a file, clicking an Action button.
Events carry structured information about what occurred, including the context and any relevant data. Each Event includes a domain, topic, context, and optional payload:

| Component          | Description                                       | Example |
| ------------------ | ------------------------------------------------- | ------- |
| Domain             | the jurisdiction of the Event                     | Record, Sheet, Workbook, Space, File, Job, Agent |
| Topic              | a combination of a domain and an action           | `workbook:created`, `workbook:updated`, `workbook:deleted` |
| Context            | detailed information about context of the Event   | `{ spaceId: "us_sp_1234", fileId: "us_fl_1234", ... }` |
| Payload (optional) | detailed information about execution of the Event | `{ status: "complete", workbookId: "us_wb_1234", ... }` |

For a complete list of Events published by Flatfile, see [Events Reference](/reference/events).

## Listeners

Listeners are functions you write that respond to Events by executing custom code. They enable all the powerful functionality in your Flatfile implementation: data transformations, validations, integrations, and workflows. Listeners define how your [Spaces](/core-concepts/spaces) behave and what happens when users interact with your data.

Listeners can run locally during development or be deployed as [Agents](/core-concepts/listeners#agents) to Flatfile's cloud. Each Listener connects to a specific [Environment](/core-concepts/environments) using that Environment's API keys as environment variables. To switch between different Environments (or promote a development Environment to production), you simply update your environment variables with the corresponding API keys.

By default, Listeners respond to Events from all [Apps](/core-concepts/apps) within an [Environment](/core-concepts/environments). You can use [Namespaces](/guides/namespaces-and-filters#namespaces) to partition your Listeners into isolated functions that only respond to Events from specific Apps.
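As described above, a topic is simply a domain joined to an action by a colon. A small runnable sketch of that structure (the `parseTopic` helper is hypothetical, for illustration only, not part of the Flatfile SDK):

```javascript
// Hypothetical helper — not part of the Flatfile SDK. Splits an Event topic
// string into the domain and action components described above.
function parseTopic(topic) {
  const [domain, action] = topic.split(":");
  return { domain, action };
}

console.log(parseTopic("workbook:created")); // { domain: "workbook", action: "created" }
```

The same `domain:action` shape is what you match against in `listener.on("workbook:created", ...)`.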
## How Listeners Work

A Listener receives Events as they occur and executes a callback function in response. For example, when a file is uploaded, a Listener can mount an [Action](/core-concepts/actions) button to the file for performing data transformations, or when a [Record](/core-concepts/records) is created, a Listener can validate the data.

The callback function can be as simple as logging the Event or as advanced as integrating with external systems. Here is an example that logs the topic of any incoming Event:

**Note:** The `**` wildcard is a catch-all that will match all Events. This is useful for testing, but should be avoided in production as it can lead to performance issues.

```javascript JavaScript
export default function (listener) {
  listener.on("**", (event) => {
    console.log(`Received event: ${event.topic}`);
  });
}
```

```typescript TypeScript
import { FlatfileListener, FlatfileEvent } from '@flatfile/listener';

export default function (listener: FlatfileListener) {
  listener.on("**", (event: FlatfileEvent) => {
    console.log(`Received event: ${event.topic}`);
  });
}
```

For a more concrete and practical example, follow along with our [Coding Tutorial](/coding-tutorial).

## Listener Methods

Flatfile provides two primary methods for configuring listener behavior:

### `listener.on()`

The `listener.on()` method is the fundamental pattern for responding to specific events. This method allows you to provide a callback function that will be executed when the specified event occurs.

For a complete list of Events published by Flatfile, see [Events Reference](/reference/events).
**Type Signature:**

```typescript
listener.on(
  eventPattern: string,
  callback: (event: FlatfileEvent) => void | Promise<void>
): void

listener.on(
  eventPattern: string,
  filter: object,
  callback: (event: FlatfileEvent) => void | Promise<void>
): void
```

**Examples:**

```javascript JavaScript
// Basic Event handling
listener.on('file:created', async (event) => {
  console.log(`New file uploaded: ${event.context.fileId}`);
});

// Event handling with filters
listener.on('commit:created', { sheet: 'contacts' }, async (event) => {
  console.log('Processing contact data commit...');
  // Process contact data
});

// Job handling
listener.on('job:ready', { job: 'space:configure' }, async (event) => {
  console.log('Configuring Space...');
  // Configure Space
});
```

```typescript TypeScript
import { FlatfileEvent } from '@flatfile/listener';

// Basic Event handling with proper typing
listener.on('file:created', async (event: FlatfileEvent) => {
  console.log(`New file uploaded: ${event.context.fileId}`);
});

// Event handling with filters
listener.on('commit:created', { sheet: 'contacts' }, async (event: FlatfileEvent) => {
  console.log('Processing contact data commit...');
  // Process contact data
});
```

### `listener.use()`

The `listener.use()` method is for building and distributing listener functions in a modular manner. It allows you to send your own Listener functions as a callback to the method, each of which receives a `FlatfileListener` instance as their argument. Inside that callback function, you can use `listener.on()` to register Event handlers or introduce various [Plugins](/core-concepts/plugins) to your listener.
The result can be an `index` file with little more than `listener.use()` calls, each of which distributes your listener function to other places in your codebase:

```javascript JavaScript
// import statements

export default function (listener) {
  // Distribute listener functions to different modules
  listener.use(validateCustomerData);
  listener.use(processFileUploads);
  listener.use(handleDataTransforms);
  listener.use(setupIntegrations);
}
```

```typescript TypeScript
// import statements

export default function (listener: FlatfileListener): void {
  // Distribute listener functions to different modules
  listener.use(validateCustomerData);
  listener.use(processFileUploads);
  listener.use(handleDataTransforms);
  listener.use(setupIntegrations);
}
```

This works particularly well for complex listener functions that need to be broken down into smaller, more manageable pieces. It also works well when combined with [Namespaces](/guides/namespaces-and-filters) to create isolated listener functions that only respond to Events from a specific namespace. You'll find this pattern often when using [Plugins](/core-concepts/plugins) to extend the functionality of your listener.
**Type Signature:** ```typescript listener.use(configFunction: (listener: FlatfileListener) => void): void ``` **Examples:** ```javascript JavaScript // Custom configuration function function validateCustomerData(listener) { listener.on('commit:created', { sheet: 'customers' }, async (event) => { // Validation logic here console.log('Validating customer data...'); }); listener.on('record:updated', { sheet: 'customers' }, async (event) => { // Update validation logic here console.log('Revalidating updated customer data...'); }); } // Another configuration function function setupDataProcessing(listener) { listener.on('file:created', async (event) => { console.log('Processing uploaded file...'); }); } export default function (listener) { listener.use(validateCustomerData); listener.use(setupDataProcessing); } ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; // Custom configuration function with proper typing function validateCustomerData(listener: FlatfileListener): void { listener.on('commit:created', { sheet: 'customers' }, async (event: FlatfileEvent) => { // Validation logic here console.log('Validating customer data...'); }); listener.on('record:updated', { sheet: 'customers' }, async (event: FlatfileEvent) => { // Update validation logic here console.log('Revalidating updated customer data...'); }); } // Another configuration function function setupDataProcessing(listener: FlatfileListener): void { listener.on('file:created', async (event: FlatfileEvent) => { console.log('Processing uploaded file...'); }); } export default function (listener: FlatfileListener): void { listener.use(validateCustomerData); listener.use(setupDataProcessing); } ``` ### When to Use Each Method | Method | Use For | | ---------------- | ------------------------------------------------------------------ | | `listener.use()` | Creating reusable functions, organizing complex logic into modules | | `listener.on()` | Responding to specific 
Events, implementing direct Event handlers | ## Event Routing with Namespaces & Filters There are two more Listener methods to note, specifically intended for routing events to specific Listener functions: `listener.namespace()` and `listener.filter()`. **Namespaces** provide architectural boundaries for organizing different parts of your application at the App, Space, and Workbook level: * `space:customer-portal` - Events from Spaces in the "customer-portal" App * `workbook:staging` - Events from Workbooks with "staging" namespace **Event filters** enable granular event targeting based on property values, event types, and conditions: * `{ sheet: 'contacts' }` - Events from the "contacts" Sheet * `{ domain: 'job', 'payload.job': 'space:configure' }` - Space configuration Jobs For comprehensive examples, patterns, and detailed configuration options, see our [Namespaces and Filters guide](/guides/namespaces-and-filters). **Usage Examples:** ```javascript JavaScript export default function (listener) { // Use namespaces for architectural organization listener.namespace('space:customer-portal', (customerListener) => { customerListener.on('commit:created', async (event) => { console.log('Processing customer portal data...'); }); }); // Use filters for granular targeting listener.filter({ sheet: 'contacts' }, (contactsListener) => { contactsListener.on('commit:created', async (event) => { console.log('Contact data committed'); }); }); // Combine both for maximum precision listener.namespace('space:customer-portal', (customerListener) => { customerListener.filter({ sheet: 'enterprise-customers' }, (enterpriseListener) => { enterpriseListener.on('commit:created', async (event) => { console.log('Processing enterprise customer data'); }); }); }); } ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; export default function (listener: FlatfileListener): void { // Use namespaces for architectural organization 
  listener.namespace('space:customer-portal', (customerListener: FlatfileListener) => {
    customerListener.on('commit:created', async (event: FlatfileEvent) => {
      console.log('Processing customer portal data...');
    });
  });

  // Use filters for granular targeting
  listener.filter({ sheet: 'contacts' }, (contactsListener: FlatfileListener) => {
    contactsListener.on('commit:created', async (event: FlatfileEvent) => {
      console.log('Contact data committed');
    });
  });

  // Combine both for maximum precision
  listener.namespace('space:customer-portal', (customerListener: FlatfileListener) => {
    customerListener.filter({ sheet: 'enterprise-customers' }, (enterpriseListener: FlatfileListener) => {
      enterpriseListener.on('commit:created', async (event: FlatfileEvent) => {
        console.log('Processing enterprise customer data');
      });
    });
  });
}
```

## Agents

In Flatfile, an **Agent** refers to a server-side listener bundled and deployed to Flatfile to be hosted in our secure cloud. You may deploy multiple Agents to a single Environment, each with its own configurations and codebase. But be careful, as multiple Agents can interfere with each other if not properly managed.

Terminology note: **Agents** in Flatfile should not be confused with **AI Agents**. Flatfile Agents are Event Listeners running in Flatfile's cloud infrastructure, whereas AI Agents refer to semi-autonomous Artificial Intelligence systems.

### Agent Deployment

To deploy an Agent, run the following command in your terminal from the root directory containing your listener file:

```bash
npx flatfile@latest deploy
```

The CLI will automatically examine your listener code and register the Agent to listen to only the Events your listener is configured to handle.

#### Deployment Options

**Unique Slug**: Use the `-s` flag to give your Agent a unique slug. This is useful when managing multiple Agents in the same Environment.
```bash
npx flatfile@latest deploy -s my-custom-agent
```

**Multiple Agents**: You may deploy multiple Agents to a single Environment, each with its own configurations and codebase. However, be careful, as multiple Agents can interfere with each other if not properly managed (see the next section).

If no slug is provided:

* **Single Agent**: Your Agent will be updated and given a slug of `default`
* **Multiple Agents**: The CLI will prompt you to select which Agent to update

### Managing Multiple Agents

When deploying multiple Agents to the same Environment, careful management is essential to prevent race conditions and conflicts. Multiple Agents listening to the same Event topics can interfere with each other and cause unpredictable behavior.

We recommend combining your code into a single codebase and using [Namespaces and Filters](/guides/namespaces-and-filters) to route events to scoped listeners, ideally organized into separate subdirectories. However, if you need multiple Agents, follow these strategies to avoid conflicts:

#### Conflict Prevention Strategies

1. **Use Namespaces and Filters**: Ensure each Agent listens to distinct Event patterns
2. **Segment by Domain**: Separate Agents by functional areas (e.g., data processing vs. integrations) and separate your account into different [Apps](/core-concepts/apps)
3. **Avoid Topic Overlap**: Never have multiple Agents handling the same Event topics without proper routing

### Agent Configuration

When deploying an Agent, the CLI will automatically reduce the topic scope your Agent listens to by examining your listener code and registering the Agent to listen to only the topics your listener is configured to act on. For example, if your listener code only listens to `commit:created` Events, the CLI will configure the Agent to only listen to `commit:created` Events. If your listener has a wildcard listener (`**`), the CLI will configure the Agent to listen to all Events.
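The single-codebase approach recommended above can be sketched as follows: one deployed Agent whose modules are each scoped by Namespace, so their Event topics never overlap. The module and namespace names here are hypothetical, and a tiny stand-in class replaces `FlatfileListener` so the sketch runs anywhere:

```javascript
// Minimal stand-in with just enough of FlatfileListener's surface for this sketch;
// in a real project you would receive the real listener from @flatfile/listener.
class StubListener {
  constructor(ns, routes) {
    this.ns = ns || "*";
    this.routes = routes || [];
  }
  namespace(pattern, cb) {
    cb(new StubListener(pattern, this.routes)); // scoped child listener
  }
  on(topic, handler) {
    this.routes.push(`${this.ns}::${topic}`); // record what was registered where
  }
  use(fn) {
    fn(this);
  }
}

// Hypothetical domain modules, each scoped to its own namespace
// so their Event topics can never collide.
function dataProcessing(listener) {
  listener.namespace("space:data-processing", (scoped) => {
    scoped.on("commit:created", async (event) => {
      /* transform incoming data */
    });
  });
}

function integrations(listener) {
  listener.namespace("space:integrations", (scoped) => {
    scoped.on("job:ready", async (event) => {
      /* sync with an external system */
    });
  });
}

const listener = new StubListener();
listener.use(dataProcessing);
listener.use(integrations);
console.log(listener.routes);
// ["space:data-processing::commit:created", "space:integrations::job:ready"]
```

Because each handler is reachable only through its namespace, the two domains behave like separate Agents without the risk of overlapping topic subscriptions.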
### Agent Logs When using Flatfile's secure cloud to host your listener, you can view the executions of your Agent in the "Event Logs" tab of your dashboard. Event logs are useful in monitoring and troubleshooting your listener. Each execution of your agent is recorded here, including any custom console logs you have configured in your listener code. If you are running your Agent locally using the `develop` command to test, these logs will not be recorded in your dashboard. You can still view them in your terminal. #### Failures Failures occur when an Agent fails to execute properly. That means either an uncaught error is thrown, or the execution times out. If you are catching and handling errors within your code, those executions will not be marked as failures. If you would prefer to see them marked this way, re-throw your error after handling to bubble it up to the execution handler. ```javascript JavaScript export default function (listener) { //note: listening to all Events with a wildcard can be used while testing but is not //recommended for production, as it will capture all Events and may cause performance issues listener.on("**", (event) => { try { // do something } catch (error) { // handle error throw error; // re-throw error to mark execution as failure } }); } ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; export default function (listener: FlatfileListener) { //note: listening to all Events with a wildcard can be used while testing but is not //recommended for production, as it will capture all Events and may cause performance issues listener.on("**", (event: FlatfileEvent) => { try { // do something } catch (error) { // handle error throw error; // re-throw error to mark execution as failure } }); } ``` # Overview Source: https://flatfile.com/docs/core-concepts/overview Understanding Flatfile's building blocks and architecture Our platform follows a hierarchical structure designed to provide secure, 
organized access to various resources and automation capabilities. Understanding these core components and their relationships will help you build more effective data import solutions. ## Platform Architecture Flatfile operates as a hosted platform that integrates with your application through APIs and SDKs. Your users interact with Flatfile's interface while you maintain control over the data flow and business logic through event-driven listeners and customizable workflows. ## Core Concepts Understanding Flatfile's architecture starts with these fundamental concepts, organized by their role in the platform: ### Infrastructure & Organization Isolated entities that provide secure domains for creating and testing different configurations, with separate API keys for development and production use cases. Organizational units that manage and coordinate data import workflows across different environments, serving as containers for organizing related Spaces with consistent configuration. Micro-applications with their own database, filestore, and authentication that serve as isolated workspaces for individual data import sessions or customer data management. ### Data Structure Complete data definitions that control how your data should look, behave, and connect, serving as intelligent templates that define structure, validation rules, and relationships between sheets. Standalone webpages within Spaces that provide guidance, instructions, and dynamically-generated content to help users navigate their data import process. Database-like containers configured with type-strict Blueprints that replace spreadsheet templates for data collection, allowing users to validate, correct, and import data with real-time feedback. Individual data tables within Workbooks that organize and structure imported data, similar to database tables or spreadsheet tabs, with each sheet representing a distinct data type or entity. 
Individual column definitions within sheets that specify data type, format, and validation constraints for each piece of data, similar to columns in a spreadsheet or database table.

Individual data rows that contain actual imported values conforming to field definitions, representing the data instances that flow through import workflows from upload to export.

### Automation & Processing

Form the foundation of Flatfile's automation - Events are published whenever something happens in Flatfile, and Listeners are functions you write that respond to these Events by executing custom code.

Large units of work performed asynchronously on resources like files, Workbooks, or Sheets, providing visibility into the status and progress of data processing tasks. Jobs are powered by [Events](/core-concepts/listeners#events), and are run within [Listeners](/core-concepts/listeners).

Code-based operations that run when users click buttons or menu items in Flatfile, which can be mounted on Sheets, Workbooks, Documents, or Files to trigger custom operations. Actions are powered by [Jobs](/core-concepts/jobs), and are run within [Listeners](/core-concepts/listeners).

Reusable, modular components that extend Flatfile's data import and processing capabilities, providing pre-built functionality for common tasks like data transformation and integrations.

# Plugins

Source: https://flatfile.com/docs/core-concepts/plugins

Reusable functionality modules that extend Flatfile's capabilities

Plugins in Flatfile are reusable, modular components that extend the platform's data import and processing capabilities. They provide pre-built functionality for common data import/extraction, transformation, and processing tasks, integrations, and workflows, enabling rapid development of sophisticated import solutions.

## What are Plugins?
A Plugin is a self-contained module that: * **Extends Functionality** - Adds new capabilities to your import workflows * **Encapsulates Logic** - Packages complex operations into reusable components * **Provides Integration** - Connects to external systems and services * **Enables Customization** - Allows configuration for specific use cases Plugins can be used by [Listeners](/core-concepts/listeners) to perform specialized tasks without writing custom code. You can also create your own plugins to extend Flatfile's capabilities. For detailed guides on our available plugins, see the [Plugins](/plugins) section. # Records Source: https://flatfile.com/docs/core-concepts/records Individual data rows that represent your imported data in Flatfile Records in Flatfile represent individual rows of data within your import workflows. They are instances of data that conform to the [field](/core-concepts/fields) definitions in your [blueprints](/core-concepts/blueprints), containing the actual values that users import, validate, and transform. ## What are Records? A Record is a data instance that: * **Contains Data Values** - Stores the actual imported data for each [field](/core-concepts/fields) * **Maintains State** - Tracks validation status, errors, and processing history * **Supports Transformation** - Allows data modification and enrichment Records are the fundamental data units that flow through your import pipelines, from initial upload through final export. ## Record Structure ### Basic Record Anatomy This is an example JSON response from the Flatfile API. 
For working with Records in your language, see our [API Reference](/api-reference) documentation ```json { "data": { "records": [ { "id": "us_rc_YOUR_ID", "values": { "firstName": { "messages": [], "updatedAt": "2023-11-20T16:59:40.286Z", "valid": true, "value": "John" }, "lastName": { "messages": [], "updatedAt": "2023-11-20T16:59:40.286Z", "valid": true, "value": "Smith" }, "email": { "messages": [], "updatedAt": "2023-11-20T16:59:40.286Z", "valid": true, "value": "john.smith@example.com" } }, "valid": true, "metadata": {}, "config": {} } ], "success": true, "commitId": "us_vr_YOUR_ID", "counts": { "total": 1000, "valid": 1000, "error": 0 }, "versionId": "us_vr_YOUR_ID" } } ``` # Sheets Source: https://flatfile.com/docs/core-concepts/sheets Individual data tables within Workbooks that organize and structure imported data ## What are Sheets? Sheets are individual data tables within [Workbooks](/core-concepts/workbooks) that organize and structure imported data. Each Sheet represents a distinct data type or entity, similar to tables in a database or tabs in a spreadsheet. Sheets serve as containers for [Records](/core-concepts/records) and are defined by [Blueprints](/core-concepts/blueprints) that specify their structure, validation rules, and data types. They provide the fundamental building blocks for organizing data within the Flatfile platform. 
## Basic Blueprint Structure

* A [Blueprint](/core-concepts/blueprints) defines the data structure for any number of [Spaces](/core-concepts/spaces)
* A [Space](/core-concepts/spaces) may contain many [Workbooks](/core-concepts/workbooks) and many [Documents](/core-concepts/documents)
* A [Document](/core-concepts/documents) contains static documentation and may contain many [Document-level Actions](/guides/using-actions#document-actions)
* A [Workbook](/core-concepts/workbooks) may contain many [Sheets](/core-concepts/sheets) and many [Workbook-level Actions](/guides/using-actions#workbook-actions)
* A [Sheet](/core-concepts/sheets) may contain many [Fields](/core-concepts/fields) and many [Sheet-level Actions](/guides/using-actions#sheet-actions)
* A [Field](/core-concepts/fields) defines a single column of data, and may contain many [Field-level Actions](/guides/using-actions#field-actions)

## Basic Sheet Definition

The following examples demonstrate the configuration of isolated Sheets, which are intended to be used in the context of a [Workbook](/core-concepts/workbooks) configuration.

### Single-Sheet Configuration

This example configures a single [Sheet](/core-concepts/sheets) containing three [Fields](/core-concepts/fields) and one [Action](/core-concepts/actions), and defines [access controls](#sheet-level-access).
```javascript
const customerSheet = {
  name: "Customers",
  slug: "customers",
  fields: [
    {
      key: "firstName",
      type: "string",
      label: "First Name",
      constraints: [{ type: "required" }],
    },
    {
      key: "email",
      type: "string",
      label: "Email Address",
      constraints: [{ type: "required" }, { type: "unique" }],
    },
    {
      key: "company",
      type: "string",
      label: "Company Name",
    },
  ],
  access: ["add", "edit", "delete"],
  actions: [
    {
      operation: "validate-inventory",
      mode: "background",
      label: "Validate Inventory",
      description: "Check product availability against inventory system",
    },
  ],
};
```

## Sheet level access

With `access` you can control Sheet-level access for users.

| Access Level      | Description                                                                                     |
| ----------------- | ----------------------------------------------------------------------------------------------- |
| `"*"` *(default)* | A user can use all access actions to this Sheet                                                 |
| `"add"`           | A user can add record(s) to the Sheet                                                           |
| `"delete"`        | A user can delete record(s) from the Sheet                                                      |
| `"edit"`          | A user can edit records (field values) in the Sheet                                             |
| `"import"`        | A user can import CSVs to this Sheet                                                            |
| `[]`              | If no parameters are specified in the access array, sheet-level readOnly access will be applied |

If you use `"*"` access control, users will gain new functionalities as we expand access controls. Use an exhaustive list today to block future functionality from being added automatically.

```javascript
{
  "sheets": [
    {
      "name": "Contacts",
      "slug": "contacts",
      "access": ["add", "edit"]
      // Define fields
    }
  ]
}
```

## Sheet Constraints

Sheet constraints apply validation rules across multiple fields or entire sheets. These constraints ensure data integrity at the sheet level and work in conjunction with field-level constraints.

### Composite Uniqueness

Ensures that combinations of multiple field values are unique across all records in the sheet. This is useful when individual fields can have duplicate values, but their combination should be unique.
```javascript { name: "Customers", slug: "customers", constraints: [ { name: "unique-customer-location", type: "unique", fields: ["customerId", "locationId"], requiredFields: ["customerId"], // Only enforce when customerId has a value strategy: "concat", config: { message: "Customers must be unique per location", level: "error" } } ], fields: [ // field definitions ] } ``` #### Configuration Properties | Property | Type | Required | Description | | ---------------- | --------- | -------- | ------------------------------------------------------------------------------------ | | `name` | string | ✓ | Unique identifier for the constraint | | `type` | string | ✓ | Must be `"unique"` for composite uniqueness constraints | | `fields` | string\[] | ✓ | Array of field names that must be unique together | | `requiredFields` | string\[] | | Subset of `fields` that when empty, disables constraint validation | | `strategy` | string | ✓ | Either `"concat"` or `"hash"` - determines how uniqueness is calculated | | `config` | object | | Configuration options for the validation message and level when the constraint fails | **Strategy Options:** | Strategy | Description | Best Used When | | -------- | ------------------------------------------------------ | ---------------------------------------------------------- | | `concat` | Concatenates field values to determine uniqueness | Simple field types, debugging needed, performance critical | | `hash` | Uses SHA1 hash of combined field values for uniqueness | Complex values, collision avoidance, data privacy | **Config Options:** | Key | Type | Description | | --------- | ------ | --------------------------------------------------------------------------------------------------------------------- | | `message` | string | The message shown to the user if this constraint is violated. Defaults to "Composite \ is not unique | | `level` | string | The error level when this constraint is violated. 
Must be one of `"error"` (default), `"warn"`, or `"info"`. |

#### Choosing the Right Strategy

The `strategy` property determines how uniqueness is calculated. You can choose to simply concatenate the field values as a single string or use a SHA1 hash function to create a unique identifier. Consider the following when choosing a strategy:

**Use `concat` when:**

* You aren't concerned about concatenation collisions (see example below)
* Performance is critical (string concatenation is faster than SHA1)
* You have short/consistent value sizes

**Use `hash` when:**

* You want to avoid concatenation collisions
* You aren't concerned about the performance cost (SHA1 calculation is slower than string concatenation)
* You have long/inconsistent value sizes (SHA1 hashes are always 20 bytes)

**Concatenation Collision Example:**

In this example, the `concat` strategy would consider the following records to be duplicates:

```json
{ "firstName": "John", "lastName": "Smith" }
```

```json
{ "firstName": "JohnS", "lastName": "mith" }
```

Constraint configuration with `concat` strategy:

```javascript
{
  // sheet configuration
  constraints: [
    {
      name: "unique-name-combination",
      type: "unique",
      fields: ["firstName", "lastName"],
      strategy: "concat"
    }
  ]
}
```

This is because both records have the same concatenated value:

```javascript
"John" + "Smith" = "JohnSmith"
"JohnS" + "mith" = "JohnSmith"
```

But the `hash` strategy would prevent this, because the hash function creates a unique identifier based on each field's individual value rather than a simple concatenation.
```javascript hash(["John", "Smith"]) = "9e03c21f9beff9d943843c1b0623848fe63e2beb" hash(["JohnS", "mith"]) = "2fd09d2f5e62d980988094b43640966a3bffbde9" ``` Constraint configuration with `hash` strategy: ```javascript { // sheet configuration constraints: [ { name: "unique-name-combination", type: "unique", fields: ["firstName", "lastName"], strategy: "hash" // Prevents collision } ] } ``` #### Conditional Validation with Required Fields The `requiredFields` property enables conditional uniqueness validation. When any field specified in `requiredFields` is empty (null, undefined, or empty string), the entire constraint is ignored for that record. **Use Cases:** * **Partial data imports** - Allow incomplete records during staged import processes * **Optional relationships** - Handle cases where some composite key fields are optional * **Data migration** - Gradually enforce constraints as required fields get populated * **Conditional business rules** - Only enforce uniqueness when critical fields have values **Important Notes:** * `requiredFields` should contain only fields that exist in the `fields` array * If ANY required field is empty, the constraint is completely ignored * Empty fields are: `null`, `undefined`, or empty strings (`""`) * If `requiredFields` is omitted, the constraint always applies #### Example: Customer Registration System Consider a customer registration system where you want unique combinations of email and company, but only when `email` is provided: ```javascript { name: "unique-customer-profile", type: "unique", fields: ["email", "company"], requiredFields: ["email"], // Only enforce when email exists strategy: "hash" } ``` **Data Behavior:** | Email | Company | Validation Result | Reason | | ----------------- | ------------- | ----------------- | ------------------------------------ | | `"john@acme.com"` | `"Acme Corp"` | ✅ Enforced | Email provided | | `"jane@acme.com"` | `"Acme Corp"` | ❌ Duplicate | Same email+company combination | | `""` 
| `"Acme Corp"` | ⏭️ Ignored | Email empty (required field) | | `null` | `"Beta Inc"` | ⏭️ Ignored | Email null (required field) | | `"bob@beta.com"` | `""` | ✅ Enforced | Email provided, company can be empty | **Without `requiredFields`:** All records would be validated, potentially causing errors during partial data imports. #### Example: Multiple Required Fields For more complex scenarios, you can specify multiple required fields: ```javascript { name: "unique-order-item", type: "unique", fields: ["orderId", "productId", "customerId"], requiredFields: ["orderId", "productId"], // Both must have values strategy: "hash" } ``` In this case, the constraint only applies when **both** `orderId` and `productId` have non-empty values. If either is empty, the entire constraint is ignored. Individual field constraints like required and unique are covered in [Field Constraints](/core-concepts/fields#field-constraints). ## Collections Collections provide a way to organize Sheets within a Workbook into named groupings in the Flatfile UI. This helps to visually organize complex Workbooks with many Sheets, making it easier to navigate and understand the data structure. Collections have no functional impact on the data itself; their only purpose is to help you organize your Sheets visually. ### Configuring Sheets with Collections Assigning a Collection to a Sheet is as simple as adding a `collection` property to your Sheet [Blueprint](/core-concepts/blueprints) when [configuring your Space](/core-concepts/spaces#space-configuration) (or otherwise creating or updating a Sheet via the API). If you add the same Collection to multiple Sheets, they will be grouped together in the UI. The following example depicts a Blueprint defining a single Workbook with three Sheets organized into two Collections: **Source Data** with one Sheet, and **Processed** with two Sheets. 
Workbooks also have a similar feature called [Folders](/core-concepts/workbooks#folders), which you can use to group associated Workbooks. You can think of **Folders** and **Collections** like a filing system: * [Folders](/core-concepts/workbooks#folders) help you organize your Workbooks within a Space (like organizing binders on a shelf) * **Collections** help you organize Sheets within each Workbook (like organizing tabs within a binder). ```javascript const workbook = { name: "Data Processing Pipeline", sheets: [ { name: "Raw Customer Data", collection: "Source Data", // Assign to "Source Data" collection (only Sheet) slug: "raw_customers", fields: [ { key: "name", type: "string", label: "Customer Name" }, { key: "email", type: "string", label: "Email Address" } ] }, { name: "Processed Customers", collection: "Processed", // Assign to "Processed" collection (first Sheet) slug: "clean_customers", fields: [ { key: "name", type: "string", label: "Full Name" }, { key: "email", type: "string", label: "Email" }, { key: "validation_status", type: "string", label: "Status" } ] }, { name: "Customer Reports", collection: "Processed", // Assign to "Processed" collection (second Sheet) slug: "customer_reports", fields: [ { key: "metric", type: "string", label: "Metric Name" }, { key: "value", type: "number", label: "Value" } ] } ] }; ``` ## Sheet Treatments Sheets have an optional `treatments` parameter which takes an array of treatments for your Sheet. Treatments can be used to categorize your Sheet and control its behavior. Certain treatments will cause your Sheet to look or behave differently. ### Reference sheets Giving your Sheet a treatment of `"ENUM_REFERENCE"` will mark it as reference data for other sheets. Reference sheets are currently hidden from view, allowing you to generate a number of reference values without adding visual distraction for the user. 
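As a sketch of the consuming side, another Sheet can point an enum-like drop-down at hidden reference data using a reference field. The slugs below are illustrative, and the exact `config` keys should be checked against the [Fields](/core-concepts/fields) documentation for your version:

```javascript
// A field on another Sheet that draws its options from a hidden
// "countries" reference Sheet (slugs and keys here are illustrative).
const countryField = {
  key: "country",
  type: "reference",
  label: "Country",
  config: {
    ref: "countries",        // slug of the reference Sheet
    key: "code",             // field on the reference Sheet to link against
    relationship: "has-one", // each record points at one reference row
  },
};

console.log(countryField.config.ref); // "countries"
```

To the end user, this renders as a drop-down populated from the reference Sheet, even though that Sheet itself stays hidden from view.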
**Dynamic Enums** This feature, along with the [Reference Field Filtering](/core-concepts/fields#reference-field-filtering) feature, may be collectively referred to as **Dynamic Enums**. By combining these two features, you can create a drop-down list for any cell in your sheet that's dynamically controlled by the value of another field in the same record – and to the end-user, it will just work like a dynamically-configured `enum` field.

Please note that this feature is not intended for situations where PII or other sensitive data must be hidden from view of the user - for situations like that, reach out to support or your CSM for best practices.

```javascript
const referenceSheet = {
  name: "Countries",
  slug: "countries",
  treatments: ["ENUM_REFERENCE"],
  fields: [
    { key: "code", type: "string", label: "Country Code" },
    { key: "name", type: "string", label: "Country Name" }
  ]
};
```

Currently, `"ENUM_REFERENCE"` is the only treatment that changes the behavior of your Sheet.

## Sheet Options

Configurable properties for a Sheet that control its behavior and appearance:

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| **name** | string | ✓ | The name of your Sheet as it will appear to your end users |
| **description** | string | | A sentence or two describing the purpose of your Sheet |
| **slug** | string | | A unique identifier for your Sheet. Used to reference your Sheet in code, for example in a Record Hook |
| **readonly** | boolean | | A boolean specifying whether or not this sheet is read only. Read only sheets are not editable by end users |
| **allowAdditionalFields** | boolean | | When set to `true`, your Sheet will accept additional fields beyond what you specify in its configuration. These additional fields can be added via API, or by end users during the file import process |
| **access** | array | | An array specifying the access controls for this Sheet. Valid values: `"*"`, `"add"`, `"edit"`, `"delete"`, `"import"`. [Read more about access controls](#sheet-level-access) |
| **fields** | array | ✓ | This is where you define your Sheet's data schema. The collection of fields in your Sheet determines the shape of data you wish to accept (minimum: 1, maximum: 1000) |
| **actions** | array | | An array of actions that end users can perform on this Sheet. [Read more about actions](/core-concepts/actions) |
| **constraints** | array | | An array of sheet-level validation constraints that apply across multiple fields or entire sheets. [Read more about sheet constraints](#sheet-constraints) |
| **metadata** | object | | Use `metadata` to store any extra contextual information about your Sheet. Must be valid JSON |
| **mappingConfidenceThreshold** | number | | Configure the minimum required confidence for mapping jobs targeting that Sheet (default: 0.5, range: 0-1) |

# Spaces

Source: https://flatfile.com/docs/core-concepts/spaces

Micro-applications for content and data storage

Flatfile Spaces are micro-applications, each having their own database, filestore, and auth. Use Spaces to integrate Flatfile into your data exchange workflow, whether that happens directly in your application or as part of an offline process.

You can think of a Space as a home for any one of your Customers' data, or as a place for a discrete data migration session; you can create a new Space any time you need an isolated place to migrate new data.

Terminology note: **Spaces** are sometimes referred to as **Projects** in Flatfile conversations - this is because we often consider a Space to be a single import or series of imports from a single end-customer, or, a "Project".
## Anatomy

The following example depicts a **Space** with:

* 1 [Document](/core-concepts/documents) named **Customers**
* 2 [Workbooks](/core-concepts/workbooks):
  * **Customers Workbook**
    * 3 [Sheets](/core-concepts/sheets) named **Customers**, **Orders**, and **Products**
  * **Company Workbook**
    * 3 [Sheets](/core-concepts/sheets) named **Users**, **Departments**, and **Projects**
    * 2 [Workbook Actions](/core-concepts/actions) labeled **Download** and **Submit**

The current view has the **Users** sheet under the **Company Workbook** selected.

## Basic Blueprint Structure

* A [Blueprint](/core-concepts/blueprints) defines the data structure for any number of [Spaces](/core-concepts/spaces)
* A [Space](/core-concepts/spaces) may contain many [Workbooks](/core-concepts/workbooks) and many [Documents](/core-concepts/documents)
* A [Document](/core-concepts/documents) contains static documentation and may contain many [Document-level Actions](/guides/using-actions#document-actions)
* A [Workbook](/core-concepts/workbooks) may contain many [Sheets](/core-concepts/sheets) and many [Workbook-level Actions](/guides/using-actions#workbook-actions)
* A [Sheet](/core-concepts/sheets) may contain many [Fields](/core-concepts/fields) and many [Sheet-level Actions](/guides/using-actions#sheet-actions)
* A [Field](/core-concepts/fields) defines a single column of data, and may contain many [Field-level Actions](/guides/using-actions#field-actions)

### Blueprints and Space Configuration

Every Space has exactly one [Blueprint](/core-concepts/blueprints) that defines its data structure. When you configure a Space by creating workbooks, sheets, and fields, you're defining the Space's Blueprint - the schema that controls how data should look, behave, and connect within that Space.
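As a rough sketch, this hierarchy can be written as one nested configuration object. Everything below (the workbook, sheet, and field names, and the shape of the `documents` entry) is illustrative, not a required structure:

```javascript
// Sketch: a Blueprint's hierarchy as a plain object.
// Space -> Workbooks -> Sheets -> Fields, plus Documents alongside.
const spaceBlueprint = {
  workbooks: [
    {
      name: "Customer Data", // a Workbook (container of Sheets)
      sheets: [
        {
          name: "Contacts", // a Sheet (one table of records)
          slug: "contacts",
          fields: [
            // Fields: one per column of data
            { key: "name", type: "string", label: "Name" },
            { key: "email", type: "string", label: "Email" },
          ],
        },
      ],
    },
  ],
  documents: [{ title: "Welcome", body: "Getting started notes." }],
};
```

This shape mirrors the bullet list above: each level of nesting is one "may contain many" relationship.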
The Blueprint is established during the `space:configure` [Job](/core-concepts/jobs) and remains consistent throughout the Space's lifecycle (though you can change it via the [API](https://reference.flatfile.com/api-reference/workbooks/update) at any time). This ensures all data imported into the Space follows the same structure and validation rules you've defined.

By default, the same Blueprint configuration is applied to all Spaces unless you [namespace your listeners](/guides/namespaces-and-filters) to specific scopes. This makes it easy to maintain consistent data structures across multiple Spaces for similar use cases or different customers with the same data requirements.

## Creating Spaces

Spaces can be created manually via your [Flatfile Dashboard](https://platform.flatfile.com/dashboard), or — quite commonly — programmatically via the [Flatfile API](https://reference.flatfile.com/api-reference/spaces/create). Here are some example workflows for creating Spaces:

### Backend Automation

When integrating Flatfile into your application workflows:

* **CRM Integration**: When a customer in your CRM moves to the "onboarding" stage, you may create a new Space via the API for them to complete their data onboarding process.
* **Account Creation**: When a new account is created in your application, you may automatically create a new Space for them via the API.

```mermaid
graph TD
  A[CRM Customer] -->|Moves to onboarding stage| B[Create Space via API]
  C[New Account Created] -->|In your application| D[Create Space via API]
  B --> E[Customer completes data onboarding]
  D --> F[Account setup and data import]
```

### User-Initiated Creation

For direct customer collaboration and embedded applications:

* **Direct Collaboration**: When working with a new customer directly, you may create a new Space via the Dashboard and invite them to the Space.
* **Embedding Flatfile**: In an [Embedded App](/embedding/overview), you may create a new Space every time a user clicks a button to start a new import.

```mermaid
graph TD
  A[Working with Customer] -->|Direct collaboration| B[Create Space via Dashboard]
  C[Embedded App User] -->|Clicks import button| D[Create Space via API]
  B --> E[Invite customer to Space]
  D --> F[User starts new import session]
```

You may also rename "Space" to any other term you prefer, such as "Project" or "Customer". Each App may have a different name for a Space, but they all have the same underlying structure.

### Space Namespaces

Spaces can be assigned [namespaces](/guides/namespaces-and-filters) to control which [listeners](/core-concepts/listeners) receive events from that space. This enables isolation between different parts of your application or different customers. See the [Namespaces and Filters guide](/guides/namespaces-and-filters) for detailed examples and patterns.

## Space Configuration

For complete theming capabilities, see our comprehensive [Theme Your Space guide](/guides/theme-your-space).

If you're using [Autobuild](/getting-started/quickstart/autobuild), space configuration is handled automatically via its own deployed agent. If you're configuring Flatfile in code, you can configure the Space via a [Listener](/core-concepts/listeners).

When a new space is created, Flatfile automatically creates a `space:configure` [Job](/core-concepts/jobs) for you to configure the Space. In your job listener, you can create workbooks, sheets, and actions, as well as configure the Space's theme and metadata. This workbook and sheet configuration defines your Space's [Blueprint](/core-concepts/blueprints) - the data schema that controls how your data should look, behave, and connect.
In this example, we'll show the full Job Listener lifecycle implementation, complete with `ack` to acknowledge the job, `update` to update the job's progress, and `complete` or `fail` to complete or fail the job. As in other places throughout the documentation, you *could* use the [Job Handler](/plugins/job-handler) plugin to handle the job lifecycle for you. This may be useful if you have a particularly complex space configuration with custom logic, but don't need complete control over the job's entire lifecycle.

However, for most implementations, we recommend using the [Space Configure](/plugins/space-configure) plugin. This plugin takes care of even more of the heavy lifting for you; not only does it handle the Job lifecycle, but it also takes care of all of the API calls necessary to configure the Space and create its Workbooks and Documents. With this plugin, you can configure your entire space with a single configuration object rather than performing any API calls.

For example, this listener implementation configures a space with a [Blueprint](/core-concepts/blueprints) containing two [workbooks](/core-concepts/workbooks) and a Welcome Guide [document](/core-concepts/documents) – both defined in another file – as well as adding some light [theming](/guides/theme-your-space):

```javascript
listener.use(
  configureSpace({
    workbooks: [companyWorkbook, customerWorkbook],
    documents: [welcomeGuideDocument],
    space: {
      metadata: {
        theme: {
          root: { primaryColor: "#4F46E5", actionColor: "#000000" },
          sidebar: { logo: "www.example.com/logo.png" },
        }
      }
    }
  })
);
```

Here's how to manually configure a Space using the job lifecycle approach. This example demonstrates creating a [Blueprint](/core-concepts/blueprints) by defining workbooks with sheets and fields, custom actions, a welcome document, and theme customization, with complete control over the configuration process and proper job lifecycle management.
```javascript JavaScript
import { FlatfileListener } from "@flatfile/listener";
import api from "@flatfile/api";

export default function (listener) {
  // Listen for the space:configure job
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    // Get the job ID and space ID from the event context
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, {
        info: "Setting up customer data workspace",
        progress: 10
      });

      // Create the workbook with a sheet and an action
      const { data: workbook } = await api.workbooks.create({
        spaceId,
        name: "Customer Data",
        sheets: [
          {
            name: "Contacts",
            slug: "contacts",
            fields: [
              { key: "firstName", type: "string", label: "First Name" },
              { key: "lastName", type: "string", label: "Last Name" },
              { key: "email", type: "string", label: "Email Address" },
              { key: "phone", type: "string", label: "Phone Number" }
            ]
          }
        ],
        actions: [
          {
            operation: "submit",
            label: "Submit Data",
            mode: "foreground",
            primary: true,
            description: "Submit customer data for processing",
            constraints: [{ type: "hasData" }]
          }
        ]
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Customer workbook created",
        progress: 50
      });

      // Create a welcome document
      await api.documents.create(spaceId, {
        title: "Welcome Guide",
        body: `

Welcome to Customer Data Import

This Space helps you import and validate customer contact information.

Getting Started

  1. Upload your CSV file or enter data manually
  2. Review and correct any validation errors
  3. Click "Submit Data" when ready
`
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Welcome document added",
        progress: 75
      });

      // Configure the space theme
      await api.spaces.update(spaceId, {
        metadata: {
          theme: {
            root: {
              primaryColor: "#4F46E5",
              borderColor: "#E5E7EB",
              fontFamily: "Inter, system-ui, sans-serif"
            },
            sidebar: {
              backgroundColor: "#1F2937",
              textColor: "#F9FAFB",
              titleColor: "#4F46E5"
            }
          }
        }
      });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Customer data workspace configured successfully",
          acknowledge: true
        }
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure space: ${error.message}`,
          acknowledge: true
        }
      });
    }
  });
}
```

```typescript TypeScript
import type { FlatfileListener } from "@flatfile/listener";
import api from "@flatfile/api";

export default function (listener: FlatfileListener) {
  // Listen for the space:configure job
  listener.on("job:ready", { job: "space:configure" }, async (event) => {
    // Get the job ID and space ID from the event context
    const { jobId, spaceId } = event.context;

    try {
      // Acknowledge the job
      await api.jobs.ack(jobId, {
        info: "Setting up customer data workspace",
        progress: 10
      });

      // Create the workbook with a sheet and an action
      const { data: workbook } = await api.workbooks.create({
        spaceId,
        name: "Customer Data",
        sheets: [
          {
            name: "Contacts",
            slug: "contacts",
            fields: [
              { key: "firstName", type: "string", label: "First Name" },
              { key: "lastName", type: "string", label: "Last Name" },
              { key: "email", type: "string", label: "Email Address" },
              { key: "phone", type: "string", label: "Phone Number" }
            ]
          }
        ],
        actions: [
          {
            operation: "submit",
            label: "Submit Data",
            mode: "foreground",
            primary: true,
            description: "Submit customer data for processing",
            constraints: [{ type: "hasData" }]
          }
        ]
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Customer workbook created",
        progress: 50
      });

      // Create a welcome document
      await api.documents.create(spaceId, {
        title: "Welcome Guide",
        body: `

Welcome to Customer Data Import

This Space helps you import and validate customer contact information.

Getting Started

  1. Upload your CSV file or enter data manually
  2. Review and correct any validation errors
  3. Click "Submit Data" when ready
`
      });

      // Update progress
      await api.jobs.update(jobId, {
        info: "Welcome document added",
        progress: 75
      });

      // Configure the space theme
      await api.spaces.update(spaceId, {
        metadata: {
          theme: {
            root: {
              primaryColor: "#4F46E5",
              borderColor: "#E5E7EB",
              fontFamily: "Inter, system-ui, sans-serif"
            },
            sidebar: {
              backgroundColor: "#1F2937",
              textColor: "#F9FAFB",
              titleColor: "#4F46E5"
            }
          }
        }
      });

      // Complete the job
      await api.jobs.complete(jobId, {
        outcome: {
          message: "Customer data workspace configured successfully",
          acknowledge: true
        }
      });
    } catch (error) {
      console.error("Error configuring space:", error);

      // Fail the job if something goes wrong
      await api.jobs.fail(jobId, {
        outcome: {
          message: `Failed to configure space: ${error.message}`,
          acknowledge: true
        }
      });
    }
  });
}
```
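For comparison, the Job Handler plugin mentioned earlier can manage the `ack`/`update`/`complete`/`fail` calls for you. The sketch below assumes the `@flatfile/plugin-job-handler` package's `jobHandler(job, handler)` helper and its progress `tick` callback; verify both against the plugin's own documentation before relying on them:

```javascript
import { jobHandler } from "@flatfile/plugin-job-handler";
import api from "@flatfile/api";

export default function (listener) {
  listener.use(
    jobHandler("space:configure", async (event, tick) => {
      const { spaceId } = event.context;

      // Report progress; the plugin handles ack/update calls
      await tick(10, "Setting up customer data workspace");

      // Same workbook creation call as in the manual example above
      await api.workbooks.create({
        spaceId,
        name: "Customer Data",
        sheets: [
          {
            name: "Contacts",
            slug: "contacts",
            fields: [{ key: "email", type: "string", label: "Email Address" }]
          }
        ]
      });

      // Returning an outcome completes the job; a thrown error fails it
      return {
        outcome: { message: "Workspace configured successfully", acknowledge: true }
      };
    })
  );
}
```

With the plugin translating thrown errors into job failures, the `try`/`catch` boilerplate from the manual example largely disappears.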
# Workbooks

Source: https://flatfile.com/docs/core-concepts/workbooks

Database-like containers with type-strict Blueprints for data import

## What are Workbooks?

Workbooks are analogous to a database, and like a database, they are configured with a type-strict [Blueprint](/core-concepts/blueprints). A Workbook replaces the spreadsheet template you may share with your users today when requesting data during the data collection phase of customer onboarding or other file-based data exchange processes. Unlike a spreadsheet, Workbooks are designed to allow your team and users to validate, correct, and import data with real-time feedback.

Workbooks are effectively *containers* for related [Sheets](/core-concepts/sheets). Inside any Workbook, you can define multiple Sheets that can [reference](/core-concepts/fields#reference) each other's data.

## Basic Blueprint Structure

* A [Blueprint](/core-concepts/blueprints) defines the data structure for any number of [Spaces](/core-concepts/spaces)
* A [Space](/core-concepts/spaces) may contain many [Workbooks](/core-concepts/workbooks) and many [Documents](/core-concepts/documents)
* A [Document](/core-concepts/documents) contains static documentation and may contain many [Document-level Actions](/guides/using-actions#document-actions)
* A [Workbook](/core-concepts/workbooks) may contain many [Sheets](/core-concepts/sheets) and many [Workbook-level Actions](/guides/using-actions#workbook-actions)
* A [Sheet](/core-concepts/sheets) may contain many [Fields](/core-concepts/fields) and many [Sheet-level Actions](/guides/using-actions#sheet-actions)
* A [Field](/core-concepts/fields) defines a single column of data, and may contain many [Field-level Actions](/guides/using-actions#field-actions)

## Example Workbook Configuration

The following examples demonstrate the configuration of isolated Workbooks, which are intended to be used in the context of a [Blueprint](/core-concepts/blueprints) configuration.
### Single-Sheet Workbook Configuration

This example configures a single [Workbook](/core-concepts/workbooks) with a single [Sheet](/core-concepts/sheets) containing two [Fields](/core-concepts/fields).

```javascript
const customerWorkbook = {
  name: "Customer Import",
  description: "Import and validate customer data",
  sheets: [
    {
      name: "customers",
      slug: "customers",
      fields: [
        {
          key: "name",
          type: "string",
          label: "Full Name",
          constraints: [{ type: "required" }],
        },
        {
          key: "email",
          type: "string",
          label: "Email Address",
          constraints: [{ type: "required" }, { type: "unique" }],
        },
      ],
    },
  ],
};
```

### Multi-Sheet Workbook Configuration

This example configures a [Workbook](/core-concepts/workbooks) with three [Sheets](/core-concepts/sheets) containing several [Fields](/core-concepts/fields) each.

```javascript
const ecommerceWorkbook = {
  name: "E-commerce Data Import",
  description: "Import customers, products, and orders",
  sheets: [
    {
      name: "customers",
      slug: "customers",
      fields: [
        { key: "customerId", type: "string", label: "Customer ID" },
        { key: "email", type: "string", label: "Email" },
        { key: "firstName", type: "string", label: "First Name" },
        { key: "lastName", type: "string", label: "Last Name" },
      ],
    },
    {
      name: "products",
      slug: "products",
      fields: [
        { key: "productId", type: "string", label: "Product ID" },
        { key: "name", type: "string", label: "Product Name" },
        { key: "price", type: "number", label: "Price" },
        { key: "category", type: "string", label: "Category" },
      ],
    },
    {
      name: "orders",
      slug: "orders",
      fields: [
        { key: "orderId", type: "string", label: "Order ID" },
        {
          key: "customerId",
          type: "reference",
          label: "Customer",
          config: { ref: "customers", key: "customerId" },
        },
        {
          key: "productId",
          type: "reference",
          label: "Product",
          config: { ref: "products", key: "productId" },
        },
        { key: "quantity", type: "number", label: "Quantity" },
        { key: "orderDate", type: "date", label: "Order Date" },
      ],
    },
  ],
};
```

### Actions and Workflows

This example configures a [Workbook](/core-concepts/workbooks) with three [Actions](/core-concepts/actions) that can be used to validate and enrich the data in the workbook.

**A note about Actions:** Actions also require a listener to respond to the event published by clicking on them. For more, see [Using Actions](/guides/using-actions)

```javascript
const workbookWithActions = {
  name: "Advanced Customer Import",
  sheets: [customerSheet, productsSheet],
  actions: [
    {
      operation: "validate-all",
      mode: "foreground",
      label: "Validate All Data",
      description: "Run comprehensive validation on all sheets",
      primary: true,
      constraints: [{ type: "hasData" }],
    },
    {
      operation: "enrich-data",
      mode: "background",
      label: "Enrich Customer Data",
      description: "Add company information and social profiles",
      constraints: [{ type: "hasData" }],
    },
    {
      operation: "export-to-crm",
      mode: "foreground",
      label: "Export to CRM",
      description: "Send validated customers to Salesforce",
      constraints: [{ type: "hasAllValid" }],
    },
  ],
};
```

### Workbook Metadata

For comprehensive metadata usage patterns, see our [metadata guide](/guides/utilizing-metadata).

Add contextual information:

```javascript
const metadataWorkbook = {
  name: "Q1 2024 Customer Import",
  description: "Quarterly customer data migration for Q1 2024",
  metadata: {
    version: "2.1",
    department: "Sales",
    projectId: "PROJ-2024-001",
  },
  sheets: [customerSheet],
};
```

### Workbook Namespaces

Workbooks can be assigned [namespaces](/guides/namespaces-and-filters) to enable granular event filtering and isolation within the same space:

```javascript
const workbook = {
  name: 'Employee Data Processing',
  namespace: 'staging', // Simple string namespace
  sheets: [/* your sheets */]
};
```

Namespaces allow you to apply different [listeners](/core-concepts/listeners) and processing logic to different types of workbooks within the same space. For detailed examples and patterns, see the [Namespaces and Filters guide](/guides/namespaces-and-filters).
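On the listener side, scoping to a workbook namespace might look like the sketch below. It assumes the `FlatfileListener.namespace()` filter and a `"workbook:staging"` namespace string matching the `namespace: 'staging'` workbook above; check the Namespaces and Filters guide for the exact filter syntax your SDK version supports:

```javascript
import { FlatfileListener } from "@flatfile/listener";

export default function (listener) {
  // Only events originating from workbooks in the "staging"
  // namespace reach this scoped listener.
  listener.namespace(["workbook:staging"], (staging) => {
    staging.on("commit:created", async (event) => {
      // Staging-only processing logic goes here
      console.log("Staging commit in workbook:", event.context.workbookId);
    });
  });
}
```

A second `listener.namespace()` call with a different namespace lets you run entirely separate logic for production workbooks in the same deployment.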
## Folders

Folders provide a simple way to organize Workbooks within a Space into named groupings in the Flatfile UI. Similar to organizing files in folders on your computer, you can group related Workbooks together for better organization and navigation. Folders have no functional impact on the data itself; their only purpose is to help you organize your Workbooks visually.

### Configuring Workbooks with Folders

Assigning a Folder to a Workbook is as simple as adding a `folder` property to your Workbook [Blueprint](/core-concepts/blueprints) when [configuring your Space](/core-concepts/spaces#space-configuration) (or otherwise creating or updating a Workbook via the API). If you add the same folder to multiple Workbooks, they will be grouped together in the UI.

The following example defines four Workbooks (**Marketing Data**, **Sales Report**, **Customer Onboarding**, and **Support Tickets**) split evenly between two folders (**Analytics** and **Customer Data**).

Sheets also have a similar feature called [Collections](/core-concepts/sheets#collections), which you can use to group associated Sheets within a Workbook. You can think of **Folders** and **Collections** like a filing system:

* **Folders** help you organize your Workbooks within a Space (like organizing binders on a shelf)
* [Collections](/core-concepts/sheets#organizing-sheets-with-collections) help you organize Sheets within each Workbook (like organizing tabs within a binder).
```javascript
// Analytics folder
const marketingDataWorkbook = {
  name: "Marketing Data",
  folder: "Analytics",
  sheets: [campaignMetricsSheet, leadSourcesSheet]
};

const salesReportWorkbook = {
  name: "Sales Report",
  folder: "Analytics",
  sheets: [salesDataSheet, revenueSheet]
};

// Customer Data folder
const onboardingWorkbook = {
  name: "Customer Onboarding",
  folder: "Customer Data",
  sheets: [newCustomersSheet, onboardingStepsSheet]
};

const supportTicketsWorkbook = {
  name: "Support Tickets",
  folder: "Customer Data",
  sheets: [ticketsSheet, resolutionsSheet]
};
```

# Advanced Configuration

Source: https://flatfile.com/docs/embedding/advanced-configuration

Complete configuration reference for embedded Flatfile

This reference covers all configuration options for embedded Flatfile, from basic setup to advanced customization.

## Authentication & Security

### publishableKey

Your publishable key authenticates your application with Flatfile. This key is safe to include in client-side code.

**Where to find it:**

1. Log into [Platform Dashboard](https://platform.flatfile.com)
2. Navigate to **Developer Settings** → **API Keys**
3. Copy your **Publishable Key** (starts with `pk_`)

```javascript
// Example usage
const config = {
  publishableKey: "pk_1234567890abcdef", // Your actual key
};
```

### Security Best Practices

#### Environment Variables

Store your publishable key in environment variables rather than hardcoding:

```javascript
// ✅ Good - using environment variable
const config = {
  publishableKey: process.env.REACT_APP_FLATFILE_KEY,
};

// ❌ Avoid - hardcoded keys
const config = {
  publishableKey: "pk_1234567890abcdef",
};
```

## Common Configuration Options

These options are shared across all SDK implementations:

### Authentication

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `publishableKey` | string | ✅ | Your publishable key from Platform Dashboard |

### User Identity

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `userInfo` | object | ❌ | User metadata for space creation |
| `userInfo.userId` | string | ❌ | Unique user identifier |
| `userInfo.name` | string | ❌ | User's display name - this is displayed in the dashboard as the associated user |
| `userInfo.companyId` | string | ❌ | Company identifier |
| `userInfo.companyName` | string | ❌ | Company display name |
| `externalActorId` | string | ❌ | Unique identifier for embedded users |

### Space Setup

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `name` | string | ✅ | Name of the space |
| `environmentId` | string | ✅ | Environment identifier |
| `spaceId` | string | ❌ | ID of existing space to reuse |
| `workbook` | object | ❌ | Workbook configuration for dynamic spaces |
| `listener` | Listener | ❌ | Event listener for responding to events |

### Look & Feel

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `themeConfig` | object | ❌ | Theme values for Space, sidebar and data table |
| `spaceBody` | object | ❌ | Space options for creating new Space; used with Angular and Vue SDKs |
| `sidebarConfig` | object | ❌ | Sidebar UI configuration |
| `sidebarConfig.defaultPage` | object | ❌ | Landing page configuration |
| `sidebarConfig.showDataChecklist` | boolean | ❌ | Toggle data config, defaults to false |
| `sidebarConfig.showSidebar` | boolean | ❌ | Show/hide sidebar |
| `document` | object | ❌ | Document content for space |
| `document.title` | string | ❌ | Document title |
| `document.body` | string | ❌ | Document body content (markdown) |

### Basic Behavior

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `closeSpace` | object | ❌ | Options for closing iframe |
| `closeSpace.operation` | string | ❌ | Operation type |
| `closeSpace.onClose` | function | ❌ | Callback when space closes |
| `displayAsModal` | boolean | ❌ | Display as modal or inline (default: true) |

## Advanced Configuration Options

These options provide specialized functionality for custom implementations:

### Space Reuse

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `id` | string | ✅ | Space ID |
| `accessToken` | string | ✅ | Access token for space (obtained server-side) |

**Important:** To reuse an existing space, you must retrieve the spaceId and access token server-side using your secret key, then pass the `accessToken` to the client. See [Server Setup Guide](./server-setup) for details.

### UI Overrides

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `mountElement` | string | ❌ | Element to mount Flatfile (default: "flatfile\_iFrameContainer") |
| `loading` | ReactElement | ❌ | Custom loading component |
| `exitTitle` | string | ❌ | Exit dialog title (default: "Close Window") |
| `exitText` | string | ❌ | Exit dialog text (default: "See below") |
| `exitPrimaryButtonText` | string | ❌ | Primary button text (default: "Yes, exit") |
| `exitSecondaryButtonText` | string | ❌ | Secondary button text (default: "No, stay") |
| `errorTitle` | string | ❌ | Error dialog title (default: "Something went wrong") |

### On-Premises Configuration

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `apiUrl` | string | ❌ | API endpoint (default: "https://platform.flatfile.com/api") |
| `spaceUrl` | string | ❌ | Spaces API URL (default: "https://platform.flatfile.com/s") |

URLs for other regions can be found [here](../reference/cli#regional-servers).
## Configuration Examples

### Basic Space Creation

```javascript
const config = {
  publishableKey: "pk_1234567890abcdef",
  name: "Customer Data Import",
  environmentId: "us_env_abc123",
  workbook: {
    // your workbook configuration
  },
  userInfo: {
    userId: "user_123",
    name: "John Doe",
  },
};
```

### Space Reuse with Access Token

```javascript
// Client-side: Use space with access token from server
const config = {
  space: {
    id: "us_sp_abc123def456",
    accessToken: "at_1234567890abcdef", // Retrieved server-side
  },
};
```

### Advanced UI Customization

```javascript
const config = {
  publishableKey: "pk_1234567890abcdef",
  mountElement: "custom-flatfile-container",
  exitTitle: "Are you sure you want to leave?",
  exitText: "Your progress will be saved.",
  themeConfig: {
    // custom theme configuration
  },
};
```

## Troubleshooting

### Invalid publishableKey

**Error:** `"Invalid publishable key"`

**Solution:**

* Verify key starts with `pk_`
* Check for typos or extra spaces
* Ensure key is from correct environment

### Space Not Found

**Error:** `"Space not found"` or `403 Forbidden`

**Solution:**

* Verify Space ID format (`us_sp_` prefix)
* Ensure Space exists and is active
* Check Space permissions in dashboard

### CORS Issues

**Error:** `"CORS policy blocked"`

**Solution:**

* Add your domain to allowed origins in Platform Dashboard
* Ensure you're using publishable key (not secret key)
* Check browser network tab for specific CORS errors

### Access Token Issues

**Error:** `"Invalid access token"` when using space reuse

**Solution:**

* Ensure access token is retrieved server-side using secret key
* Check that token hasn't expired
* Verify space ID matches the token

## Testing Setup

For development and testing:

```javascript
// Development configuration
const config = {
  publishableKey: "pk_test_1234567890abcdef", // publishable key from development environment
};
```

Create separate test Spaces for development to avoid affecting production data.
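One way to keep test and production Spaces from mixing is to select the key at runtime from environment variables. A small sketch; the variable names `FLATFILE_KEY_PROD` and `FLATFILE_KEY_TEST` are assumptions for illustration, not Flatfile conventions:

```javascript
// Sketch: pick the publishable key for the current environment.
function flatfileKeyFor(nodeEnv, prodKey, testKey) {
  // Use the production key only when explicitly running in production;
  // anything else (development, test, undefined) falls back to the test key.
  return nodeEnv === "production" ? prodKey : testKey;
}

const config = {
  publishableKey: flatfileKeyFor(
    process.env.NODE_ENV,
    process.env.FLATFILE_KEY_PROD,
    process.env.FLATFILE_KEY_TEST
  ),
};
```

Defaulting to the test key means a misconfigured environment fails safe rather than writing into production Spaces.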
## Next Steps

Once configured:

* Deploy your event listener to Flatfile
* Configure data validation and transformation rules
* Test the embedding in your application
* Deploy to production with production keys

For server-side space reuse patterns, see our [Server Setup Guide](./server-setup).

# Angular Embedding

Source: https://flatfile.com/docs/embedding/angular

Embed Flatfile in Angular applications

Embed Flatfile in your Angular application using our Angular SDK. This provides Angular components and services for seamless integration.

## Installation

```bash
npm install @flatfile/angular-sdk
```

## Basic Implementation

### 1. Import the Module

Add the `SpaceModule` to your Angular module:

```typescript
import { NgModule } from "@angular/core";
import { SpaceModule } from "@flatfile/angular-sdk";

@NgModule({
  imports: [
    SpaceModule,
    // your other imports
  ],
  // ...
})
export class AppModule {}
```

### 2. Create Component

Create a component to handle the Flatfile embed:

```typescript
import { Component } from "@angular/core";
import { SpaceService, ISpace } from "@flatfile/angular-sdk";

@Component({
  selector: "app-import",
  template: `
    <div>
      <h2>Welcome to our app</h2>
      <button (click)="openFlatfile()">Open Flatfile</button>
    </div>
  `,
})
export class ImportComponent {
  constructor(private spaceService: SpaceService) {}

  spaceProps: ISpace = {
    publishableKey: "pk_your_publishable_key",
    displayAsModal: true,
  };

  openFlatfile() {
    this.spaceService.OpenEmbed(this.spaceProps);
  }
}
```

### 3. Get Your Credentials

**publishableKey**: Get from [Platform Dashboard](https://platform.flatfile.com) → Developer Settings

**Authentication & Security**: For production applications, implement proper authentication and space management on your server. See [Advanced Configuration](./advanced-configuration) for authentication guidance.

## Complete Example

The example below will open an empty space. To create the sheet your users should land on, you'll want to create a workbook as shown further down this page.

```typescript
// app.module.ts
import { NgModule } from "@angular/core";
import { BrowserModule } from "@angular/platform-browser";
import { SpaceModule } from "@flatfile/angular-sdk";
import { AppComponent } from "./app.component";

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, SpaceModule],
  providers: [],
  bootstrap: [AppComponent],
})
export class AppModule {}
```

```typescript
// app.component.ts
import { Component } from "@angular/core";
import { SpaceService, ISpace } from "@flatfile/angular-sdk";

@Component({
  selector: "app-root",
  template: `

My Application

`, }) export class AppComponent { constructor(private spaceService: SpaceService) {} spaceProps: ISpace = { publishableKey: "pk_your_publishable_key", displayAsModal: true, }; openFlatfile() { this.spaceService.OpenEmbed(this.spaceProps); } } ``` ## Creating New Spaces To create a new Space each time: 1. Add a `workbook` configuration object. Read more about workbooks [here](../core-concepts/workbooks). 2. Optionally [deploy](../core-concepts/listeners) a `listener` for custom data processing. Your listener will contain your validations and transformations ```typescript spaceProps: ISpace = { publishableKey: "pk_your_publishable_key", workbook: { name: "My Import", sheets: [ { name: "Contacts", slug: "contacts", fields: [ { key: "name", type: "string", label: "Name" }, { key: "email", type: "string", label: "Email" }, ], }, ], }, displayAsModal: true, }; ``` For detailed workbook configuration, see the [Workbook API Reference](https://reference.flatfile.com/api-reference/workbooks). ## Reusing Existing Spaces For production applications, implement proper space management on your server to ensure security and proper access control: ```typescript // Frontend Component @Component({ selector: "app-import", template: `
`, }) export class ImportComponent { loading = false; constructor(private spaceService: SpaceService, private http: HttpClient) {} async openFlatfile() { this.loading = true; try { // Get space credentials from your server const response = await this.http .get<{ publishableKey: string; spaceId: string; accessToken?: string; }>("/api/flatfile/space") .toPromise(); const spaceProps: ISpace = { space: { spaceId: response.spaceId, accessToken: response.accessToken, }, displayAsModal: true, }; this.spaceService.OpenEmbed(spaceProps); } catch (error) { console.error("Failed to load Flatfile space:", error); } finally { this.loading = false; } } } ``` For server implementation details, see the [Server Setup](/embedding/server-setup) guide. ## Configuration Options For detailed configuration options, authentication settings, and advanced features, see the [Advanced Configuration](./advanced-configuration) guide. ## Using Space Component Directly You can also use the `flatfile-space` component directly in your template: ```typescript @Component({ selector: "app-import", template: ` `, }) export class ImportComponent { showSpace = false; spaceProps: ISpace = { publishableKey: "pk_your_publishable_key", displayAsModal: true, }; toggleSpace() { this.showSpace = !this.showSpace; } onCloseSpace() { this.showSpace = false; } } ``` ## TypeScript Support The Angular SDK is built with TypeScript and includes full type definitions: ```typescript import { ISpace, SpaceService } from "@flatfile/angular-sdk"; interface ImportData { name: string; email: string; } @Component({ // component definition }) export class ImportComponent { spaceProps: ISpace; constructor(private spaceService: SpaceService) { this.spaceProps = { publishableKey: "pk_your_publishable_key", spaceId: "us_sp_your_space_id", }; } } ``` ## Next Steps * **Advanced Configuration**: Set up [authentication, listeners, and advanced options](./advanced-configuration) * **Server Setup**: Implement [backend integration and 
space management](./server-setup)
* **Data Processing**: Set up Listeners in your Space for custom data transformations
* **API Integration**: Use [Flatfile API](https://reference.flatfile.com) to retrieve processed data
* **Angular SDK Documentation**: See [@flatfile/angular-sdk documentation](https://www.npmjs.com/package/@flatfile/angular-sdk)

## Quick Links

Authentication, listeners, and advanced options

Backend integration and space management

## Example Projects

Complete Angular application with Flatfile embedding

# JavaScript Embedding

Source: https://flatfile.com/docs/embedding/javascript
Embed Flatfile using vanilla JavaScript

Embed Flatfile directly into any web application using our JavaScript SDK. This approach works with any framework or vanilla JavaScript setup.

## Quick Links

* [Advanced Configuration](./advanced-configuration) - Complete configuration options reference
* [Server Setup](./server-setup) - Server-side setup for space reuse

## Installation

Add the Flatfile JavaScript SDK to your page with a script tag, or install via npm:

```bash
npm install @flatfile/javascript
```

## Basic Implementation

### 1. HTML Structure

Create a button to trigger the Flatfile embed:

```html
<button id="import-btn">Data Import</button>
```

### 2. Initialize Flatfile

```javascript
document.getElementById("import-btn").addEventListener("click", function () {
  window.FlatFileJavaScript.startFlatfile({});
});
```

### 3. Get Your Credentials

**publishableKey**: Get from [Platform Dashboard](https://platform.flatfile.com) → Developer Settings

```javascript
document.getElementById("import-btn").addEventListener("click", function () {
  window.FlatFileJavaScript.startFlatfile({
    publishableKey: "pk_your_publishable_key",
  });
});
```

For detailed authentication and security guidance, see [Advanced Configuration](./advanced-configuration).

## Complete Example

The example below will open an empty space. To create the sheet your users should land on, you'll want to create a workbook as shown further down this page.

```html
<!doctype html>
<html>
  <head>
    <title>Data Import</title>
    <!-- include the @flatfile/javascript SDK bundle here -->
  </head>
  <body>
    <h1>Welcome to our app</h1>
    <button id="import-btn">Import</button>
    <script>
      document.getElementById("import-btn").addEventListener("click", function () {
        window.FlatFileJavaScript.startFlatfile({
          publishableKey: "pk_your_publishable_key",
        });
      });
    </script>
  </body>
</html>
```

## Creating New Spaces

To create a new Space each time:

1. Add a `workbook` configuration object. Read more about workbooks [here](../core-concepts/workbooks).
2. Optionally [deploy](../core-concepts/listeners) a `listener` for custom data processing. Your listener will contain your validations and transformations.

You can also deploy and create your workbook via a server-side listener as shown in the above link, rather than passing it in as a client-side property.

```javascript
window.FlatFileJavaScript.startFlatfile({
  publishableKey: "pk_your_publishable_key",
  workbook: {
    name: "My Import",
    sheets: [
      {
        name: "Contacts",
        slug: "contacts",
        fields: [
          { key: "name", type: "string", label: "Name" },
          { key: "email", type: "string", label: "Email" },
        ],
      },
    ],
  },
});
```

For detailed workbook configuration, see the [Workbook API Reference](https://reference.flatfile.com/api-reference/workbooks).

## Configuration Options

For complete configuration options including user identity, theming, and advanced features, see [Advanced Configuration](./advanced-configuration).

## Reusing Existing Spaces

To reuse an existing space, you need to set up a server-side component that retrieves the space access token. The basic client-side approach shown above only works for initial space creation. For space reuse, you'll need:

1. **Server-side setup**: Use your secret key to retrieve the space and its access token
2. **Client-side update**: Use the access token instead of just the publishable key

```javascript
// Client-side: Get space data from your server
const spaceData = await fetch("/api/spaces/us_sp_your_space_id").then((r) =>
  r.json()
);

// Use the space with access token
window.FlatFileJavaScript.startFlatfile({
  space: {
    id: spaceData.id,
    accessToken: spaceData.accessToken, // Retrieved from server
  },
});
```

For complete server-side implementation, see [Server Setup](./server-setup).
## Next Steps

* **Advanced Configuration**: See [Advanced Configuration](./advanced-configuration) for complete options reference
* **Server Setup**: Set up [Server Setup](./server-setup) for space reuse functionality
* **Styling**: Customize the embedded experience in your Platform Dashboard Space settings
* **Data Processing**: Set up Listeners in your Space for custom data transformations
* **API Integration**: Use [Flatfile API](https://reference.flatfile.com) to retrieve processed data

## Example Projects

Complete working example with HTML and JavaScript

# Overview

Source: https://flatfile.com/docs/embedding/overview
Embed Flatfile data import directly into your application

Flatfile can be embedded directly into your web application, allowing users to import data without leaving your interface. This creates a seamless experience where data import feels like a native part of your app.

## When to Embed Flatfile

Embed Flatfile when you want:

* **A seamless user experience** - Keep users within your application flow
* **Custom branding** - Maintain your application's look and feel
* **Integrated workflows** - Connect data import directly to your business logic
* **Control over timing** - Trigger imports at the right moment in your user journey

## How Embedding Works

Flatfile embedding uses an iframe-based approach that loads the Flatfile interface within your application. You provide credentials and configuration to load a pre-configured Space or create a new Space with your data schema and workflows. The embedded experience handles file upload, data mapping, validation, and transformation automatically based on your Space configuration.

## Available SDKs

Choose the SDK that matches your frontend framework:

* Vanilla JavaScript for any web application
* React components and hooks
* Angular components and services
* Vue.js components and composables

## Getting Started

1. **Create a Space** in the [Platform Dashboard](https://platform.flatfile.com) with your data schema
2. **Get your credentials** from Developer Settings
3. **Choose your SDK** and follow the integration guide

## Configuration & Setup

Complete configuration options reference

Server-side setup for space reuse

## Two Approaches: Existing vs Dynamic Spaces

These guides show both approaches:

* **Existing Spaces**: Connect to a pre-existing Space
* **Dynamic Spaces**: Create a new Space programmatically

Choose the approach that best fits your workflow. See the implementation instructions for both patterns in each SDK guide.

For advanced configuration options like themes, custom workflows, and data transformations, see [Advanced Configuration](./advanced-configuration) or configure these in your Space through the Platform Dashboard.

# React Embedding

Source: https://flatfile.com/docs/embedding/react
Embed Flatfile in React applications

Embed Flatfile in your React application using our React SDK. This provides React components and hooks for seamless integration.

## Installation

```bash
npm install @flatfile/react
```

## Basic Implementation

### 1. Wrap Your App

Wrap your application with the `FlatfileProvider`:

```jsx
import { FlatfileProvider } from "@flatfile/react";

function App() {
  return (
    <FlatfileProvider publishableKey="pk_your_publishable_key">
      {/* your app */}
    </FlatfileProvider>
  );
}
```

### 2. Add Import Trigger

Use the `useFlatfile` hook to open Flatfile:

```jsx
import { useFlatfile } from "@flatfile/react";

function YourComponent() {
  const { openPortal } = useFlatfile();

  const handleImport = () => {
    openPortal({});
  };

  return (
    <div>
      <p>Welcome to our app</p>
      <button onClick={handleImport}>Import Data</button>
    </div>
  );
}
```

### 3. Get Your Credentials

**publishableKey**: Get from [Platform Dashboard](https://platform.flatfile.com) → Developer Settings

For authentication guidance including JWT tokens and user management, see [Advanced Configuration](./advanced-configuration).

## Complete Example

The example below will open an empty space. To create the sheet your users should land on, you'll want to create a workbook as shown further down this page.

```jsx
import React from "react";
import { FlatfileProvider, useFlatfile } from "@flatfile/react";

function ImportButton() {
  const { openPortal } = useFlatfile();

  const handleImport = () => {
    openPortal({});
  };

  return <button onClick={handleImport}>Import Data</button>;
}

function App() {
  return (
    <FlatfileProvider publishableKey="pk_your_publishable_key">
      <h1>My Application</h1>
      <ImportButton />
    </FlatfileProvider>
  );
}

export default App;
```

## Configuration Options

For all configuration options including authentication, theming, and advanced settings, see [Advanced Configuration](./advanced-configuration).

## Using Space Component

For more control, you can use the `Space` component directly:

```jsx
import { FlatfileProvider, Space } from "@flatfile/react";

function App() {
  return (
    <FlatfileProvider publishableKey="pk_your_publishable_key">
      <Space />
    </FlatfileProvider>
  );
}
```

## Creating New Spaces

To create a new Space each time:

1. Add a `workbook` configuration object. Read more about workbooks [here](../core-concepts/workbooks).
2. Optionally [deploy](../core-concepts/listeners) a `listener` for custom data processing. Your listener will contain your validations and transformations.

```jsx
import { FlatfileProvider, Workbook } from "@flatfile/react";

const workbookConfig = {
  name: "My Import",
  sheets: [
    {
      name: "Contacts",
      slug: "contacts",
      fields: [
        { key: "name", type: "string", label: "Name" },
        { key: "email", type: "string", label: "Email" },
      ],
    },
  ],
};

function App() {
  return (
    <FlatfileProvider publishableKey="pk_your_publishable_key">
      <Workbook config={workbookConfig} />
    </FlatfileProvider>
  );
}
```

For detailed workbook configuration, see the [Workbook API Reference](https://reference.flatfile.com/api-reference/workbooks).
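Before handing a workbook configuration to the provider, it can be worth a quick structural check. The helper below is our own illustration (not part of `@flatfile/react`); it only validates the shape used in the example above, such as unique field keys per sheet.

```javascript
// Hypothetical sanity check for a workbook config object (not an SDK function).
// Verifies the workbook has a name and each sheet has a slug and unique field keys.
function validateWorkbookConfig(config) {
  const errors = [];
  if (!config.name) errors.push("workbook is missing a name");
  for (const sheet of config.sheets || []) {
    if (!sheet.slug) errors.push(`sheet "${sheet.name}" is missing a slug`);
    const keys = (sheet.fields || []).map((f) => f.key);
    const dupes = keys.filter((k, i) => keys.indexOf(k) !== i);
    if (dupes.length > 0) {
      errors.push(`sheet "${sheet.name}" has duplicate field keys: ${dupes.join(", ")}`);
    }
  }
  return errors;
}

const workbookConfig = {
  name: "My Import",
  sheets: [
    {
      name: "Contacts",
      slug: "contacts",
      fields: [
        { key: "name", type: "string", label: "Name" },
        { key: "email", type: "string", label: "Email" },
      ],
    },
  ],
};

console.log(validateWorkbookConfig(workbookConfig)); // []
```

An empty array means the config passed the check; anything else lists what to fix before mounting the provider.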
## Reusing Existing Spaces

For production applications, you should create Spaces on your server and pass the Space ID to your React application:

### Server-side (Node.js/Express)

```javascript
// Create Space on your server
const space = await api.spaces.create({
  name: "User Import",
  workbooks: [workbookConfig],
});

// Pass spaceId to your React app
res.json({ spaceId: space.data.id, token: space.data.accessToken });
```

### Client-side (React)

```jsx
import { useEffect, useState } from "react";
import { useFlatfile } from "@flatfile/react";

function ImportComponent() {
  const { openPortal } = useFlatfile();
  const [spaceId, setSpaceId] = useState(null);
  const [token, setToken] = useState(null);

  useEffect(() => {
    // Get the Space ID and access token from your server
    fetch("/api/create-space")
      .then((res) => res.json())
      .then((data) => {
        setSpaceId(data.spaceId);
        setToken(data.token);
      });
  }, []);

  const handleImport = () => {
    if (spaceId) {
      openPortal({});
    }
  };

  return <button onClick={handleImport}>Import Data</button>;
}
```

This approach allows you to:

* Create Spaces with server-side authentication
* Add custom event listeners and data processing
* Control Space lifecycle and cleanup

For complete server setup examples, see [Server Setup](/embedding/server-setup).
## TypeScript Support

The React SDK includes full TypeScript support:

```tsx
import { FlatfileProvider, useFlatfile } from "@flatfile/react";

interface Props {
  onDataImported?: (data: any) => void;
}

function ImportComponent({ onDataImported }: Props) {
  const { openPortal } = useFlatfile();

  const handleImport = (): void => {
    openPortal({});
  };

  return <button onClick={handleImport}>Import Data</button>;
}
```

## Next Steps

* **Advanced Configuration**: [Authentication, theming, and advanced options](./advanced-configuration)
* **Server Setup**: [Backend integration and event handling](./server-setup)
* **API Integration**: Use [Flatfile API](https://reference.flatfile.com) to retrieve processed data
* **Package Documentation**: See [@flatfile/react documentation](https://www.npmjs.com/package/@flatfile/react)

## Quick Links

Authentication, theming, and advanced options

Backend integration and event handling

## Example Projects

Complete React application with Flatfile embedding

# Server Setup for Space Reuse

Source: https://flatfile.com/docs/embedding/server-setup
Server-side implementation guide for reusing existing spaces

To reuse existing spaces with embedded Flatfile, you need a server-side component that retrieves space access tokens using your secret key. This guide shows you how to set up the server-side pattern correctly.

## Why Server-Side Setup is Required

When reusing an existing space, you cannot use your publishable key directly. Instead, you must:

1. **Server-side**: Use your secret key to retrieve the space and its access token
2. **Client-side**: Use the access token (not publishable key) to launch the space

This pattern ensures security by keeping your secret key on the server while providing the necessary access token to the client.
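One cheap guard against leaking credentials across that server/client boundary: Flatfile secret keys are prefixed `sk_` and publishable keys `pk_`, so a response helper can refuse to serialize anything that looks like a secret key. This is a sketch of our own (the helper name is not part of any SDK):

```javascript
// Hypothetical guard: refuse to send secret-key material to the browser.
// Flatfile secret keys start with "sk_"; publishable keys start with "pk_".
function assertClientSafe(payload) {
  const leaked = Object.entries(payload).filter(
    ([, value]) => typeof value === "string" && value.startsWith("sk_")
  );
  if (leaked.length > 0) {
    throw new Error(
      `refusing to send secret key(s) to the client: ${leaked.map(([k]) => k).join(", ")}`
    );
  }
  return payload;
}

// OK: a space id and its access token are meant for the client
assertClientSafe({ spaceId: "us_sp_abc123def456", accessToken: "temporary-token" });

// Throws: a secret key must never leave the server
// assertClientSafe({ apiKey: "sk_1234567890abcdef" });
```

Calling this just before `res.json(...)` turns an accidental secret-key leak into a loud server error instead of a silent disclosure.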
## Server-Side Implementation

### Environment Setup

Create a `.env` file with your credentials:

```bash
# .env
FLATFILE_API_KEY=sk_1234567890abcdef   # Your secret key
SPACE_ID=us_sp_abc123def456            # Space ID to reuse
BASE_URL=http://localhost:3000         # Your application URL
```

### Basic Server Example

Here's a Node.js/Express server that retrieves space access tokens:

```javascript
// server.js
import express from "express";
import cors from "cors";
import { FlatfileClient } from "@flatfile/api";
import dotenv from "dotenv";

dotenv.config();

const app = express();
const PORT = process.env.PORT || 3001;

// Initialize Flatfile API with secret key
const flatfile = new FlatfileClient({
  token: process.env.FLATFILE_API_KEY, // Secret key
});

// Enable CORS for your frontend
app.use(
  cors({
    origin: process.env.BASE_URL || "http://localhost:3000",
  })
);
app.use(express.json());

// Endpoint to retrieve space with access token
app.get("/api/spaces/:spaceId", async (req, res) => {
  try {
    const { spaceId } = req.params;

    // Retrieve space using secret key
    const space = await flatfile.spaces.get(spaceId);

    // Return space data including access token
    res.json({
      success: true,
      data: space.data, // Contains both id and accessToken
    });
  } catch (error) {
    console.error("Error retrieving space:", error);
    res.status(500).json({
      success: false,
      error: "Failed to retrieve space",
    });
  }
});

// Health check endpoint
app.get("/health", (req, res) => {
  res.json({ status: "OK" });
});

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
```

### Enhanced Server with Multiple Spaces

For applications that need to handle multiple spaces:

```javascript
// enhanced-server.js
import express from "express";
import cors from "cors";
import { FlatfileClient } from "@flatfile/api";
import dotenv from "dotenv";

dotenv.config();

const app = express();
const flatfile = new FlatfileClient({
  token: process.env.FLATFILE_API_KEY,
});

app.use(
  cors({
    origin: process.env.BASE_URL || "http://localhost:3000",
  })
);
app.use(express.json());

// Get space by ID
app.get("/api/spaces/:spaceId", async (req, res) => {
  try {
    const { spaceId } = req.params;
    const space = await flatfile.spaces.get(spaceId);

    res.json({
      success: true,
      data: {
        id: space.data.id,
        name: space.data.name,
        accessToken: space.data.accessToken,
      },
    });
  } catch (error) {
    res.status(500).json({
      success: false,
      error: "Failed to retrieve space",
    });
  }
});

// List available spaces
app.get("/api/spaces", async (req, res) => {
  try {
    const spaces = await flatfile.spaces.list();

    res.json({
      success: true,
      data: spaces.data.map((space) => ({
        id: space.id,
        name: space.name,
        createdAt: space.createdAt,
      })),
    });
  } catch (error) {
    res.status(500).json({
      success: false,
      error: "Failed to list spaces",
    });
  }
});

// Create new space (optional)
app.post("/api/spaces", async (req, res) => {
  try {
    const { name, environmentId } = req.body;
    const space = await flatfile.spaces.create({
      name,
      environmentId,
      // Additional space configuration
    });

    res.json({
      success: true,
      data: {
        id: space.data.id,
        name: space.data.name,
        accessToken: space.data.accessToken,
      },
    });
  } catch (error) {
    res.status(500).json({
      success: false,
      error: "Failed to create space",
    });
  }
});

const PORT = process.env.PORT || 3001;
app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
```

## Client-Side Implementation

### Fetching Space Data

Your client application should fetch the space data from your server:

```javascript
// client.js
async function getSpaceData(spaceId) {
  try {
    const response = await fetch(`/api/spaces/${spaceId}`);
    const result = await response.json();

    if (!result.success) {
      throw new Error(result.error);
    }

    return result.data; // Contains id and accessToken
  } catch (error) {
    console.error("Failed to fetch space data:", error);
    throw error;
  }
}
```

### Using the Space Data

Once you have the space data, use it to initialize Flatfile:

```javascript
window.openFlatfile = () => {
  fetch(server_url + "/space") // Make a request to the server endpoint
    .then((response) => response.json())
    .then((space) => {
      const flatfileOptions = {
        space: {
          id: space && space.data && space.data.id,
          accessToken: space && space.data && space.data.accessToken,
        },
        sidebarConfig: {
          showSidebar: false,
        },
        // Additional props...
      };
      initializeFlatfile(flatfileOptions);
    })
    .catch((error) => {
      console.error("Error retrieving space in client:", error);
    });
};
```

## Complete Workflow Example

Here's how the complete workflow works:

### 1. Server Setup

```javascript
// server.js
app.get("/api/spaces/:spaceId", async (req, res) => {
  const space = await flatfile.spaces.get(req.params.spaceId);
  res.json({ data: space.data });
});
```

### 2. Client Request

```javascript
// client.js
const response = await fetch("/api/spaces/us_sp_abc123def456");
const { data } = await response.json();
```

### 3. Client Usage

```javascript
const flatfileOptions = {
  space: {
    id: data.id,
    accessToken: data.accessToken,
  },
  sidebarConfig: {
    showSidebar: false,
  },
  // Additional props...
};

initializeFlatfile(flatfileOptions);
```

## Security Considerations

### Environment Variables

Never expose your secret key in client-side code:

```javascript
// ✅ Good - server-side only
const flatfile = new FlatfileClient({
  token: process.env.FLATFILE_API_KEY, // Secret key on server
});

// ❌ Bad - never do this in client code
const flatfile = new FlatfileClient({
  token: "sk_1234567890abcdef", // Secret key exposed
});
```

### CORS Configuration

Configure CORS to only allow your frontend domain:

```javascript
app.use(
  cors({
    origin: process.env.BASE_URL, // Only your frontend
    credentials: true,
  })
);
```

### Access Token Handling

* Access tokens are temporary and space-specific
* They should be fetched fresh for each session
* Store them securely in your client application

## Error Handling

### Server-Side Errors

```javascript
app.get("/api/spaces/:spaceId", async (req, res) => {
  try {
    const space = await flatfile.spaces.get(req.params.spaceId);
    res.json({ success: true, data: space.data });
  } catch (error) {
    if (error.status === 404) {
      res.status(404).json({
        success: false,
        error: "Space not found",
      });
    } else if (error.status === 403) {
      res.status(403).json({
        success: false,
        error: "Access denied",
      });
    } else {
      res.status(500).json({
        success: false,
        error: "Server error",
      });
    }
  }
});
```

### Client-Side Errors

```javascript
async function getSpaceData(spaceId) {
  try {
    const response = await fetch(`/api/spaces/${spaceId}`);

    if (!response.ok) {
      throw new Error(`HTTP ${response.status}: ${response.statusText}`);
    }

    const result = await response.json();

    if (!result.success) {
      throw new Error(result.error);
    }

    return result.data;
  } catch (error) {
    console.error("Failed to fetch space data:", error);
    throw error;
  }
}
```

## Testing

### Local Development

1. Start your server: `node server.js`
2. Test the endpoint: `curl http://localhost:3001/api/spaces/us_sp_abc123def456`
3. Verify the response includes `id` and `accessToken`

### Production Deployment

1.
Set environment variables on your server
2. Deploy your server application
3. Update your client to use the production server URL
4. Test the complete workflow

## Next Steps

Once your server is set up:

* Update your client applications to use the server endpoint
* Test the space reuse functionality
* Monitor server logs for any issues
* Consider adding authentication for your server endpoints

For client-side SDK implementation, see the framework-specific guides:

* [JavaScript SDK](./javascript)
* [React SDK](./react)
* [Vue SDK](./vue)
* [Angular SDK](./angular)

# Vue Embedding

Source: https://flatfile.com/docs/embedding/vue
Embed Flatfile in Vue.js applications

Embed Flatfile in your Vue.js application using our Vue SDK. This provides Vue components and composables for seamless integration.

## Installation

```bash
npm install @flatfile/vue
```

## Basic Implementation

### 1. Initialize Flatfile

Use the `initializeFlatfile` composable in your Vue component:

```vue
```

### 2. Get Your Credentials

**publishableKey**: Get from [Platform Dashboard](https://platform.flatfile.com) → Developer Settings

For authentication and security guidance, see [Advanced Configuration](./advanced-configuration).

## Complete Example

The example below will open an empty space. To create the sheet your users should land on, you'll want to create a workbook as shown further down this page.

```vue
```

## Options API Syntax

If you prefer the Options API:

```vue
```

## Creating New Spaces

To create a new Space each time:

1. Add a `workbook` configuration object. Read more about workbooks [here](../core-concepts/workbooks).
2. Optionally [deploy](../core-concepts/listeners) a `listener` for custom data processing. Your listener will contain your validations and transformations.

```vue
```

For detailed workbook configuration, see the [Workbook API Reference](https://reference.flatfile.com/api-reference/workbooks).
## Reusing Existing Spaces

When reusing existing Spaces, the proper pattern is to let your server handle Space creation and management:

```vue
```

For server-side Space creation and management patterns, see [Server Setup](/embedding/server-setup).

## TypeScript Support

The Vue SDK supports TypeScript:

```vue
```

## Styling

Add custom styles to integrate with your Vue application:

```vue
```

## Configuration Options

For complete configuration options including authentication, theming, and advanced settings, see [Advanced Configuration](./advanced-configuration).

## Next Steps

* **Authentication**: Set up secure authentication with [Advanced Configuration](./advanced-configuration)
* **Server Setup**: Configure backend data processing with [Server Setup](./server-setup)
* **Styling**: Customize the embedded experience in your Platform Dashboard Space settings
* **API Integration**: Use [Flatfile API](https://reference.flatfile.com) to retrieve processed data

## Quick Links

Authentication, theming, and advanced options

Backend integration and data processing

## Example Projects

Complete Vue.js application with Flatfile embedding

# Getting Started with AutoBuild

Source: https://flatfile.com/docs/getting-started/quickstart/autobuild
Get up and running with Flatfile in minutes using AutoBuild to create a complete data import solution

## What is AutoBuild?

The easiest way to get started with Flatfile is using AutoBuild. With AutoBuild, you can transform existing import templates or documentation into a fully functional Flatfile app in minutes. Simply drop your example files into AutoBuild, and it will automatically create and deploy a [Blueprint](/core-concepts/blueprints) (for schema definition) and a [Listener](/core-concepts/listeners) (for validations and transformations) to your Flatfile [App](/core-concepts/apps).

Once you've started with AutoBuild, you can always download your Listener code and continue building with code from there!
## Setting Up Your Account

To get started, you'll need to [sign up for a Flatfile account](https://platform.flatfile.com/oauth/login). During account setup, enter your company name and select "Start with an existing template or project file."

If you already have an active Flatfile account, you can still use AutoBuild to create a new app. From the Flatfile dashboard, click the "New App" button, then select "Build with AutoBuild."

If the AutoBuild option isn't available on your account, please reach out to support via [Slack](https://flatfile.com/join-slack/) or [Email](mailto:support@flatfile.com) to gain access!

## Uploading Files and Context

Next, you'll upload files and provide additional context to the AutoBuild agent. You can upload any of the following to help the AI understand your requirements:

* Import templates
* System documentation
* Complete data files
* Any other files that provide useful context

You may also provide an additional prompt to guide the AutoBuild agent. Use this to give context about your uploaded files, explain specific data challenges, or outline additional requirements.

When you're ready, click "Get Started." The Flatfile AutoBuild agent will now build your space template.

## Working in Build Mode

After a few moments, you'll be taken to your new Flatfile app in Build Mode, which you can access anytime to make changes.

On the right side, you'll see the blueprint of your space. Here you can inspect and edit the sheets and fields that the AutoBuild agent has generated. You can easily add or remove fields, update constraints and validations, or make other basic edits to your blueprint.

For more advanced changes, you can chat with the Flatfile Assistant. The Assistant can help you with anything from small tweaks to complex validations, data egress actions, or large reorganizations of your sheets.

At any point, you can check the Data Preview tab to see what your Flatfile project will look like for your users.
You can add or edit data to test your validations and transformations.

## Deploying Your App

When you're finished building your space, click "Configure & Deploy." You'll be prompted to give your app a name, and then it's ready to be deployed!

From here, you'll be taken to your new app in the dashboard. Your AutoBuild agent is deployed, and you're ready to create your first project and start importing data!

# Creating Your First Project

Source: https://flatfile.com/docs/getting-started/quickstart/first-project
Running through the import flow with your Flatfile App

If you haven't created an app yet, start with the [Building Your First App with AutoBuild guide](/getting-started/quickstart/autobuild).

## Understanding Projects

With your app created, it's time to create your first data import project! A project is an instance of your app - think of your app as a template for onboarding customers, and you'll create a project for each customer you onboard. Each project gives you an isolated workspace with its own database, permissions, and workflow.

## Creating a New Project

To create a project, click the create button in the upper right corner of the app page in your dashboard. The text in this button will depend on how you've named your app's entity.

The AutoBuild agent will create a new space based on your app's template. You'll be taken to your new space, ready to receive data. If you'd like, you can manually import data using Flatfile's intuitive spreadsheet interface.

## Uploading and Mapping Data

Most users will start by uploading a data file. You can drag and drop a file from your computer directly onto the sheet interface. The file will automatically be extracted, and you'll be taken to the mapping interface. Here you can map columns from your uploaded file to fields in your source sheet.

With your mappings in place, click "Continue." The data from your file will be mapped into your sheet.
## Learn More

Now that you've created your first project, explore these helpful resources:

* [Core Concepts](/core-concepts/overview) - Understand Flatfile's fundamental building blocks
* [Handling Data](/guides/using-record-hooks) - Advanced data transformation and validation techniques
* [Using Actions](/guides/using-actions) - Create custom workflows and automations
* [API Reference](/api-reference) - Complete technical documentation

## Working with Your Data

From here, any transformations and validations you've defined will run automatically. You can resolve any data issues in your sheet using AI transforms, find and replace, custom actions, or by simply editing the data manually. You can collaborate with others on data issues using comments and data clips.

## Exporting Your Data

When you're ready to move your perfected data out of Flatfile, you've got options! You can download your data directly from the sheets interface, retrieve the sheet data via API, or create a custom action to ship the data directly into your system.

# Accepting Additional Fields

Source: https://flatfile.com/docs/guides/accepting-additional-fields
Create additional fields on the fly

The `allowAdditionalFields` feature offers a fluid integration experience, allowing users to effortlessly map to new or unconfigured fields in your Blueprints.

## How it works

* By enabling `allowAdditionalFields`, your Sheet isn't restricted to the initial configuration. It can adapt to include new fields, whether they're anticipated or not.
* These supplementary fields can either be added through API calls or input directly by users during the file import process.
* To ensure clarity, any field that wasn't part of the original Blueprint configuration is flagged with a `treatment` property labeled `user-defined`.
* When adding a custom field, there's no need to fuss over naming the field. The system intuitively adopts the header name from the imported file, streamlining the process.
In essence, the `allowAdditionalFields` feature is designed for scalability and ease, ensuring your Blueprints are always ready for unexpected data fields. ## Example Blueprint w/ `allowAdditionalFields` ```json { "sheets": [ { "name": "Contacts", "slug": "contacts", "allowAdditionalFields": true, "fields": [ { "key": "firstName", "label": "First Name", "type": "string" }, { "key": "lastName", "label": "Last Name", "type": "string" }, { "key": "email", "label": "Email", "type": "string" } ] } ] } ``` # Advanced Filters Source: https://flatfile.com/docs/guides/advanced-filters Learn how to use Flatfile's Advanced Filters to efficiently filter and search through your data Advanced Filters in Flatfile provide a powerful way to filter and search through your data with complex conditions. This feature allows you to create sophisticated filter combinations to quickly find the exact records you need. ## Overview The Advanced Filters feature enables you to: * Create multiple filter conditions with different fields * Combine conditions using logical operators (AND/OR) * Filter by various data types with appropriate operators * Save and reuse filter combinations * Apply filters to large datasets efficiently ## Using Advanced Filters ### Accessing Advanced Filters You can access Advanced Filters in the Flatfile interface through the Filter button in the sheet toolbar: 1. Navigate to any sheet in your workbook 2. Click the "Filter" button in the toolbar 3. Select a field to filter by, or click "Advanced filter" to create a complex filter ### Creating Filter Conditions Each filter condition consists of three parts: 1. **Field** - The column you want to filter on 2. **Operator** - The comparison type (equals, contains, greater than, etc.) 3. **Value** - The specific value to filter by For example, you might create a filter like: `firstName is "John"` or `age > 30`. ### Combining Multiple Filters Advanced Filters allow you to combine multiple conditions: 1. 
Create your first filter condition 2. Click the "Add condition" button 3. Select whether to join with "AND" or "OR" logic 4. Add your next condition This allows for complex queries like: `firstName is "John" AND age > 30` or `status is "pending" OR status is "review"`. ### Available Operators Different field types support different operators: | Field Type | Available Operators | | ---------- | ----------------------------------------------- | | String | is, is not, like, is empty, not empty | | Number | is, is not, >, \<, >=, \<=, is empty, not empty | | Boolean | is true, is false, is empty, not empty | | Date | is, is not, >, \<, >=, \<=, is empty, not empty | | Enum | is, is not, is empty, not empty | ### Horizontal Scrolling When you add multiple filter conditions that extend beyond the available width of the screen, the filter area will automatically enable horizontal scrolling. This allows you to create complex filter combinations without being limited by screen space. Simply scroll horizontally to see all your filter conditions when they extend beyond the visible area. ## Advanced Filter Examples Here are some examples of how you might use Advanced Filters: ### Example 1: Finding Specific Customer Records ``` firstName is "Sarah" AND status is "active" AND lastPurchase > "2023-01-01" ``` This filter would show all active customers named Sarah who made a purchase after January 1, 2023. ### Example 2: Identifying Records Needing Attention ``` (status is "pending" OR status is "review") AND createdDate < "2023-06-01" ``` This filter would show all records that are either pending or in review, and were created before June 1, 2023. ### Example 3: Finding Missing Data ``` email is not empty AND phone is empty ``` This filter would show all records that have an email address but are missing a phone number. 
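Filter expressions like the ones above can also be assembled programmatically. Flatfile exposes a query parameter (`q`) on its records API that accepts Flatfile Query Language (FFQL) strings. The helper below is an illustrative sketch, not the UI filter syntax: the operator spellings (`eq`, `gt`), the sheet ID, and the `q` usage shown in the comment are assumptions to verify against the FFQL reference.

```javascript
// Illustrative helper: compose a Flatfile Query Language (FFQL) string from
// simple conditions. Operator names ("eq", "gt") and passing the result as
// the `q` parameter to the records API are assumptions to verify against
// the FFQL reference -- this is not the UI filter syntax.
function ffql(conditions, joiner = "and") {
  return conditions
    .map(({ field, op, value }) =>
      typeof value === "number"
        ? `${field} ${op} ${value}`
        : `${field} ${op} "${value}"`
    )
    .join(` ${joiner} `);
}

const query = ffql([
  { field: "firstName", op: "eq", value: "Sarah" },
  { field: "age", op: "gt", value: 30 },
]);
console.log(query); // firstName eq "Sarah" and age gt 30

// Hypothetical usage with the API client (sheet ID is a placeholder):
// const { data } = await api.records.get("us_sh_example", { q: query });
```

Building the query string separately from the API call keeps the filter logic easy to unit-test.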
## Best Practices * **Start simple**: Begin with a single filter condition and add more as needed * **Use AND/OR strategically**: "AND" narrows results (both conditions must be true), while "OR" broadens results (either condition can be true) * **Consider performance**: Very complex filters on large datasets may take longer to process * **Save common filters**: If you frequently use the same filter combinations, consider saving them as views ## Troubleshooting If you encounter issues with Advanced Filters: * Ensure your filter values match the expected format for the field type * Check that you're using appropriate operators for each field type * For complex filters, try breaking them down into simpler components to identify issues * Verify that the data you're filtering actually exists in your dataset Advanced Filters provide a powerful way to work with your data in Flatfile, allowing you to quickly find and focus on the records that matter most to your workflow. # Authentication and Authorization Source: https://flatfile.com/docs/guides/authentication Complete guide to authenticating with Flatfile using API keys, Personal Access Tokens, and managing roles and permissions This guide covers all aspects of authentication with Flatfile, including API keys, Personal Access Tokens, and role-based access control for your team and customers. ## API Keys Flatfile provides two different kinds of environment-specific API keys you can use to interact with the API. In addition, you can work with keys from either a development or a production Environment. API keys are created automatically. Use the [API Keys and Secrets](https://platform.flatfile.com/dashboard/keys-and-secrets) page to see your API keys for any given Environment. ### Testing and Development [Environments](/core-concepts/environments) are isolated entities and are intended to be a safe place to create and test different configurations. A `development` and `production` environment are created by default. 
| isProd | Name | Description | | ------- | ------------- | ------------------------------------------------------------------------------------------- | | *false* | `development` | Use this default environment, and its associated test API keys, as you build with Flatfile. | | *true* | `production` | When you're ready to launch, create a new environment and swap out your keys. | The development environment does not count towards your paid credits. ### Secret and Publishable Keys All Accounts have two key types for each environment. Learn when to use each type of key: | Type | Id | Description | | --------------- | ---------------------- | ----------------------------------------------------------------------------------------------------------------------- | | Secret key | `sk_23ghsyuyshs7dcrty` | **On the server-side:** Store this securely in your server-side code. Don't expose this key in an application. | | Publishable key | `pk_23ghsyuyshs7dcert` | **On the client-side:** Can be publicly-accessible in your application's client-side code. Use when embedding Flatfile. | The `accessToken` provided from `publishableKey` will remain valid for a duration of 24 hours. ## Personal Access Tokens Personal Access Tokens (PATs) provide a secure way to authenticate with the Flatfile API. Unlike environment-specific API keys, PATs are user-scoped tokens that inherit the permissions of the user who created them. Personal Access Tokens: * Are user-scoped authentication tokens * Have the same auth scope as the user who created them * Can be used in place of a JWT for API authentication * Are ideal for scripts, automation, and integrations that need to act on behalf of a user This opens up possibilities for various use cases, including building audit logs, managing Spaces, and monitoring agents across environments. ### Creating a Token 1. Log in to your Flatfile account 2. Click on your user profile dropdown in the top-right corner 3. Select "Personal Access Tokens" 4. 
Click "Create Token" 5. Enter a descriptive name for your token 6. Copy the generated token immediately - it will only be shown once Make sure to copy your token when it's first created. For security reasons, you won't be able to view the token again after leaving the page. ### Exchanging Credentials for an Access Token You can exchange your email and password credentials for an access token using the auth endpoint. See the [Authentication Examples](/guides/deeper/auth-examples#creating-a-pat-via-api) for the complete API call. The response will include an access token that you can use for API authentication. ### Retrieving a Personal Access Token (Legacy Method) Your `publishableKey` and `secretKey` are specific to an environment. Therefore, to interact at a higher level, you can use a personal access token. 1. From the dashboard, open **Settings** 2. Click to **Personal Tokens** 3. Retrieve your `clientId` and `secret`. 4. Using the key pair, call the auth endpoint. See the [Authentication Examples](/guides/deeper/auth-examples#legacy-client-credentials-flow) for the complete API call. 5. The response will include an `accessToken`. Present that as your **Bearer `token`** in place of the `secretKey`. ### Using a Token Use your Personal Access Token in API requests by including it in the Authorization header as documented in the [API Reference](https://reference.flatfile.com). ### Managing Tokens You can view all your active tokens in the Personal Access Tokens page. For each token, you can see: * Name * Creation date * Last used date (if applicable) To delete a token: 1. Navigate to the Personal Access Tokens page 2. Find the token you want to delete 3. Click the menu icon (three dots) next to the token 4. Select "Delete" 5. Confirm the deletion Deleting a token immediately revokes access for any applications or scripts using it. Make sure you update any dependent systems before deleting a token. 
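In practice, using a token means sending it as a bearer credential on every request. Here is a minimal sketch; the base URL and `/spaces` endpoint path shown are assumptions to confirm against the API Reference, and `FLATFILE_PAT` is a hypothetical environment variable name.

```javascript
// Build the headers for an authenticated Flatfile API request.
// Read the token from the environment -- never hard-code it.
function authHeaders(token) {
  return {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };
}

// Hypothetical usage: list Spaces with a Personal Access Token.
// The base URL and endpoint path are assumptions -- check the API Reference.
async function listSpaces() {
  const response = await fetch("https://platform.flatfile.com/api/v1/spaces", {
    headers: authHeaders(process.env.FLATFILE_PAT),
  });
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}
```

The same header shape works whether the token is a PAT or a secret key.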
### Best Practices * Create separate tokens for different applications or use cases * Use descriptive names that identify where the token will be used * Regularly review and delete unused tokens * Rotate tokens periodically for enhanced security * Never share your tokens with others - each user should create their own tokens ### Example Use Cases #### Building an Audit Log Query for all events across all environments and combine them with user and guest data to create a comprehensive audit log, providing a detailed history of actions within the application. #### Managing Spaces Across Environments Determine the number of Spaces available and identify which Spaces exist in different environments, allowing you to efficiently manage and organize your data. #### Monitoring Agents Across Environments Keep track of agents deployed to various environments by retrieving information about their presence, ensuring smooth and efficient data import processes. ## Roles & Permissions Grant your team and customers access with role-based permissions. ### Administrator Roles Administrator roles have full access to your accounts, including inviting additional admins and seeing developer keys. The `accessToken` provided will remain valid for a duration of 24 hours. | Role | Details | | ------------- | ------------------------------------------------------------------------------------------------------------------------------------ | | Administrator | This role is meant for any member of your team who requires full access to the Account.<br/>✓ Can add other administrators<br/>✓ Can view secret keys<br/>✓ Can view logs | ### Guest Roles Guest roles receive access via a magic link or a shared link depending on the [Environment](https://platform.flatfile.com/dashboard) `guestAuthentication` type. Guest roles can invite other Guests unless you turn off this setting in the [Guest Sidebar](/guides/customize-guest-sidebar). The `accessToken` provided will remain valid for a duration of 1 hour. #### Space Grant | Role | Details | | ------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Single-Space Guest | This role is meant for a guest who has access to only one Space. Such guests can be invited to additional Spaces at any time. | | Multi-Space Guest | This role is meant for a guest who has access to multiple Spaces. They will see a drop-down next to the Space name that enables them to switch between Spaces. | #### Workbook Grant | Role | Details | | --------------------- | ------------------------------------------------------------------------------------------ | | Single-Workbook Guest | This role is meant for a guest who should have access to only one Workbook within a Space. | | Multi-Workbook Guest | This role is intended for a guest who has access to multiple Workbooks within a Space. | This role can only be configured using code. See the code example below. ```js const createGuest = await api.guests.create({ environmentId: "us_env_hVXkXs0b", email: "guest@example.com", name: "Mr. Guest", spaces: [ { id: "us_sp_DrdXetPN", workbooks: [ { id: "us_wb_qGZbKwDW", }, ], }, ], }); ``` #### Guest Lifecycle When a guest user is deleted, all their space connections are automatically removed to ensure security. 
This means: * The guest loses access to all previously connected spaces * They cannot regain access to these spaces without being explicitly re-invited This automatic cleanup ensures that deleted guests cannot retain any access to spaces, even if they are later recreated with the same email address. ## API Reference For detailed API documentation on authentication endpoints, see the [Authentication API Reference](https://reference.flatfile.com/api-reference/auth). For programmatic management of Personal Access Tokens, see the [Personal Access Tokens API Reference](https://reference.flatfile.com/api-reference/auth/personal-access-tokens). # Custom Extractors Source: https://flatfile.com/docs/guides/custom-extractors Build custom file processing plugins to handle unique data formats and transform files into structured data ## What Are Custom Extractors? Custom extractors are specialized plugins that enable you to handle file formats that aren't natively supported by Flatfile's existing [plugins](/plugins). They process uploaded files, extract structured data, and provide that data for mapping into [Sheets](/core-concepts/sheets) as [Records](/core-concepts/records). This guide covers everything you need to know to build custom extractors. 
Common use cases include: * Legacy system data exports (custom delimited files, fixed-width formats) * Industry-specific formats (healthcare, finance, manufacturing) * Multi-format processors (handling various formats in one extractor) * Binary file handlers (images with metadata, proprietary formats) ## Architecture Overview ### Core Components Custom extractors are built using the `@flatfile/util-extractor` utility, which provides a standardized framework for file processing: ```javascript import { Extractor } from "@flatfile/util-extractor"; export const MyCustomExtractor = (options = {}) => { return Extractor(".myformat", "custom", myCustomParser, options); }; ``` Once you've created your extractor, you must register it in a [listener](/core-concepts/listeners) to be used. This will ensure that the extractor responds to the `file:created` [event](/reference/events#file%3Acreated) and processes your files. ```javascript // . . . other imports import { MyCustomExtractor } from "./my-custom-extractor"; export default function (listener) { // . . . 
other listener setup listener.use(MyCustomExtractor()); } ``` ### Handling Multiple File Extensions To support multiple file extensions, use a RegExp pattern: ```javascript // Support both .pipe and .custom extensions export const MultiExtensionExtractor = (options = {}) => { return Extractor(/\.(pipe|custom)$/i, "pipe", parseCustomFormat, options); }; // Support JSON variants export const JSONExtractor = (options = {}) => { return Extractor(/\.(json|jsonl|jsonlines)$/i, "json", parseJSONFormat, options); }; ``` ### Key Architecture Elements | Component | Purpose | Required | | ------------------- | -------------------------------------------------------------- | -------- | | **File Extension** | String or RegExp of supported file extension(s) | ✓ | | **Extractor Type** | String identifier for the extractor type | ✓ | | **Parser Function** | Core logic that converts file buffer to structured data | ✓ | | **Options** | Configuration for chunking, parallelization, and customization | - | ### Data Flow 1. **File Upload** → Flatfile receives file with matching extension 2. **Event Trigger** → `file:created` [event](/reference/events#file%3Acreated) fires 3. **Parser Execution** → Your parser function processes the file buffer 4. **Data Structuring** → Raw data is converted to WorkbookCapture format and provided to Flatfile for mapping into [Sheets](/core-concepts/sheets) as [Records](/core-concepts/records) 5. **Job Completion** → Processing status is reported to user ## Getting Started Remember that custom extractors are powerful tools for handling unique data formats. Start with simple implementations and gradually add complexity as needed. ### Prerequisites Install the required packages. You may also want to review our [Coding Tutorial](/coding-tutorial/overview) if you haven't created a [Listener](/core-concepts/listeners) yet. 
```bash npm install @flatfile/util-extractor @flatfile/listener @flatfile/api ``` ### Basic Implementation Let's create a simple custom extractor for a pipe-delimited format. This will be used to process files with the `.pipe` or `.psv` extension that look like this: ```psv name|email|phone John Doe|john@example.com|123-456-7890 Jane Smith|jane@example.com|098-765-4321 ``` ```javascript import { Extractor } from "@flatfile/util-extractor"; // Parser function - converts Buffer to WorkbookCapture function parseCustomFormat(buffer) { const content = buffer.toString('utf-8'); const lines = content.split('\n').filter(line => line.trim()); if (lines.length === 0) { throw new Error('Empty file'); } // First line contains headers const headers = lines[0].split('|').map(h => h.trim()); // Remaining lines contain data const data = lines.slice(1).map(line => { const values = line.split('|').map(v => v.trim()); const record = {}; headers.forEach((header, index) => { record[header] = { value: values[index] || '' }; }); return record; }); return { Sheet1: { headers, data } }; } // Create the extractor export const CustomPipeExtractor = (options = {}) => { return Extractor(/\.(pipe|psv)$/i, "pipe", parseCustomFormat, options); }; ``` And now let's import and register it in your [Listener](/core-concepts/listeners). ```javascript // . . . other imports import { CustomPipeExtractor } from "./custom-pipe-extractor"; export default function (listener) { // . . . other listener setup listener.use(CustomPipeExtractor()); } ``` That's it! Your extractor is now registered and will be used to process pipe-delimited files with the `.pipe` or `.psv` extension. ## Advanced Examples ### Multi-Sheet Parser Let's construct an Extractor to handle files that contain multiple data sections. 
This will be used to process files with the `.multi` or `.sections` extension that look like this: ```text ---SECTION--- SHEET:Sheet1 name,email,phone John Doe,john@example.com,123-456-7890 Jane Smith,jane@example.com,098-765-4321 ---SECTION--- SHEET:Sheet2 name,email,phone Jane Doe,jane@example.com,123-456-7891 John Smith,john@example.com,098-765-4322 ---SECTION--- ``` ```javascript import { Extractor } from "@flatfile/util-extractor"; function parseMultiSheetFormat(buffer) { const content = buffer.toString('utf-8'); const sections = content.split('---SECTION---'); const workbook = {}; sections.forEach((section, index) => { if (!section.trim()) return; const lines = section.trim().split('\n'); const sheetName = lines[0].replace('SHEET:', '').trim() || `Sheet${index + 1}`; const headers = lines[1].split(',').map(h => h.trim()); const data = lines.slice(2).map(line => { const values = line.split(',').map(v => v.trim()); const record = {}; headers.forEach((header, idx) => { record[header] = { value: values[idx] || '' }; }); return record; }); workbook[sheetName] = { headers, data }; }); return workbook; } export const MultiSheetExtractor = (options = {}) => { return Extractor(/\.(multi|sections)$/i, "multi-sheet", parseMultiSheetFormat, options); }; ``` Now let's register it in your [Listener](/core-concepts/listeners). ```javascript // . . . other imports import { MultiSheetExtractor } from "./multi-sheet-extractor"; export default function (listener) { // . . . other listener setup listener.use(MultiSheetExtractor()); } ``` ### Binary Format Handler This example processes binary files containing structured data, handling the `.bin` or `.dat` extension. Due to the nature of the binary format, we can't easily present a sample import here. {/* TODO: Add a sample import here. I can't figure out how to get Mintlify to respect asset download links. 
*/} ```javascript function parseBinaryFormat(buffer) { // Example: Custom binary format with header + records let offset = 0; // Read header (first 16 bytes) const magic = buffer.readUInt32LE(offset); offset += 4; const version = buffer.readUInt16LE(offset); offset += 2; const recordCount = buffer.readUInt32LE(offset); offset += 4; const fieldCount = buffer.readUInt16LE(offset); offset += 2; if (magic !== 0xDEADBEEF) { throw new Error('Invalid file format'); } // Read field definitions const headers = []; for (let i = 0; i < fieldCount; i++) { const nameLength = buffer.readUInt16LE(offset); offset += 2; const name = buffer.toString('utf-8', offset, offset + nameLength); offset += nameLength; const type = buffer.readUInt8(offset); offset += 1; headers.push(name); } // Read records const data = []; for (let i = 0; i < recordCount; i++) { const record = {}; headers.forEach(header => { const valueLength = buffer.readUInt16LE(offset); offset += 2; const value = buffer.toString('utf-8', offset, offset + valueLength); offset += valueLength; record[header] = { value }; }); data.push(record); } return { Sheet1: { headers, data } }; } export const BinaryExtractor = (options = {}) => { return Extractor(/\.(bin|dat)$/i, "binary", parseBinaryFormat, options); }; ``` And, once again, let's register it in your [Listener](/core-concepts/listeners). ```javascript // . . . other imports import { BinaryExtractor } from "./binary-extractor"; export default function (listener) { // . . . other listener setup listener.use(BinaryExtractor()); } ``` ### Configuration-Driven Extractor Create a flexible extractor that can be configured for different formats. This will be used to process files in a manner that handles different delimiters, line endings, and other formatting options. 
```javascript function createConfigurableParser(config) { return function parseConfigurableFormat(buffer) { const content = buffer.toString(config.encoding || 'utf-8'); let lines = content.split(config.lineDelimiter || '\n'); // Skip header lines if specified if (config.skipLines) { lines = lines.slice(config.skipLines); } // Filter empty lines if (config.skipEmptyLines) { lines = lines.filter(line => line.trim()); } if (lines.length === 0) { throw new Error('No data found'); } // Extract headers let headers; let dataStartIndex = 0; if (config.explicitHeaders) { headers = config.explicitHeaders; } else { headers = lines[0].split(config.fieldDelimiter || ',').map(h => h.trim()); dataStartIndex = 1; } // Process data const data = lines.slice(dataStartIndex).map(line => { const values = line.split(config.fieldDelimiter || ','); const record = {}; headers.forEach((header, index) => { let value = values[index] || ''; // Apply transformations if (config.transforms && config.transforms[header]) { value = config.transforms[header](value); } // Type conversion if (config.typeConversion) { if (!isNaN(value) && value !== '') { value = Number(value); } else if (value.toLowerCase() === 'true' || value.toLowerCase() === 'false') { value = value.toLowerCase() === 'true'; } } record[header] = { value }; }); return record; }); return { [config.sheetName || 'Sheet1']: { headers, data } }; }; } export const ConfigurableExtractor = (userConfig = {}) => { const defaultConfig = { encoding: 'utf-8', lineDelimiter: '\n', fieldDelimiter: ',', skipLines: 0, skipEmptyLines: true, typeConversion: false, sheetName: 'Sheet1' }; const config = { ...defaultConfig, ...userConfig }; return Extractor( config.fileExtension || ".txt", "configurable", createConfigurableParser(config), { chunkSize: config.chunkSize || 10000, parallel: config.parallel || 1 } ); }; ``` Now let's register two different configurable extractors in our [Listener](/core-concepts/listeners). 
The first will be used to process files with the `.custom` extension that look like this, while transforming dates and amount values: ```text Extraneous text More extraneous text name & date & amount John Doe & 1/1/2021 & 100.00 Jane Smith & 1/2/2021 & 200.00 ``` The second will be used to process files with the `.pipe` or `.special` extension that look like this: ```text Extraneous text More extraneous text name|date|amount John Doe|2021-01-01|100.00 Jane Smith|2021-01-02|200.00 ``` ```javascript // . . . other imports import { ConfigurableExtractor } from "./configurable-extractor"; export default function (listener) { // . . . other listener setup // Custom extractor with configuration for .custom files listener.use(ConfigurableExtractor({ fileExtension: ".custom", fieldDelimiter: " & ", skipLines: 2, typeConversion: true, transforms: { 'date': (value) => new Date(value).toISOString(), 'amount': (value) => parseFloat(value).toFixed(2) } })); // Custom extractor with configuration for .pipe and .special files listener.use(ConfigurableExtractor({ fileExtension: /\.(pipe|special)$/i, fieldDelimiter: "|", skipLines: 2, typeConversion: true })); } ``` ## Reference ### API ```typescript function Extractor( fileExt: string | RegExp, extractorType: string, parseBuffer: ( buffer: Buffer, options: any ) => WorkbookCapture | Promise<WorkbookCapture>, options?: Record<string, any> ): (listener: FlatfileListener) => void ``` | Parameter | Type | Description | | --------------- | ----------------------- | -------------------------------------------------------------------------- | | `fileExt` | `string` or `RegExp` | File extension to process (e.g., `".custom"` or `/\.(custom\|special)$/i`) | | `extractorType` | `string` | Identifier for the extractor type (e.g., "custom", "binary") | | `parseBuffer` | `ParserFunction` | Function that converts Buffer to WorkbookCapture | | `options` | `Record<string, any>` | Optional configuration object | #### Options | Option | Type | Default | Description | | ----------- | --------- | ------- | -------------------------------------- | | `chunkSize` | `number` | `5000` | Records to process per batch | | `parallel` | `number` | `1` | Number of concurrent processing chunks | | `debug` | `boolean` | `false` | Enable debug logging | #### Parser Function Options Your `parseBuffer` function receives additional options beyond what you pass to `Extractor`: | Option | Type | Description | | ------------------------ | --------- | ------------------------------------------------- | | `fileId` | `string` | The ID of the file being processed | | `fileExt` | `string` | The file extension (e.g., ".csv") | | `headerSelectionEnabled` | `boolean` | Whether header selection is enabled for the space | ### Data Structures #### WorkbookCapture Structure The parser function must return a `WorkbookCapture` object: ```javascript const workbookCapture = { "SheetName1": { headers: ["field1", "field2", "field3"], data: [ { field1: { value: "value1" }, field2: { value: "value2" }, field3: { value: "value3" } }, // ... 
more records ] }, "SheetName2": { headers: ["col1", "col2"], data: [ { col1: { value: "data1" }, col2: { value: "data2" } } ] } }; ``` #### Cell Value Objects Each cell value should use the `Flatfile.RecordData` format: ```javascript const recordData = { field1: { value: "john@example.com" }, field2: { value: "John Doe" }, field3: { value: "invalid-email", messages: [ { type: "error", message: "Invalid email format" } ] } }; ``` #### Message Types | Type | Description | UI Effect | | --------- | --------------------- | -------------------------------------------------------------------------------------------- | | `error` | Validation error | Red highlighting, blocks [Actions](/core-concepts/actions) with the `hasAllValid` constraint | | `warning` | Warning message | Yellow highlighting, allows submission | | `info` | Informational message | Mouseover tooltip, allows submission | ### TypeScript Interfaces ```typescript type ParserFunction = ( buffer: Buffer, options: any ) => WorkbookCapture | Promise<WorkbookCapture>; type WorkbookCapture = Record<string, SheetCapture>; type SheetCapture = { headers: string[]; descriptions?: Record<string, string> | null; data: Flatfile.RecordData[]; metadata?: { rowHeaders: number[] }; }; ``` ## Troubleshooting Common Issues ### Files Not Processing **Symptoms**: Files upload but no extraction occurs **Solutions**: * Verify file extension matches `fileExt` configuration * Check [Listener](/core-concepts/listeners) is properly deployed and running * Enable debug logging to see processing details ```javascript const extractor = CustomExtractor({ debug: true }); // Make sure file extensions match in the Extractor call ``` ### Parser Errors **Symptoms**: Jobs fail with parsing errors **Solutions**: * Add try-catch blocks in parser function * Validate input data before processing * Return helpful error messages ```javascript function parseCustomFormat(buffer) { try { const content = buffer.toString('utf-8'); if (!content || content.trim() === '') { throw new Error('File is empty'); } // 
... parsing logic } catch (error) { throw new Error(`Parse error: ${error.message}`); } } ``` ### Memory Issues **Symptoms**: Large files cause timeouts or memory errors **Solutions**: * Reduce chunk size for large files * Implement streaming for very large files * Use parallel processing carefully ```javascript const extractor = CustomExtractor({ chunkSize: 1000, // Smaller chunks parallel: 1 // Reduce parallelization }); ``` ### Performance Problems **Symptoms**: Slow processing, timeouts **Solutions**: * Optimize parser algorithm * Use appropriate chunk sizes * Consider parallel processing for I/O-bound operations ```javascript // Optimize for large files const extractor = CustomExtractor({ chunkSize: 5000, parallel: 3 }); ``` # Data Egress Source: https://flatfile.com/docs/guides/egress Get data out of Flatfile using various transmission methods and file formats After your customers have imported and validated their data in Flatfile, you need to get that data to its final destination. Data egress refers to the process of extracting and transmitting data from Flatfile to external systems, files, or destinations. ## Egress Methods Since Flatfile listeners run in TypeScript/JavaScript environments, you can egress data using any mechanism available in the language ecosystem. This includes direct API calls, file generation, database connections, message queues, or any custom integration pattern your application requires. ### API Transmission Send data directly to external systems via HTTP requests to REST APIs, GraphQL endpoints, streaming services, or database APIs. Use any HTTP client library or built-in fetch to POST data to your existing endpoints. ### File Generation & Download Build files in any format - CSV, XLSX, XML, JSON, PDF, or custom formats - using the appropriate TypeScript libraries. Generate formatted reports, structured documents, or data exports that users can download directly. 
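To make the API Transmission approach concrete, here is a sketch of a helper that gathers a Workbook's records and POSTs them to an external endpoint. The secret name (`WEBHOOK_SITE_URL`) and payload shape are illustrative; the API client and fetch implementation are passed in as parameters so the logic stays testable rather than hard-wired to a deployed listener.

```javascript
// Sketch: collect every record in a Workbook and POST it to an external
// endpoint. In a real listener you would wire this into a job handler, e.g.
//   listener.on("job:ready", { job: "workbook:submitActionFg" },
//               (event) => egressWorkbook(api, event));
// The secret name and payload shape below are illustrative assumptions.
async function egressWorkbook(api, event, fetchImpl = fetch) {
  const { jobId, workbookId } = event.context;
  try {
    await api.jobs.ack(jobId, { info: "Sending data...", progress: 10 });

    // Gather records from every sheet in the workbook
    const { data: sheets } = await api.sheets.list({ workbookId });
    const payload = [];
    for (const sheet of sheets) {
      const { data } = await api.records.get(sheet.id);
      payload.push({ sheet: sheet.name, records: data.records });
    }

    // Destination URL stored as a Flatfile secret
    const url = await event.secrets("WEBHOOK_SITE_URL");
    const response = await fetchImpl(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ payload }),
    });
    if (!response.ok) throw new Error(`Webhook returned ${response.status}`);

    await api.jobs.complete(jobId, { outcome: { message: "Data submitted." } });
  } catch (error) {
    await api.jobs.fail(jobId, { outcome: { message: String(error) } });
  }
}
```

Injecting `fetchImpl` also makes it easy to swap the HTTP client or add retries without touching the job-lifecycle logic.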
### File Transfer Transmit generated files to external destinations like cloud storage, FTP servers, email attachments, or content delivery networks using the corresponding SDK or protocol library. ## Example: API Egress with Workbook Actions The following example demonstrates egressing data via HTTP POST when users complete a workbook. This pattern listens for a workbook action and sends the data to an external webhook endpoint. ### Implementation The example listener responds to `workbook:submitActionFg` events, retrieves all sheets and records from the workbook, and posts the data to a configured webhook URL. It uses the `event.secrets()` method to securely access the destination URL. ### Testing with webhook.site To test this pattern: 1. Navigate to [webhook.site](https://webhook.site) and copy your unique URL 2. Add the URL as a Flatfile secret named `WEBHOOK_SITE_URL` 3. Configure a workbook action in your blueprint to trigger the egress 4. The listener will POST the workbook data to your webhook.site URL where you can inspect the payload ### Example Projects * [TypeScript implementation](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/typescript/egress/index.ts) * [JavaScript implementation](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/javascript/egress/index.js) ## Example: File Export with Export Workbook Plugin For file generation and download, Flatfile provides the Export Workbook Plugin that generates Excel files from workbook data. This plugin handles the entire process of extracting data, formatting it into Excel sheets, and making it available for download. 
### Installation ```bash npm install @flatfile/plugin-export-workbook ``` ### Implementation ```javascript import { FlatfileListener } from "@flatfile/listener"; import { exportWorkbookPlugin } from "@flatfile/plugin-export-workbook"; export default function (listener) { listener.use( exportWorkbookPlugin({ // Only export validated records recordFilter: 'valid', // Automatically trigger download autoDownload: true, // Custom filename filename: 'exported-data', // Transform column names for readability columnNameTransformer: (columnName) => { return columnName.replace(/([A-Z])/g, ' $1').replace(/^./, str => str.toUpperCase()); } }) ); } ``` ### Workbook Configuration Add a download action to your workbook to trigger the export: ```javascript { name: "Customer Data", actions: [ { operation: "downloadWorkbook", mode: "foreground", label: "Download Excel File", description: "Export all data as Excel spreadsheet", primary: true } ], sheets: [ // your sheet definitions ] } ``` When users click the download action, the plugin generates an Excel file containing all workbook data and either triggers an automatic download or directs users to the Files page to download manually. For complete configuration options and advanced usage, see the [Export Workbook Plugin documentation](/plugins/export-workbook). ## Example: Custom XML File Generation For custom file formats, you can build files programmatically using any TypeScript library. This example demonstrates creating XML files using the `xml2js` library to generate structured data exports. 
### Installation ```bash npm install xml2js npm install @types/xml2js --save-dev ``` ### Implementation ```javascript import api from "@flatfile/api"; import { FlatfileListener } from "@flatfile/listener"; import { Builder } from 'xml2js'; import fs from 'fs'; export default function (listener) { listener.on( "job:ready", { job: "workbook:downloadXML" }, async ({ context: { jobId, workbookId } }) => { try { await api.jobs.ack(jobId, { info: "Starting XML generation...", progress: 10, }); // Get workbook and find the target sheet const workbook = await api.workbooks.get(workbookId); const sheet = workbook?.data.sheets?.find(sheet => sheet.slug === "customers"); const records = await api.records.get(sheet?.id || ""); await api.jobs.update(jobId, { info: "Processing records...", progress: 30, }); // Initialize XML builder const builder = new Builder({ xmldec: { version: '1.0', encoding: 'UTF-8' } }); // Transform records to structured data const customers = records.data.records.map(record => { const values = record.values; return { id: values.id?.value || '', name: values.name?.value || '', email: values.email?.value || '', phone: values.phone?.value || '', address: { street: values.street?.value || '', city: values.city?.value || '', state: values.state?.value || '', zip: values.zip?.value || '' } }; }); // Create XML structure const xmlObj = { export: { $: { generated: new Date().toISOString(), recordCount: customers.length }, customers: { customer: customers } } }; // Generate XML string const xml = builder.buildObject(xmlObj); await api.jobs.update(jobId, { info: "Creating XML file...", progress: 70, }); // Write XML to temporary file const fileName = `customer_export_${new Date().toISOString().split('T')[0]}.xml`; fs.writeFileSync(fileName, xml); // Upload file to Flatfile const file = fs.createReadStream(fileName); const fileUpload = await api.files.upload(file, { spaceId: workbook.data.spaceId, environmentId: workbook.data.environmentId, }); // Complete job with
download link await api.jobs.complete(jobId, { outcome: { message: "XML file generated successfully", next: { type: "files", label: "Download XML", files: [{ fileId: fileUpload?.data?.id }], }, }, }); // Clean up temporary file fs.unlinkSync(fileName); } catch (error) { await api.jobs.fail(jobId, { outcome: { message: `Failed to generate XML: ${error.message}`, }, }); } } ); } ``` ### Workbook Configuration Add an XML download action to trigger the export: ```javascript { name: "Customer Data", actions: [ { operation: "downloadXML", mode: "foreground", label: "Download XML", description: "Export data as XML file", primary: false } ], sheets: [ { name: "Customers", slug: "customers", fields: [ { key: "id", type: "string", label: "Customer ID" }, { key: "name", type: "string", label: "Name" }, { key: "email", type: "string", label: "Email" }, { key: "phone", type: "string", label: "Phone" }, { key: "street", type: "string", label: "Street" }, { key: "city", type: "string", label: "City" }, { key: "state", type: "string", label: "State" }, { key: "zip", type: "string", label: "ZIP Code" } ] } ] } ``` This pattern can be adapted for any file format by substituting the appropriate library and transformation logic. The key steps remain the same: fetch data, transform it, generate the file, upload it to Flatfile, and provide a download link to users. # Multi-Part Jobs Source: https://flatfile.com/docs/guides/multi-part-jobs Split up Jobs into Parts A Job can be split up into multiple Parts depending on the intended functionality. This can help spread out large data sets or long tasks across many separate listeners. When a Job is split into Parts, the original Job is considered the "parent" and its children are considered "Parts". Each Part is its own separate Job and has the full ecosystem of events attached to it.
When all Job Parts have been completed, a new event `job:parts-completed` is emitted, so you can listen for that event to complete the Parent Job. This event's payload includes summary statistics about the Parts. ```json { "payload": { "parts": { "total": 10, "completed": 10, "failed": 0, "canceled": 0 } } } ``` By inspecting this payload, you can determine whether all Parts completed successfully. Based on that information, you are expected to complete or fail the parent Job. Here's an example of a listener that creates parts and waits for the `job:parts-completed` event: ```typescript import api from "@flatfile/api"; import { FlatfileListener } from "@flatfile/listener"; export default function (listener: FlatfileListener) { // Parent and Part Jobs have the same operation name. // Filtering by isPart: false, ensures that the Job is the Parent // The job operation name is an example, you can define your own. listener.filter( { job: "sheet:submitLargeSheet", isPart: false }, (submitLargeSheet) => { submitLargeSheet.on("job:ready", async (event) => { const { context: { jobId, sheetId }, } = event; console.log("job:ready [PARENT]", { jobId: event.context.jobId }); const { data: counts } = await api.sheets.getRecordCounts(sheetId); const { total } = counts.counts; await api.jobs.ack(jobId, { info: `Splitting Job`, progress: 10, }); console.log("splitting job: ", { jobId, total }); const batchSize = 1000; const totalParts = Math.ceil(total / batchSize); const splitjob = await api.jobs.split(jobId, { parts: totalParts }); console.log("splitjob: ", { splitjob }); await api.jobs.update(jobId, { info: `Job Split into ${totalParts} parts.`, progress: 20, }); }); // Listen for all parts to finish and then complete the parent Job submitLargeSheet.on("job:parts-completed", async (event) => { const { context: { jobId }, } = event; console.log("job:parts-completed: ", jobId); await api.jobs.complete(jobId, { outcome: { message: "This job is now complete.", }, }); }); } );
} ``` Here's an example of a listener that does logic on each Part: ```typescript import api from "@flatfile/api"; import { FlatfileListener } from "@flatfile/listener"; export default function (listener: FlatfileListener) { // Parent and Part Jobs have the same operation name. // Filtering by isPart: true, ensures that the Job is a Part // The job operation name is an example, you can define your own. listener.filter( { job: "sheet:submitLargeSheet", isPart: true }, (submitLargeSheet) => { submitLargeSheet.on("job:ready", async (event) => { const { context: { jobId }, } = event; await api.jobs.ack(jobId); const job = await api.jobs.get(jobId); console.dir({ job }, { depth: 10 }); const { partData, parentId } = job.data; const { records } = await event.data({ pageSize: 1, pageNumber: partData.part + 1, }); console.log({ record: records[0].values }); console.log("submitting part: ", jobId, partData); await new Promise((r) => setTimeout(r, 1000)); await api.jobs.complete(jobId, { outcome: { message: "This job is now complete.", }, }); }); } ); } ``` # Namespaces and Filters Source: https://flatfile.com/docs/guides/namespaces-and-filters Isolate and organize your Flatfile Listeners using namespace filtering and advanced Event filtering patterns Flatfile provides two complementary approaches for organizing and scoping your Event handling: **namespaces** and **Event filters**. Together, they enable sophisticated Event routing, better code organization, and precise control over which [Listeners](/core-concepts/listeners) respond to specific Events. *Organize and isolate different parts of your application at the App, Space, and Workbook level.* *Target specific Sheets or event properties to create precise Event handling rules for your Listeners.* Both approaches help organize and scope Event handling, reduce noise by ensuring Listeners only respond to relevant Events, and can be combined for the most flexible and maintainable Event organization. 
## Namespaces ### Understanding Namespaces Namespaces are simple string identifiers that you can assign to [Apps](/core-concepts/apps), [Spaces](/core-concepts/spaces), and [Workbooks](/core-concepts/workbooks) to organize and isolate different parts of your Flatfile application. When you assign a namespace to a resource, Events from that resource can be filtered using Listener namespace patterns. #### How Namespace Filtering Works Listeners use prefix patterns to filter Events based on the namespace of the resource that generated the Event: * `space:namespace` - Listen to Events from [Spaces](/core-concepts/spaces) with the given namespace or belonging to [Apps](/core-concepts/apps) with that namespace * `workbook:namespace` - Listen to Events from [Workbooks](/core-concepts/workbooks) with the given namespace ### App Namespaces The most common namespace pattern is creating separate [Apps](/core-concepts/apps) with distinct namespaces, then using namespaced Listeners to handle different configurations for each app. #### Setting App Namespaces Namespaces are set when creating an App via the [Flatfile Dashboard](https://platform.flatfile.com/dashboard): Let's walk through an example of routing events to different Listeners based on the App namespace. Imagine creating three separate Apps with distinct namespaces: **App Name:** Customer Portal\ **Namespace:** `customer-portal`\ **Purpose:** External customer data import **App Name:** Internal Tools\ **Namespace:** `internal-tools`\ **Purpose:** Admin and operations **App Name:** Partner Integration\ **Namespace:** `partner-integration`\ **Purpose:** B2B data exchange #### Listening to App Namespace Events Now you can use `listener.namespace()` to create separate Listeners for Spaces within each App. This will provide a new, filtered Listener object scoped to your callback function. 
This pattern allows each App to have completely different: * Space configurations and blueprints * Data validation rules * Processing workflows * User experiences * Integration behaviors ```javascript JavaScript export default function (listener) { // Customer Portal - External customer data import listener.namespace('space:customer-portal', (customerListener) => { customerListener.use(configureCustomerPortalSpace); customerListener.use(configureGuidedSetup); customerListener.use(validateCustomerData); }); // Internal Tools - Admin and operations listener.namespace('space:internal-tools', (internalListener) => { internalListener.use(configureInternalToolsSpace); internalListener.use(validateAdvancedRules); internalListener.use(applyBusinessLogic); }); // Partner Integration - B2B data exchange listener.namespace('space:partner-integration', (partnerListener) => { partnerListener.use(configurePartnerIntegrationSpace); partnerListener.use(processAutomatically); partnerListener.use(syncPartnerData); }); } ``` ```typescript TypeScript import { FlatfileListener } from '@flatfile/listener'; export default function (listener: FlatfileListener) { // Customer Portal - External customer data import listener.namespace('space:customer-portal', (customerListener: FlatfileListener) => { customerListener.use(configureCustomerPortalSpace); customerListener.use(configureGuidedSetup); customerListener.use(validateCustomerData); }); // Internal Tools - Admin and operations listener.namespace('space:internal-tools', (internalListener: FlatfileListener) => { internalListener.use(configureInternalToolsSpace); internalListener.use(validateAdvancedRules); internalListener.use(applyBusinessLogic); }); // Partner Integration - B2B data exchange listener.namespace('space:partner-integration', (partnerListener: FlatfileListener) => { partnerListener.use(configurePartnerIntegrationSpace); partnerListener.use(processAutomatically); partnerListener.use(syncPartnerData); }); } ``` For more 
information on Events and Listeners (including `listener.use()`, as in this example), see [Events and Listeners](/core-concepts/listeners). ### Space Namespaces Beyond App-level namespacing, you can also apply namespaces directly to individual Spaces, enabling unique configurations and behaviors on a space-by-space basis. **Important**: [Spaces](/core-concepts/spaces) inherit their [App's](/core-concepts/apps) namespace by default. When you create [listeners](/core-concepts/listeners) that filter on `space:app-namespace`, they receive Events from spaces within that App. If you override a Space's namespace to be different from its App, Listeners filtering on the App's namespace will no longer receive Events from that Space. For example: 1. App has namespace `"my-app"` 2. You create a Space with explicit namespace `"my-space"` 3. Listeners filtering on `space:my-space` *will* receive Events from that Space 4. Listeners filtering on `space:my-app` *will not* receive Events from that Space This applies regardless of whether the Space belongs to that App. Any other Spaces created without explicit namespaces will continue to inherit the App namespace normally. #### Setting Space Namespaces Assign a namespace when creating a Space via the API: ```javascript JavaScript const templateSpace = await api.spaces.create({ name: "Template Manager", namespace: "templates", // Simple string identifier // ...space configuration }); ``` ```typescript TypeScript import { Flatfile } from '@flatfile/api'; const templateSpace = await api.spaces.create({ name: "Template Manager", namespace: "templates", // Simple string identifier // ...space configuration }); ``` You can also set the namespace during the `space:configure` [Job](/core-concepts/jobs) by updating the Space. This may be useful if you're creating Spaces from the [Flatfile Dashboard](https://platform.flatfile.com/dashboard), which doesn't support setting a namespace during creation. 
This example shows how to set the namespace based on the Space name, allowing you to have multiple configurations in the same App: In this example, we'll show the full [Job](/core-concepts/jobs) Listener lifecycle implementation, complete with `ack` to acknowledge the job, `update` to report progress, and `complete` or `fail` to finish the job. However, for most implementations, we recommend using the [Space Configure](/plugins/space-configure) plugin. This plugin takes care of even more of the heavy lifting for you; not only does it handle the Job lifecycle, but it also takes care of all of the API calls necessary to configure the Space and create its Workbooks and documents. ```javascript JavaScript // Set namespace based on space name patterns listener.on("job:ready", { job: "space:configure" }, async (event) => { const { jobId, spaceId } = event.context; try { await api.jobs.ack(jobId, { info: "Configuring space with namespace" }); // Get Space details to determine namespace const space = await api.spaces.get(spaceId); // Route spaces to different namespaces based on naming conventions let namespace; if (space.data.name.toLowerCase().endsWith('portal')) { namespace = 'customer-portal'; } else if (space.data.name.toLowerCase().endsWith('admin')) { namespace = 'internal-tools'; } else if (space.data.name.toLowerCase().endsWith('partner')) { namespace = 'partner-integration'; } else { namespace = 'general'; // Default namespace } await api.spaces.update(spaceId, { namespace: namespace, }); // ...create workbooks, sheets, documents await api.jobs.complete(jobId, { outcome: { message: `Space configured with namespace: ${namespace}` } }); } catch (error) { console.error(error); await api.jobs.fail(jobId, { outcome: { message: `Configuration failed: ${error.message}` } }); } }); ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; import { Flatfile } from '@flatfile/api'; // Set namespace based on space name patterns 
listener.on("job:ready", { job: "space:configure" }, async (event: FlatfileEvent) => { const { jobId, spaceId } = event.context; try { await api.jobs.ack(jobId, { info: "Configuring space with namespace" }); // Get Space details to determine namespace const space = await api.spaces.get(spaceId); // Route spaces to different namespaces based on naming conventions let namespace: string; if (space.data.name.toLowerCase().endsWith('portal')) { namespace = 'customer-portal'; } else if (space.data.name.toLowerCase().endsWith('admin')) { namespace = 'internal-tools'; } else if (space.data.name.toLowerCase().endsWith('partner')) { namespace = 'partner-integration'; } else { namespace = 'general'; // Default namespace } await api.spaces.update(spaceId, { namespace: namespace, }); // ...create workbooks, sheets, documents await api.jobs.complete(jobId, { outcome: { message: `Space configured with namespace: ${namespace}` } }); } catch (error) { console.error(error); await api.jobs.fail(jobId, { outcome: { message: `Configuration failed: ${error.message}` } }); } }); ``` For complete Space configuration examples, see [Creating Spaces](/core-concepts/spaces#creating-spaces). #### Listening to Space Namespace Events Use `listener.namespace()` to filter Events from Spaces with specific namespaces. 
This will provide a new, filtered Listener object scoped to your callback function: ```javascript JavaScript // Listen to Events from customer portal Spaces listener.namespace('space:customer-portal', (customerPortalListener) => { customerPortalListener.use(applyCustomerBranding); customerPortalListener.use(configureGuidedOnboarding); }); // Listen to Events from internal tools Spaces listener.namespace('space:internal-tools', (internalToolsListener) => { internalToolsListener.use(validateAdminData); internalToolsListener.use(enableAuditLogging); }); // Listen to Events from partner integration Spaces listener.namespace('space:partner-integration', (partnerIntegrationListener) => { partnerIntegrationListener.use(configureApiWebhooks); partnerIntegrationListener.use(enableBulkProcessing); }); // Listen to Events from general Spaces (default) listener.namespace('space:general', (generalSpaceListener) => { generalSpaceListener.use(validateBasicData); }); ``` ```typescript TypeScript import { FlatfileListener } from '@flatfile/listener'; // Listen to Events from customer portal Spaces listener.namespace('space:customer-portal', (customerPortalListener: FlatfileListener) => { customerPortalListener.use(applyCustomerBranding); customerPortalListener.use(configureGuidedOnboarding); }); // Listen to Events from internal tools Spaces listener.namespace('space:internal-tools', (internalToolsListener: FlatfileListener) => { internalToolsListener.use(validateAdminData); internalToolsListener.use(enableAuditLogging); }); // Listen to Events from partner integration Spaces listener.namespace('space:partner-integration', (partnerIntegrationListener: FlatfileListener) => { partnerIntegrationListener.use(configureApiWebhooks); partnerIntegrationListener.use(enableBulkProcessing); }); // Listen to Events from general Spaces (default) listener.namespace('space:general', (generalSpaceListener: FlatfileListener) => { generalSpaceListener.use(validateBasicData); }); ``` ### Workbook 
Namespaces [Workbooks](/core-concepts/workbooks) can also have namespaces for more granular Event filtering within the same [Space](/core-concepts/spaces). This can be set directly in the Workbook's [Blueprint](/core-concepts/blueprints). This example configures a structure with two Workbooks in the same Space, with a flow for moving data from the staging Workbook to the production Workbook. #### Setting Workbook Namespaces ```javascript JavaScript const workbook = { name: 'Employee Data Processing', namespace: 'staging', // Simple string namespace sheets: [employeesSheet, departmentsSheet, payrollSheet] }; ``` ```typescript TypeScript import { Flatfile } from '@flatfile/api'; const workbook: Flatfile.CreateWorkbookConfig = { name: 'Employee Data Processing', namespace: 'staging', // Simple string namespace sheets: [employeesSheet, departmentsSheet, payrollSheet] }; ``` ```javascript JavaScript const workbook = { name: 'Employee Data Processing', namespace: 'production', // Simple string namespace sheets: [employeesSheet, departmentsSheet, payrollSheet] }; ``` ```typescript TypeScript import { Flatfile } from '@flatfile/api'; const workbook: Flatfile.CreateWorkbookConfig = { name: 'Employee Data Processing', namespace: 'production', // Simple string namespace sheets: [employeesSheet, departmentsSheet, payrollSheet] }; ``` #### Listening to Workbook Namespace Events Events from Workbooks with namespaces are filtered using the `workbook:namespace` pattern.
```javascript JavaScript listener.use(configureAllWorkbooks); listener.use(validateData); // Listen to Events from Workbooks with "staging" namespace listener.namespace('workbook:staging', (stagingWorkbookListener) => { stagingWorkbookListener.use(migrateStagingDataToProduction); }); // Listen to Events from Workbooks with "production" namespace listener.namespace('workbook:production', (productionWorkbookListener) => { productionWorkbookListener.use(applyBusinessLogic); productionWorkbookListener.use(enableAuditing); }); ``` ```typescript TypeScript import { FlatfileListener } from '@flatfile/listener'; listener.use(configureAllWorkbooks); listener.use(validateData); // Listen to Events from Workbooks with "staging" namespace listener.namespace('workbook:staging', (stagingWorkbookListener: FlatfileListener) => { stagingWorkbookListener.use(migrateStagingDataToProduction); }); // Listen to Events from Workbooks with "production" namespace listener.namespace('workbook:production', (productionWorkbookListener: FlatfileListener) => { productionWorkbookListener.use(applyBusinessLogic); productionWorkbookListener.use(enableAuditing); }); ``` ### Listening to Multiple Namespaces The `listener.namespace()` function can accept an array of namespace patterns as its first argument, allowing you to listen to Events from multiple namespaces with a single Listener configuration. This is useful when you want to apply the same processing logic across different namespaces. You can also mix different namespace types in the same array. 
The Listener in this example will receive Events from both the `space:admin-tools` and `workbook:critical-data` namespaces: ```javascript JavaScript // Listen to events from specific Spaces and Workbooks listener.namespace(['space:admin-tools', 'workbook:critical-data'], (adminCriticalListener) => { adminCriticalListener.use(enableHighSecurityMode); adminCriticalListener.use(requireApproval); adminCriticalListener.use(flagForReview); }); ``` ```typescript TypeScript import { FlatfileListener } from '@flatfile/listener'; // Listen to events from specific Spaces and Workbooks listener.namespace(['space:admin-tools', 'workbook:critical-data'], (adminCriticalListener: FlatfileListener) => { adminCriticalListener.use(enableHighSecurityMode); adminCriticalListener.use(requireApproval); adminCriticalListener.use(flagForReview); }); ``` ### Nested Namespace Example You can also nest namespacing: let's say you have two Apps (`customer-portal` and `vendor-portal`), and each App processes data through different Workbook types (`invoices` and `orders`). You want to ensure that each App's Workbooks are completely isolated from the other App, but within each App you want specific handling for each Workbook type. This example demonstrates how App, Space, and Workbook namespaces work together with nested Listeners: #### Setting Up Multiple Apps First, create two Apps via the [Flatfile Dashboard](https://platform.flatfile.com/dashboard), each with distinct namespaces: **App Name:** Customer Portal\ **Namespace:** `customer-portal`\ **Purpose:** External customer data import and processing **App Name:** Vendor Portal\ **Namespace:** `vendor-portal`\ **Purpose:** B2B vendor data exchange and management #### Nested Listener Configuration To help reduce scrolling for this example, we've split it into two tabs: **Blueprints** and **Listener Configuration**. 
Each App requires different field structures for their invoice and order processing, so we define App-specific Blueprint configurations. See the next tab for the Listener Configuration. ```javascript JavaScript // Customer Portal Blueprint configurations const customerInvoiceWorkbook = { name: "Customer Invoice Processing", namespace: "invoices", sheets: [{ name: "Customer Invoices", slug: "invoices", fields: [ { key: "invoice_number", type: "string", label: "Invoice Number" }, { key: "customer_name", type: "string", label: "Customer Name" }, { key: "billing_amount", type: "number", label: "Billing Amount" } ] }] }; const customerOrderWorkbook = { name: "Customer Order Processing", namespace: "orders", sheets: [{ name: "Customer Orders", slug: "orders", fields: [ { key: "order_id", type: "string", label: "Order ID" }, { key: "customer_name", type: "string", label: "Customer Name" }, { key: "order_total", type: "number", label: "Order Total" } ] }] }; // Vendor Portal Blueprint configurations const vendorInvoiceWorkbook = { name: "Vendor Invoice Processing", namespace: "invoices", sheets: [{ name: "Vendor Invoices", slug: "invoices", fields: [ { key: "vendor_invoice_id", type: "string", label: "Vendor Invoice ID" }, { key: "vendor_company", type: "string", label: "Vendor Company" }, { key: "payment_due", type: "number", label: "Payment Due" } ] }] }; const vendorOrderWorkbook = { name: "Vendor Purchase Orders", namespace: "orders", sheets: [{ name: "Purchase Orders", slug: "orders", fields: [ { key: "po_number", type: "string", label: "PO Number" }, { key: "supplier_name", type: "string", label: "Supplier Name" }, { key: "purchase_amount", type: "number", label: "Purchase Amount" } ] }] }; ``` ```typescript TypeScript import { Flatfile } from '@flatfile/api'; // Customer Portal Blueprint configurations const customerInvoiceWorkbook: Flatfile.CreateWorkbookConfig = { name: "Customer Invoice Processing", namespace: "invoices", sheets: [{ name: "Customer Invoices", 
slug: "invoices", fields: [ { key: "invoice_number", type: "string", label: "Invoice Number" }, { key: "customer_name", type: "string", label: "Customer Name" }, { key: "billing_amount", type: "number", label: "Billing Amount" } ] }] }; const customerOrderWorkbook: Flatfile.CreateWorkbookConfig = { name: "Customer Order Processing", namespace: "orders", sheets: [{ name: "Customer Orders", slug: "orders", fields: [ { key: "order_id", type: "string", label: "Order ID" }, { key: "customer_name", type: "string", label: "Customer Name" }, { key: "order_total", type: "number", label: "Order Total" } ] }] }; // Vendor Portal Blueprint configurations const vendorInvoiceWorkbook: Flatfile.CreateWorkbookConfig = { name: "Vendor Invoice Processing", namespace: "invoices", sheets: [{ name: "Vendor Invoices", slug: "invoices", fields: [ { key: "vendor_invoice_id", type: "string", label: "Vendor Invoice ID" }, { key: "vendor_company", type: "string", label: "Vendor Company" }, { key: "payment_due", type: "number", label: "Payment Due" } ] }] }; const vendorOrderWorkbook: Flatfile.CreateWorkbookConfig = { name: "Vendor Purchase Orders", namespace: "orders", sheets: [{ name: "Purchase Orders", slug: "orders", fields: [ { key: "po_number", type: "string", label: "PO Number" }, { key: "supplier_name", type: "string", label: "Supplier Name" }, { key: "purchase_amount", type: "number", label: "Purchase Amount" } ] }] }; ``` Unlike other examples, this one uses the [Space Configure](/plugins/space-configure) plugin to configure the Space. The alternative – Space configuration using the full Job Lifecycle management approach – is more instructive, but significantly more verbose. See the [Space Configuration](/core-concepts/spaces#space-configuration) section for examples of that. 
The Space Configure plugin takes a lot of heavy lifting off your hands; not only does it handle the Job lifecycle, but it also takes care of all of the API calls necessary to configure the Space and create its Workbooks and documents. With this plugin, you can configure your entire Space with a single configuration object rather than performing any API calls yourself. ```javascript JavaScript import { configureSpace } from '@flatfile/plugin-space-configure'; export default function (listener) { // Namespaced for the Customer Portal App listener.namespace("space:customer-portal", (customerListener) => { // Configure Space using the plugin customerListener.use(configureSpace({ workbooks: [customerInvoiceWorkbook, customerOrderWorkbook] })); // Handle Events from Invoice Workbooks in the Customer Portal App customerListener.namespace("workbook:invoices", (customerInvoiceListener) => { customerInvoiceListener.use(validateCustomerInvoiceData); }); // Handle Events from Order Workbooks in the Customer Portal App customerListener.namespace("workbook:orders", (customerOrderListener) => { customerOrderListener.use(validateCustomerOrderData); }); }); // Namespaced for the Vendor Portal App listener.namespace("space:vendor-portal", (vendorListener) => { // Configure Space using the plugin vendorListener.use(configureSpace({ workbooks: [vendorInvoiceWorkbook, vendorOrderWorkbook] })); // Handle Events from Invoice Workbooks in the Vendor Portal App vendorListener.namespace("workbook:invoices", (vendorInvoiceListener) => { vendorInvoiceListener.use(validateVendorInvoices); }); // Handle Events from Order Workbooks in the Vendor Portal App vendorListener.namespace("workbook:orders", (vendorOrderListener) => { vendorOrderListener.use(validateVendorOrders); }); }); } ``` ```typescript TypeScript import { FlatfileListener } from '@flatfile/listener'; import { Flatfile } from '@flatfile/api'; import { configureSpace } from '@flatfile/plugin-space-configure'; export default function (listener:
FlatfileListener) { // Namespaced for the Customer Portal App listener.namespace("space:customer-portal", (customerListener: FlatfileListener) => { // Configure Space using the plugin customerListener.use(configureSpace({ workbooks: [customerInvoiceWorkbook, customerOrderWorkbook] })); // Handle Events from Invoice Workbooks in the Customer Portal App customerListener.namespace("workbook:invoices", (customerInvoiceListener: FlatfileListener) => { customerInvoiceListener.use(validateCustomerInvoiceData); }); // Handle Events from Order Workbooks in the Customer Portal App customerListener.namespace("workbook:orders", (customerOrderListener: FlatfileListener) => { customerOrderListener.use(validateCustomerOrderData); }); }); // Namespaced for the Vendor Portal App listener.namespace("space:vendor-portal", (vendorListener: FlatfileListener) => { // Configure Space using the plugin vendorListener.use(configureSpace({ workbooks: [vendorInvoiceWorkbook, vendorOrderWorkbook] })); // Handle Events from Invoice Workbooks in the Vendor Portal App vendorListener.namespace("workbook:invoices", (vendorInvoiceListener: FlatfileListener) => { vendorInvoiceListener.use(validateVendorInvoices); }); // Handle Events from Order Workbooks in the Vendor Portal App vendorListener.namespace("workbook:orders", (vendorOrderListener: FlatfileListener) => { vendorOrderListener.use(validateVendorOrders); }); }); } ``` ## Event Filters ### Understanding Event Filtering The `listener.filter()` method creates filtered Listener instances that only respond to Events matching specific criteria, returning a new `FlatfileListener` instance with the applied filter conditions. ### Basic Filtering The most fundamental use of `listener.filter()` is to respond to specific Events based on simple criteria. This approach is useful when you want different processing logic for different types of Jobs or Events within the same namespace, without creating separate Event handlers for each case. 
```javascript JavaScript // With callback function for multiple handlers listener.filter({ sheet: 'contacts' }, (contactsListener) => { contactsListener.on('commit:created', async (event) => { console.log('Contact data committed'); // Process contact data validation }); contactsListener.on('records:created', async (event) => { console.log('New contacts added'); // Handle contact creation workflow }); }); // Chaining `filter()` with `on()` listener .filter({ sheet: 'contacts' }) .on('records:updated', async (event) => { console.log('Contact updated'); // Handle contact updates }); ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; // With callback function for multiple handlers listener.filter({ sheet: 'contacts' }, (contactsListener: FlatfileListener) => { contactsListener.on('commit:created', async (event: FlatfileEvent) => { console.log('Contact data committed'); // Process contact data validation }); contactsListener.on('records:created', async (event: FlatfileEvent) => { console.log('New contacts added'); // Handle contact creation workflow }); }); // Chaining `filter()` with `on()` listener .filter({ sheet: 'contacts' }) .on('records:updated', async (event: FlatfileEvent) => { console.log('Contact updated'); // Handle contact updates }); ``` ### Wildcard Filtering Filters support wildcard patterns using `*` to match partial values. This may be useful when you want to filter by ID patterns or prefixes. This example filters for `commit:created` events that were initiated by [Jobs](/core-concepts/jobs) rather than users. 
When a Job causes changes to your data (commits), the `actorId` in the event context will be the job ID (starting with `"us_jb"`): ```javascript JavaScript // Use wildcard to filter for events initiated by jobs (actorId starts with "us_jb") listener .filter({actorId: "us_jb*"}) // note the * wildcard .on("commit:created", async (event) => { // Get the job details that caused this commit const { data: job } = await api.jobs.get(event.context.actorId); console.log(`Job ${job.operation} caused a commit in sheet ${event.context.sheetId}`); // ...React to job-initiated commits }); ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; // Use wildcard to filter for events initiated by jobs (actorId starts with "us_jb") listener .filter({actorId: "us_jb*"}) // note the * wildcard .on("commit:created", async (event: FlatfileEvent) => { // Get the job details that caused this commit const { data: job } = await api.jobs.get(event.context.actorId); console.log(`Job ${job.operation} caused a commit in sheet ${event.context.sheetId}`); // ...React to job-initiated commits }); ``` ### Chaining Filters and Namespaces You can also chain multiple filters along with namespaces to isolate highly specific events. 
This example shows how to combine a namespace with two filters to handle failed submit jobs for third-party integrations: ```javascript JavaScript // Progressive filtering for highly specific event targeting listener .namespace('space:third-party-integrations') .filter({job: `workbook:submit`}) .filter({"payload.status": "failed"}) .on('job:updated', async (event) => { const { data: job } = await api.jobs.get(event.context.jobId); handleFailedThirdPartySubmissions(job); }); ``` ```typescript TypeScript import { FlatfileListener, FlatfileEvent } from '@flatfile/listener'; // Progressive filtering for highly specific event targeting listener .namespace('space:third-party-integrations') .filter({job: `workbook:submit`}) .filter({"payload.status": "failed"}) .on('job:updated', async (event: FlatfileEvent) => { const { data: job } = await api.jobs.get(event.context.jobId); handleFailedThirdPartySubmissions(job); }); ``` ### Filter Properties The `listener.filter()` method accepts an object defining filter criteria based on Event properties. **Important**: This is not intended to be a comprehensive list of all possible filter properties, but a reference for commonly-used ones. Event properties vary significantly by event type, and using a filter property that doesn't exist for an event type will result in no matches. Refer to the [Event Reference](/reference/events) for the specific properties available for each event you want to filter on. You can always `console.log()` your events prior to filtering to see what properties are available. 
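To build intuition for how these filter values are compared, here is a small, self-contained sketch. It is only an illustration of the matching semantics described in this section (exact values, arrays of values, and trailing-`*` wildcards), not the SDK's actual implementation; the `matchesFilter` helper and the sample event are hypothetical:

```javascript
// Simplified illustration of filter matching semantics -- NOT the SDK's
// internal code. Supports exact values, arrays of allowed values, and a
// trailing-* wildcard, checked against a sample event shape.
function matchesFilter(filter, event) {
  return Object.entries(filter).every(([key, expected]) => {
    // Look the property up on the event context first, then the event itself
    const actual = event.context?.[key] ?? event[key];
    if (Array.isArray(expected)) return expected.includes(actual);
    if (typeof expected === "string" && expected.endsWith("*")) {
      return typeof actual === "string" && actual.startsWith(expected.slice(0, -1));
    }
    return actual === expected;
  });
}

// Hypothetical event resembling a job-initiated commit
const event = {
  topic: "commit:created",
  context: { sheetId: "us_sh_123", actorId: "us_jb_42" },
};

console.log(matchesFilter({ sheetId: "us_sh_123" }, event)); // exact match → true
console.log(matchesFilter({ actorId: "us_jb*" }, event)); // wildcard prefix → true
console.log(matchesFilter({ topic: ["records:created", "commit:created"] }, event)); // array of values → true
```

In real listeners you would rely on `listener.filter()` itself; this sketch is only meant to clarify how a criterion like `{ actorId: "us_jb*" }` ends up selecting job-initiated events.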
#### Universal Properties These properties are available for filtering across most or all event types: | Property | Description | Example | | --------------- | --------------------------------- | --------------------------------- | | `topic` | Event topic pattern | `{ topic: 'records:created' }` | | `domain` | Event domain (supports wildcards) | `{ domain: 'space' }` | | `environmentId` | Environment identifier | `{ environmentId: 'us_env_123' }` | #### Common Context Properties These properties are available in many events but not all: | Property | Description | Example | Available In | | ----------- | ------------------ | ----------------------------- | ------------------------------------ | | `spaceId` | Specific space | `{ spaceId: 'us_sp_789' }` | Most events (not environment events) | | `actorId` | Specific actor | `{ actorId: 'us_usr_123' }` | Most user-initiated events | | `accountId` | Account identifier | `{ accountId: 'us_acc_123' }` | Most events | #### Specific Context Properties These properties are only available for certain event types: | Property | Description | Example | Available In | | ------------ | ----------------- | ----------------------------- | ---------------------------------------- | | `workbookId` | Specific workbook | `{ workbookId: 'us_wb_456' }` | Workbook, sheet, record, some job events | | `sheetId` | Specific sheet ID | `{ sheetId: 'us_sh_123' }` | Sheet, record, program events | | `sheet` | Sheet slug | `{ sheet: 'contacts' }` | Sheet, record, program events | | `jobId` | Specific job ID | `{ jobId: 'us_jb_123' }` | Job events only | | `fileId` | File identifier | `{ fileId: 'us_fl_123' }` | File events only | #### Job Event Filters Special filter properties for job events: | Property | Description | Example | | ------------------- | ------------------- | -------------------------------------- | | `job` | Job type identifier | `{ job: 'workbook:submit' }` | | `payload.status` | Job status | `{ 'payload.status': 'failed' }` | | 
`payload.domain` | Event domain | `{ 'payload.domain': 'space' }` | | `payload.operation` | Operation name | `{ 'payload.operation': 'configure' }` | #### Record Event Filters Specific to record creation, update, and deletion events: | Property | Description | Example | | --------------------- | ---------------------------- | -------------------------------------- | | `payload.recordIds` | Array of affected record IDs | `{ 'payload.recordIds': ['rec_123'] }` | | `payload.recordCount` | Number of records affected | `{ 'payload.recordCount': 5 }` | | `payload.sheetId` | Sheet containing the records | `{ 'payload.sheetId': 'us_sh_123' }` | #### File Event Filters Specific to file upload and processing events: | Property | Description | Example | | -------------------- | ---------------------- | --------------------------------------- | | `payload.status` | File processing status | `{ 'payload.status': 'completed' }` | | `payload.workbookId` | Associated workbook | `{ 'payload.workbookId': 'us_wb_123' }` | You can also use various pattern matching approaches including exact matches, arrays of values, and wildcard patterns with a `*`. # Guides Overview Source: https://flatfile.com/docs/guides/overview Comprehensive guides to help you master Flatfile's features and capabilities Welcome to Flatfile's comprehensive guide collection. These guides provide step-by-step instructions and best practices for implementing advanced features and customizations in your Flatfile integration. ## Core Functionality Essential guides for working with Flatfile's core features and data processing capabilities. Complete guide to authenticating with Flatfile using API keys, Personal Access Tokens, and managing roles and permissions for your team and customers. Learn how to trigger operations based on user input. Actions allow you to create custom workflows that execute when users interact with buttons in your Flatfile interface. Master data processing with Data Hooks. 
This guide covers how to validate, transform, and process records as they're imported into Flatfile. Enable dynamic field creation during data import. Learn how to allow users to add new fields on the fly when their data doesn't match your predefined Blueprint. ## Data Processing & Filtering Advanced techniques for filtering, querying, and transforming your data. Create sophisticated filter combinations with complex conditions. Learn how to use logical operators, multiple criteria, and various data types to find exactly the records you need. Build custom file parsers to handle proprietary or uncommon file formats. Learn how to create extractors that can parse any file type and integrate seamlessly with Flatfile's processing pipeline. Master FFQL (Flatfile Query Language) to filter data in sheets with powerful query syntax. Build complex queries to search and manipulate your data programmatically. ## Configuration & Customization Guides for customizing the look, feel, and behavior of your Flatfile implementation. Customize the look and feel of Flatfile to match your brand. This comprehensive guide covers all theming options including colors, fonts, and UI elements. Implement internationalization with text overrides and translations. Make your Flatfile interface accessible to users in different languages and regions. Control what guests see in their sidebar interface. Learn how to limit and customize the sidebar experience for different user types. ## Data Management Guides for managing data lifecycle, retention, and export processes. Get data out of Flatfile to external destinations. Master various patterns for exporting and syncing data to your systems and third-party services. Store and manage descriptive information that provides additional context to your data. Learn how to effectively use metadata throughout your Flatfile implementation. Split complex operations into manageable parts for better performance and user experience. 
Learn how to break down large jobs into smaller, trackable components. ## Advanced Features Specialized guides for advanced Flatfile features and enterprise capabilities. Organize and isolate resources using namespaces. Learn how to narrow down the scope of Spaces, Workbooks, and Sheets for better organization and security. Securely manage and use credentials in your listeners. Learn best practices for handling sensitive information like API keys and authentication tokens. Standardized authentication setup patterns for consistent integration across all Flatfile projects. Standardized webhook patterns for reliable data submission to external systems. ## Getting Help Need additional assistance? Here are some resources: * **[Community](https://flatfile.com/join-slack/)** - Join our Slack community for questions and discussions * **[Support](https://flatfile.com/join-slack/)** - Get help from our support team * **[API Reference](https://reference.flatfile.com)** - Detailed API documentation * **[Platform](https://platform.flatfile.com)** - Access your Flatfile dashboard Each guide includes practical examples, code snippets, and best practices to help you implement these features effectively in your own applications. # Secrets Source: https://flatfile.com/docs/guides/share-secrets Securely use credentials in listeners With Secrets you can securely share credentials with listener implementations without developers explicitly knowing the secret values upfront. Secret values are set in the user interface, but retrieved via the SDK or API. ## Overview ### Creating Secrets Secrets in Flatfile, defined as Name/Value pairs, are securely stored and associated with an Environment or a Space. Spaces will inherit Secrets from their respective Environment but you may choose to override any Environment Secret for a given Space. To define Secrets shared with every Space in an Environment, navigate to the "Developer Settings" screen for that environment. 
To override an Environment value, navigate to the specific Space and select "Secrets" in the left navigation. While Flatfile encrypts all data, both during transit and at rest in our datastore, Secrets have an additional layer of protection. Secrets are encrypted/decrypted on demand using a unique set of keys. As such, a potential intruder would need not only access to the plaintext datastore, but also these extra keys to decrypt and compromise these sensitive values. ### Consuming Secrets While Secrets are defined in administrative interfaces for Environments and Spaces, respectively, they are designed to be consumed by Listeners. While it might be trivial to pass in secret values through environment variables in a self-hosted Listener, with a Flatfile hosted Agent based Listener one must use the Secrets features. See Usage below for some example consumer patterns. ## Usage Examples ### Sensitive Credentials The principal utility of Secrets lies in securely storing sensitive credentials/tokens within an Environment/Space for connecting Listeners to third-party APIs. For instance, you might store a secret named `SLACK_TOKEN` with a value, allowing you to communicate with a Slack bot each time a custom action is triggered. For complete examples of secure credential management, see the [Authentication Examples](/guides/deeper/auth-examples#secure-credential-management) guide. #### Example Listener In this example, we use an `event.secrets` call to pull a sensitive Slack token for use within a listener context. We then can use the credential to post a message to Slack. 
```javascript export default function flatfileEventListener(listener) { //note: listening to all events with a wildcard can be used while testing but is not //recommended for production, as it will capture all events and may cause performance issues listener.on("**", async (event) => { const tok = await event.secrets("SLACK_TOKEN"); console.log(tok); /* pseudo code for an example slack = new Slack(tok); slack.api( "chat.postMessage", { text: "Flatfile event received!", channel: "#integration-flatfile", }, function (err, response) { console.log(response || err); } ); */ }); } ``` #### Example Listener using optional props The `options` parameter for the secrets fetch function allows optionally choosing a different Environment or Space than the event occurred within. ```javascript export default function flatfileEventListener(listener) { //note: listening to all events with a wildcard can be used while testing but is not //recommended for production, as it will capture all events and may cause performance issues listener.on("**", async (event) => { // Hardcode specific environment and space for this listener's case const credential = await event.secrets("MY_CREDENTIAL", { environmentId: "us_env_123", spaceId: "us_spa_123", }); console.log(credential); }); } ``` ## Metadata While it might seem creative to use the Secrets feature to hold non-sensitive metadata, we encourage you to learn more about utilizing metadata within your Spaces, Records, or Fields. # Theming Source: https://flatfile.com/docs/guides/theme-your-space Learn how to customize the look and feel of Flatfile to match your brand Flatfile supports modifying most UI elements including colors, fonts, borders, padding, and more via the [Space](/core-concepts/spaces) endpoint. 1. Start by simply updating `theme.root.primaryColor` and `theme.sidebar.logo` when calling `spaces.update()`. 2. If needed, you can customize the theme further with additional css variables. 
## Building a theme Learn how to create a Space with a theme, and update a theme from an Event listener. ```javascript import api from "@flatfile/api"; export default function flatfileEventListener(listener) { //listen for space:configure job and build out space listener.filter({ job: "space:configure" }, (configure) => {}); //theme during creation or update your space after it is created listener.on("space:created", async ({ context: { spaceId } }) => { const updateSpace = await api.spaces.update(spaceId, { metadata: { theme: { root: { primaryColor: "red", }, sidebar: { logo: "https://image.png", }, // See reference for all possible variables }, }, }); }); } ``` See full code example in our [flatfile-docs-kitchen-sink Github repo](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main) ## Theme reference ### theme.root ![Theme root configuration example](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-root.png) The sidebar, table and document will automatically inherit theming from your root variables! #### Font **fontFamily** `string`\ The font family used throughout the application. Only system fonts are supported at the moment #### Colors **primaryColor** `string`\ The primary color used throughout the application. **dangerColor** `string`\ The color used for error messages. **warningColor** `string`\ The color used for warning messages. **borderColor** `string`\ The color used for borders throughout the application. ### theme.sidebar ![Theme sidebar configuration example](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-sidebar.png) You can override the default colors of the sidebar below. If these are not set they will inherit from your root colors. **logo** `string`\ The img path for the logo displayed in the sidebar. **textColor** `string`\ The color of the text in the sidebar. **titleColor** `string`\ The color of the title in the sidebar. 
**focusBgColor** `string`\ The background color of the active navigation link. The hover state will use the same color with 30% opacity applied. **focusTextColor** `string`\ The text color of the active/focused navigation link. **backgroundColor** `string`\ The background color of the sidebar. ### theme.table ![Theme table configuration example](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-table.png) You can override the default colors and values for different elements in the table below. If these are not set they will inherit from your root values. #### Font **fontFamily** `string`\ The font family used throughout the table. Only system fonts are supported at the moment #### Active cell **cell.active.borderWidth** `string`\ The width of the border around the active cell. **cell.active.borderShadow** `string`\ The box shadow around the active cell. **cell.number.fontFamily** `string`\ The font family for number cells. #### Column header ![Theme table column configuration](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-table-column.png) **column.header.backgroundColor** `string`\ The background color of the column headers in the table. **column.header.color** `string`\ The text color of the column headers in the table. #### Index column ![Theme table index column configuration](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-table-index-column.png) **indexColumn.backgroundColor** `string`\ The background color of the first column in the table. **indexColumn.color** `string`\ The text color of the first column in the table. **indexColumn.selected.backgroundColor** `string`\ The background color of the first column in the table when selected. #### Checkboxes ![Theme table inputs configuration](https://mintlify.s3.us-west-1.amazonaws.com/flatfileinc/guides/assets/theme-table-inputs.png) **inputs.checkbox.color** `string`\ The color of the checkboxes in the table. 
**inputs.checkbox.borderColor** `string`\ The color of the border for the checkboxes in the table. #### Filters **filters.outerBorderRadius** `string`\ The border radius of the table filters **filters.innerBorderRadius** `string`\ The border radius for the individual filters **filters.outerBorder** `string`\ The border of the table filters. By default there is no border. > When nested objects share the same border radius, small gaps may appear. To resolve this, adjust the inner and outer border radius on the filters to seamlessly fill any gaps. #### Buttons **buttons.iconColor** `string`\ The color of the icon buttons in the toolbar and table **buttons.pill.color** `string`\ The color of the pill buttons in the toolbar ### theme.email You can theme the guest invite emails as well as guest email verification emails via the properties below. These are sent to a guest when they are invited to a Space via the guest management page in your Space. > Email theming is only available on the pro and enterprise plans. **logo** `string`\ The URL of the image displayed at the top of the email **textColor** `string`\ The color of the text in the email **titleColor** `string`\ The color of the title in the email **buttonBgColor** `string`\ The background color of the button **buttonTextColor** `string`\ The text color of the button **footerTextColor** `string`\ The text color of the footer text **backgroundColor** `string`\ The background color of the email #### theme.email.dark If your default email theme (as set above) is light, we suggest adding a dark mode email theme. Different email providers handle dark and light mode differently. While Apple Mail and some other clients will respect dark mode variables, some email providers do not support dark mode and will display your default theme. We suggest you test your themes across various email clients before going to production to ensure consistent appearance. 
**logo** `string`\ The URL of the image displayed at the top of the email in dark mode **textColor** `string`\ The color of the text in the email in dark mode **titleColor** `string`\ The color of the title in the email in dark mode **buttonBgColor** `string`\ The background color of the button in dark mode **buttonTextColor** `string`\ The text color of the button in dark mode **footerTextColor** `string`\ The text color of the footer text in dark mode **backgroundColor** `string`\ The background color of the email in dark mode ## Deprecation Notice Flatfile's new design system released in Q1 2025 supports a more streamlined experience when configuring theme. The new system accepts a more limited set of properties, but ensures those supplied properties are cohesively applied across the application. As a result, many of the original `theme.root` properties which applied to specific UI elements have been deprecated. We have taken steps to ensure some level of backwards compatibility for these properties in the new system - however we recommend any usage of these properties be updated to the new system as soon as possible. ## Example Project Find the theming example in the Flatfile GitHub repository. * [Clone the theming example in Typescript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/typescript/theming/index.ts) * [Clone the theming example in Javascript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/javascript/theming/index.js) # Internationalization Source: https://flatfile.com/docs/guides/translating-your-space Translate and customize your Space with text overrides Tailor the language and text in your Flatfile Space with ease! This guide will lead you step-by-step to allow you to both translate and personalize the text in your Space in just a couple of simple steps. 
## Available Languages We support translations into these languages: | Language | Code | | ----------------------- | ---------- | | German | de | | English | en | | English - Great Britain | en-GB | | English - Canadian | en-CA | | English - South African | en-ZA | | Spanish | es | | Spanish - Latin America | es-419 | | French | fr | | French - Canadian | fr-CA | | French - France | fr-FR | | Indonesian | id | | Italian | it | | Japanese | jp | | Korean | kr | | Malay | ms | | Dutch | nl | | Polish | pl | | Portuguese | pt | | Portuguese - Brazilian | pt-BR | | Swedish | sv | | Thai | th | | Turkish | tr | | Vietnamese | vi | | Chinese - Simplified | zh | | Chinese - Traditional | zh-Hant | | Chinese - Hong Kong | zh-Hant-HK | Need more languages? Reach out to us at [support@flatfile.io](mailto:support@flatfile.io). ## Setting the language on a Space Flatfile will auto-detect the user's language based on their browser settings. If the language is not detectable or is currently unsupported, the system will automatically be set to English. ### Overriding the language You can manually set the language on a Space using the `languageOverride` property. This property can be set either directly on the Space or at the environment level if the Space has not yet been created. All users who use the Space will only see the language set in the language override property. Browser settings will no longer be respected. To specify the language, simply assign the appropriate language code (e.g., 'en' for English, 'fr' for French, etc.) to the languageOverride property. 
```typescript listener.ts listener.on("space:created", async ({ context: { spaceId } }) => { // This space will be in English regardless of the user's browser settings const updateSpace = await api.spaces.update(spaceId, { languageOverride: "en", }); }); ``` ## Adding translations ### Creating your translation files All our translation files can be found in our [Platform-Translations repository](https://github.com/FlatFilers/Platform-Translations). You can fork the repo or simply copy the files in the repository to create your own translation files. Simply update the values in each file with the text you would like to see. Once you have created your own translation files, you need to make them available to Flatfile. This is done via the `translationsPath` property. While creating your JSON translation files, you can opt for partial or full JSON objects: **Partial JSON Objects:** Just include those values you want to change. Be sure to maintain the same JSON structure to avoid errors. **Full JSON Objects:** Copy the translation file from our public repository and modify the values as required. ### Setting your Translations Path To provide translations and text overrides you can use the `translationsPath` parameter. You have the flexibility to set up your `translationsPath` at two levels: * **Space:** Overrides the environment-level setting. * **Environment:** Used when no Space-level path is defined. Set your `translationsPath` to the URL of your **English JSON file**. This should be the URL ending in `/locales/en/translation.json`. Don't have an English file? You can use any other language instead. Your `translationsPath` must meet a few requirements to work as expected: 1. It must be a string. 2. The string should be a publicly available URL that returns your JSON translation file. 3. It must follow this folder structure: ``` yourCustomTranslationsDirectory/ └── locales/ ├── en/ │ └── translation.json ├── es/ │ └── translation.json ...
``` where `yourCustomTranslationsDirectory` can be swapped out for any path desired. For our translations repository we would set our `translationsPath` to: [https://raw.githubusercontent.com/FlatFilers/Platform-Translations/main/locales/en/translation.json](https://raw.githubusercontent.com/FlatFilers/Platform-Translations/main/locales/en/translation.json) ### Implementing Translations Path Set up your translations path using our API or the listener. This can be done at both the Space and environment levels. ```typescript listener.ts listener.on("space:created", async ({ context: { spaceId } }) => { // set the translation path for the Space const updateSpace = await api.spaces.update(spaceId, { translationsPath: "https://raw.githubusercontent.com/FlatFilers/Platform-Translations/kitchen-sink/locales/en/translation.json", }); }); ``` Currently we only offer translations and text overrides for the guest experience within your Spaces. This means there will be admin-only areas that are not currently customizable. ## Translating Your Custom Actions and Jobs You can translate your own custom text used in your Actions. Simply add the new keys to your `translation.json` file: ```json .../locales/en/translation.json { "mySubmitAction": { "label": "Submit", "description": "Done editing the data? Hit submit to send your data", "tooltip": "Submit your data", "outcome": { "heading": "Successful import!", "message": "All data has been submitted! You can now click the link below.", "urlLabel": "Return to the homepage" } } } ``` Then instead of hardcoding strings directly into your action scripts, use JSON translation keys.
Refer to these keys in your script as shown below: ```javascript sheet.js sheets : [ { name: "Sheet Name", actions: [ { operation: 'Submit', mode: 'foreground', label: 'mySubmitAction.label', type: 'string', description: 'mySubmitAction.description', primary: true, tooltip: 'mySubmitAction.tooltip' }, {...} ] } ] ``` If a translation key is missing from your `translation.json` file, the system will display the key itself as a fallback. Therefore, it's encouraged to always have a default language translation available to avoid showing raw keys to the users. ## Translating Your Documents The same logic that applies to Actions also applies to your Documents. Just pass in the custom translation keys that you added to your translation files! ```json .../locales/en/translation.json { "myDocument": { "title": "Your first Space!", "body": "# Welcome\n\n### Say hello to your first Space in the new Flatfile!" } } ``` ```javascript listener.js // Reference the translation keys in your document configuration ``` ## Guidelines for Creating JSON Keys When creating JSON keys, adhere to the following guidelines: * **Alphanumeric Keys**: Ensure keys consist of alphanumeric characters to prevent issues with parsing. * **Nested Structure**: Utilize at least one level of nesting (e.g., `key1.key2`) to organize translations effectively and maintain readability. * **Avoid Whitespace and Control Characters**: Refrain from using spaces, newlines, or tabs in your JSON keys to avoid potential issues. * **Don't End Keys with File Extensions**: Avoid using file extensions (e.g., .csv, .txt) as the terminal segment in a key. For instance, myFileAction.csv is discouraged, but myFileAction.csv.uploaded is acceptable.
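If you want to catch guideline violations before shipping a translation file, the checks above are easy to automate. The sketch below is a hypothetical helper (not part of any Flatfile package) that flattens a parsed `translation.json` into dotted keys and validates each one against the guidelines; the file-extension list is an assumed example set:

```javascript
// Hypothetical validator for translation keys -- not part of the Flatfile SDK.
// Flattens a parsed translation.json into dotted keys, then checks each key
// against the guidelines above.
const FILE_EXTENSIONS = ["csv", "txt", "json", "xlsx"]; // assumed example set

function flattenKeys(obj, prefix = "") {
  return Object.entries(obj).flatMap(([key, value]) => {
    const path = prefix ? `${prefix}.${key}` : key;
    return value && typeof value === "object" ? flattenKeys(value, path) : [path];
  });
}

function isValidTranslationKey(key) {
  const segments = key.split(".");
  if (segments.length < 2) return false; // require at least one level of nesting
  if (!segments.every((s) => /^[A-Za-z0-9]+$/.test(s))) return false; // alphanumeric, no whitespace
  if (FILE_EXTENSIONS.includes(segments[segments.length - 1])) return false; // no terminal extension
  return true;
}

// Example translation object, mirroring the mySubmitAction keys above
const translations = {
  mySubmitAction: { label: "Submit", tooltip: "Submit your data" },
};

const invalid = flattenKeys(translations).filter((k) => !isValidTranslationKey(k));
console.log(invalid); // → [] (all keys pass)
console.log(isValidTranslationKey("myFileAction.csv")); // false: ends in a file extension
console.log(isValidTranslationKey("myFileAction.csv.uploaded")); // true
```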
Always validate your `translation.json` file to ensure it follows the correct JSON format to prevent errors during runtime. ## Example Project Find the Localization example in the Flatfile GitHub repository. * [Clone the Localization example in Typescript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/typescript/localization/index.ts) * [Clone the Localization example in Javascript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/javascript/localization/index.js) # Actions Source: https://flatfile.com/docs/guides/using-actions Trigger custom operations based on user input An Action is a code-based operation that runs where that Action is mounted. Actions run when a user clicks the corresponding user prompt in Flatfile. ## Overview When an Action is triggered, it creates a [Job](/core-concepts/jobs) that your application can listen for and respond to. Given that Actions are powered by Jobs, the [Jobs Lifecycle](/core-concepts/jobs#jobs-lifecycle) pertains to Actions as well. This means that you can [update progress values/messages](/core-concepts/jobs#updating-job-progress) while an Action is processing, and when it's done you can provide an [Outcome](/core-concepts/jobs#job-outcomes), which allows you to show a success message, automatically [download a generated file](/core-concepts/jobs#file-downloads), or [forward the user](/core-concepts/jobs#internal-navigation) to a generated Document. [Actions](/core-concepts/actions) are mounted on resources like **Workbooks**, **Sheets**, **Fields**, **Documents**, and **Files**. Generally, [Workbook](#workbook-actions), [Sheet](#sheet-actions), [Field](#field-actions), and [Document](#document-actions) Actions are configured within a Blueprint object, while [File](#file-actions) Actions are appended to the file during the upload process. Alternatively, Actions can be mounted to any of these resources via API in a [Listener](/core-concepts/listeners). 
[Sheet](#sheet-actions) Actions can be executed on the entire Sheet, for a filtered view of the Sheet, or selectively for the chosen records. See [Sheet Action Execution Modes](#sheet-action-execution-modes) for details on how actions handle different data selections.

In these examples, we'll show the full Job Listener lifecycle implementation, complete with `ack` to acknowledge the job, `update` to update the job's progress, and `complete` or `fail` to complete or fail the job. To make this simpler in practice, we provide a plugin called [Job Handler](/plugins/job-handler) that handles the job lifecycle for you. This plugin works by listening to the `job:ready` event and executing the handler callback, and it even catches errors to fail the job. There is also an optional `tick` function which allows you to update the Job's progress.

For example, with the [Job Handler](/plugins/job-handler) plugin, the 35-line [File Action Listener](#listener-implementation-4) defined below would be implemented simply as:

```typescript
listener.use(
  jobHandler("file:logFileContents", async ({ context: { fileId, jobId } }, tick) => {
    const file = await api.files.get(fileId);
    tick(10, "Getting started.");
    console.log({ file });
    tick(90, "Logged.");
    return {
      outcome: {
        message: "Logging file contents is complete.",
      },
    };
  })
);
```

## Workbook Actions

Once a user has extracted and mapped data into a Workbook, it may be more efficient to run an operation on the entire dataset rather than making atomic transformations at the record- or field-level. For example:

* Sending a webhook that notifies your API of the data's readiness
* Populating a Sheet with data from another source
* Adding two different fields together after a user reviews initial validation checks
* Moving valid data from an editable Sheet to a read-only Sheet

Workbook-mounted actions are represented as buttons in the top right of the Workbook.
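As a quick illustration of the last use case above (moving valid data to a read-only Sheet), the first step is partitioning records by validity. That step can be sketched as a plain helper, independent of the Flatfile API; the record shape here is simplified and hypothetical, since a real Flatfile record carries per-field messages rather than a single flag:

```typescript
// Simplified, hypothetical record shape — illustrative only.
interface SimpleRecord {
  id: string;
  valid: boolean;
  values: Record<string, unknown>;
}

// Split records into the set to move and the set to leave behind.
function partitionByValidity(records: SimpleRecord[]): {
  valid: SimpleRecord[];
  invalid: SimpleRecord[];
} {
  const valid: SimpleRecord[] = [];
  const invalid: SimpleRecord[] = [];
  for (const record of records) {
    (record.valid ? valid : invalid).push(record);
  }
  return { valid, invalid };
}
```

In a real Workbook Action you would fetch the records with `api.records.get`, partition them, insert the valid set into the read-only Sheet, and delete those records from the editable Sheet.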
### Usage If you configure `primary: true` on an Action, its button will be highlighted in the Workbook. If you configure `trackChanges: true`, it will disable your actions until all commits are complete (usually data hooks). #### Blueprint Configuration First, configure your action in your Blueprint: ```javascript // Add this to your workbook configuration actions: [ { operation: "downloadXML", mode: "background", label: "Download XML", type: "string", description: "Generates and downloads an XML file with all the users in the workbook", primary: false, }, { operation: "submitToAPI", mode: "background", label: "Submit to API", type: "string", description: "Submits the users to our API", primary: true, }, ], settings: { trackChanges: true, } ``` #### Listener Implementation Next, create a listener to handle the `job:ready` event for your action. ```typescript listener.on( "job:ready", { job: "workbook:submitToAPI" }, async (event) => { const { jobId, workbookId } = event.context; const { data: workbook } = await api.workbooks.get(workbookId); const { data: workbookSheets } = await api.sheets.list({ workbookId }); // Collect all sheet data const sheets = []; for (const [_, element] of workbookSheets.entries()) { const { data: records } = await api.records.get(element.id); sheets.push({ ...element, ...records, }); } try { // Acknowledge the job start await api.jobs.ack(jobId, { info: "Starting job to submit action to webhook", progress: 10, }); // Replace with your actual webhook URL const webhookReceiver = "https://your-app.com/webhook/flatfile"; // Submit data to external system const response = await fetch(webhookReceiver, { method: "POST", headers: { "Content-Type": "application/json", }, body: JSON.stringify({ workbook: { ...workbook, sheets, }, }), }); if (response.status === 200) { await api.jobs.complete(jobId, { outcome: { message: `Data was successfully submitted to ${webhookReceiver}.`, }, }); } else { throw new Error("Failed to submit data to webhook"); } 
} catch (error) { console.error(error); await api.jobs.fail(jobId, { outcome: { message: "This job failed. Check your webhook URL and try again.", }, }); } } ); ``` ## Sheet Actions Each Sheet has built-in Actions like download. Sheet-mounted actions are represented as a dropdown in the toolbar of the Sheet. ### Usage If you configure `primary: true` on an Action, it creates a top-level button as well as placing it in the dropdown menu. #### Blueprint Configuration First, configure your action on your Blueprint. Add the action configuration to your sheet definition: ```javascript sheets : [ { name: "Users", actions: [ { operation: "populateHealthScores", mode: "background", label: "Populate Health Scores", type: "string", description: "Populate health scores for all users", primary: true, }, { operation: "validateHealthScores", mode: "background", label: "Validate Health Scores", type: "string", description: "Validate health scores for all users", primary: false, }, ] } ] ``` #### Listener Implementation Next, listen for a `job:ready` and filter on the `domain` (sheet) and the `operation` of where the action was placed. Be sure to complete the job when it's done. ```typescript listener.on( "job:ready", { job: "sheet:populateHealthScores" }, async ({ context: { jobId } }) => { try { await api.jobs.ack(jobId, { info: "Populating health scores...", progress: 10, estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"), }); // Populate health scores for all users // Implementation logic here await api.jobs.complete(jobId, { info: "Health scores populated successfully.", }); } catch (error) { console.error("Error:", error.stack); await api.jobs.fail(jobId, { info: "Failed to populate health scores.", }); } } ); ``` ### Sheet Action Execution Modes Sheet Actions are powerful because they can operate on different subsets of data within a sheet. 
When a Sheet Action is triggered, it automatically receives context about what data should be processed based on the user's current view and selections. #### How Sheet Actions Handle Data Context When a Sheet Action job is created, it receives a `query` object in the job's subject that specifies which records to process: ```javascript // Job subject structure for Sheet Actions { type: "collection", resource: string, // Usually sheetId or "records" query: { filter: "valid" | "error" | "all" | "none", // Filter type filterField: string, // Field to filter on (optional) searchField: string, // Field to search in (optional) searchValue: string, // Search term (optional) q: string, // FFQL (Flatfile Query Language) query (optional) ids: string[] // Record IDs - INCLUDE when no filter, EXCLUDE when filter applied }, params: { sheetId: string, // Always present for sheet actions // Additional params depending on action type } } ``` #### Three Execution Contexts **1. Entire Sheet (No Filter, No Selection)** * Processes all records in the sheet * `query.filter` is `"all"` or undefined * No `ids` or `exceptions` arrays **2. Filtered View** * Processes only records matching the current filter * `query.filter` indicates the filter type: * `"valid"` - Only valid records (no errors) * `"error"` - Only records with validation errors * Custom filters if applied * May include search parameters if user has searched * May include [FFQL queries](/reference/ffql) via the `q` parameter for advanced filtering **3. 
Selected Records**

* Processes only user-selected records
* **When no other filters are applied**: `query.ids` contains specific record IDs to INCLUDE
* **When filters are applied**: `query.ids` contains record IDs to EXCLUDE from the filtered results

#### Implementation Example

```typescript
listener.on(
  "job:ready",
  { job: "sheet:processData" },
  async (event) => {
    const { jobId, sheetId } = event.context;

    // Get the job to access the query context
    const { data: job } = await api.jobs.get(jobId);
    const query = job.subject?.query || {};

    // Pass the query directly to the records API.
    // The API handles the inclusion/exclusion logic for ids based on filters.
    const { data: { records } } = await api.records.get(sheetId, {
      ...query, // Includes filter, ids, searchField, searchValue, etc.
    });

    console.log(`Processing ${records.length} records`);
    console.log("Query context:", query);

    // Process the records
    for (const record of records) {
      // Your processing logic here...
    }

    await api.jobs.complete(jobId, {
      outcome: {
        message: `Successfully processed ${records.length} records`,
      },
    });
  }
);
```

#### Built-in Action Examples

Built-in actions like Delete and Download demonstrate these execution modes:

* **Delete All** - Removes all records when no selection/filter
* **Delete Selected** - Removes only selected records
* **Delete Valid/Invalid** - Removes records matching the current filter
* **Download Filtered** - Exports only records matching the current view

Your custom Sheet Actions automatically inherit this same context-aware behavior.

## Field Actions

Field-mounted actions are operations that can be performed on individual columns/fields within a Sheet. They appear as options in the dropdown menu when you click on a column header.
Field actions are particularly useful for: * Column-specific data transformations (e.g., capitalizing all values in a name field) * Field-level validation operations * Data formatting specific to a column type * Column-specific cleanup operations ### Usage The `primary` property does not affect the UI for Field actions. Note: Field Actions are essentially an extension of Sheet actions - therefore their operation names are prefixed with `sheet:`, and the column key is available in the job data. #### Blueprint Configuration First, configure your field action in your Blueprint by adding it to the field definition: ```javascript // Add this to your field configuration within a sheet { key: "firstName", type: "string", label: "First Name", description: "Customer's first name", actions: [ { operation: "capitalize", mode: "foreground", label: "Capitalize All Values", description: "Capitalizes the values of the selected column", } ] } ``` #### Listener Implementation Next, create a listener to handle the `job:ready` event for your field action. Field actions use the `sheet:operationName` job pattern and provide field context through the job's subject parameters. 
```typescript listener.on( "job:ready", { job: "sheet:capitalize" }, async (event) => { const { jobId, sheetId } = event.context; try { // Acknowledge the job start await api.jobs.ack(jobId, { info: "Starting job to capitalize all values", progress: 10, }); // Get the field key from the job subject const { data: { subject } } = await api.jobs.get(jobId); const fieldKey = subject.params.columnKey; // Get all records from the sheet const { data: { records } } = await api.records.get(sheetId); await api.jobs.update(jobId, { info: "Capitalizing all values", progress: 20, }); // Process records for the specific field const updatedRecords = []; records.forEach(record => { const value = record.values[fieldKey].value as string; if (value && value.length > 0) { const capitalizedValue = value.toUpperCase(); if (value !== capitalizedValue) { record.values[fieldKey].value = capitalizedValue; updatedRecords.push(record); } } }); await api.jobs.update(jobId, { info: "Updating records", progress: 60, }); // Update only the modified records if (updatedRecords.length > 0) { await api.records.update(sheetId, updatedRecords); } await api.jobs.complete(jobId, { outcome: { message: `Successfully capitalized ${updatedRecords.length} values in the ${fieldKey} column.`, }, }); } catch (error) { console.error(error); await api.jobs.fail(jobId, { outcome: { message: "Failed to capitalize field values: " + error.message, }, }); } } ); ``` ### Key Differences from Other Action Types Field actions have several unique characteristics: * **Job Pattern**: Use `sheet:operationName` * **Context Access**: Field key is available via `job.subject.params.columnKey` * **UI Location**: Appear in column header dropdown menus * **Configuration**: Defined within individual field objects in the Blueprint ## Document Actions Document-mounted actions are similar to Workbook-mounted actions. They appear in the top right corner of your Document. 
### Usage

If you configure `primary: true` on an Action, it will be highlighted in the Document.

#### Document Configuration

Define Document-mounted Actions using the `actions` parameter when you create a Document.

```typescript
import api from "@flatfile/api";

export default function flatfileEventListener(listener) {
  listener.on("file:created", async ({ context: { spaceId, fileId } }) => {
    const fileName = (await api.files.get(fileId)).data.name;
    const bodyText =
      "# Welcome\n" +
      "### Say hello to your first customer Space in the new Flatfile!\n" +
      "Let's begin by first getting acquainted with what you're seeing in your Space initially.\n" +
      "---\n" +
      `Your uploaded file, ${fileName}, is located in the Files area.`;

    const doc = await api.documents.create(spaceId, {
      title: "Getting Started",
      body: bodyText,
      actions: [
        {
          label: "Submit",
          operation: "contacts:submit",
          description: "Would you like to submit the contact data?",
          tooltip: "Submit the contact data",
          mode: "foreground",
          primary: true,
          confirm: true,
        },
      ],
    });
  });
}
```

#### Listener Implementation

In your listener, listen for the job's event and perform your desired operations.

```typescript
import api from "@flatfile/api";

export default function flatfileEventListener(listener) {
  listener.on(
    "job:ready",
    { job: "document:contacts:submit" },
    async (event) => {
      const { context, payload } = event;
      const { jobId, workbookId } = context;
      try {
        await api.jobs.ack(jobId, {
          info: "Starting submit job...",
          progress: 10,
          estimatedCompletionAt: new Date("Tue Aug 23 2023 16:19:42 GMT-0700"),
        });

        // Do your work here

        await api.jobs.complete(jobId, {
          outcome: {
            message: `Submit job was completed successfully.`,
          },
        });
      } catch (error) {
        console.log(`There was an error: ${JSON.stringify(error)}`);
        await api.jobs.fail(jobId, {
          outcome: {
            message: `This job failed.`,
          },
        });
      }
    }
  );
}
```

## File Actions

Each file has built-in actions like Delete and Download. File-mounted actions are represented as a dropdown menu for each file in the Files list.
You can attach additional actions to a File by listening for file events and updating the file's `actions` array.

### Usage

#### File Configuration

First, listen for a `file:created` event and add one or more actions to the file.

```typescript
listener.on("file:created", async ({ context: { fileId } }) => {
  const file = await api.files.get(fileId);
  const actions = file.data?.actions || [];
  const newActions = [
    ...actions,
    {
      operation: "logFileContents",
      label: "Log File Metadata",
      description: "This will log the file metadata.",
      primary: true,
    },
  ];
  await api.files.update(fileId, {
    actions: newActions,
  });
});
```

#### Listener Implementation

Next, listen for `job:ready` and filter on the `domain` (file) and the `operation` of where the Action was placed. Be sure to complete the job when it's done.

```typescript
listener.on(
  "job:ready",
  { job: "file:logFileContents" },
  async ({ context: { fileId, jobId } }) => {
    try {
      await api.jobs.ack(jobId, {
        info: "Getting started.",
        progress: 10,
        estimatedCompletionAt: new Date(Date.now() + 10000),
      });

      const file = await api.files.get(fileId);
      console.log({ file });

      await api.jobs.update(jobId, {
        info: "Logged.",
        progress: 90,
        estimatedCompletionAt: new Date(Date.now() + 10000),
      });

      await api.jobs.complete(jobId, {
        outcome: {
          message: "Logging file contents is complete.",
        },
      });
    } catch (error) {
      await api.jobs.fail(jobId, {
        outcome: {
          message: "Logging file contents failed: " + error.message,
        },
      });
    }
  }
);
```

## Action Parameters

### Required Parameters

| Parameter   | Type     | Description                                                                                                                |
| ----------- | -------- | -------------------------------------------------------------------------------------------------------------------------- |
| `operation` | `string` | A unique identifier for the Action that is used by the listener to determine what work to do as part of the resulting Job. |
| `label`     | `string` | The text that will be displayed to the user in the UI as a button or menu item.                                            |

### Optional Parameters

| Parameter     | Type        | Default        | Description                                                                                                            |
| ------------- | ----------- | -------------- | ---------------------------------------------------------------------------------------------------------------------- |
| `primary`     | `boolean`   | `false`        | Whether the Action is a primary Action for the resource. Primary Actions are displayed more prominently in the UI.     |
| `confirm`     | `boolean`   | `true`         | When set to true, a modal is shown to confirm the Action.                                                              |
| `description` | `string`    | -              | The text displayed to the user when a confirmation modal is used. Phrase this as a question.                           |
| `icon`        | `string`    | lightning bolt | The icon to be displayed. Use `'none'` to omit the icon. See [Flatfile icons](/reference/icons) for available options. |
| `tooltip`     | `string`    | -              | Text displayed as a tooltip when hovering over the action button or menu item.                                         |
| `messages`    | `array[{}]` | -              | Custom messages displayed as tooltips based on action state. Supports `[{ type: 'error' }]` and `[{ type: 'info' }]`.  |
| `constraints` | `array[{}]` | -              | Conditions that disable the action. Options: `hasAllValid`, `hasSelection`, `hasData`.                                 |
| `mode`        | `string`    | `background`   | Execution mode: `foreground`, `background`, or `toolbarBlocking`.                                                      |

#### Constraint Types

* `hasAllValid`: Disables the action when there are invalid records
* `hasSelection`: Disables the action when no records are selected (Sheet actions only)
* `hasData`: Disables the action when there are no records

#### Mode Types

* `foreground`: Prevents interacting with the entire resource until complete
* `background`: Runs in the background without blocking the UI
* `toolbarBlocking`: Disables the sheet-level toolbar and column header menus while allowing manual record entry

### Usage

An Action with all of the above properties would look like this:

```javascript
{
  operation: 'my-action',
  label: 'My Action',
  primary: true,
  confirm: true,
  description: 'Are you sure you want to run this action?',
  constraints: [{ type: 'hasAllValid' }, { type: 'hasSelection' }],
  mode: 'foreground',
  tooltip: 'Click to run action'
}
```

## Input Forms

When initiating an action, there may be instances where additional information is required from the end user to successfully complete the intended task. For example, you might want to enable users to specify the name of the file they intend to export. In such cases, if you configure input fields for your action, a secondary dialog will be presented to the end user, prompting them to provide the necessary information. Once the user has entered the required details, they can proceed with the action seamlessly.

### Configuration

| Parameter | Type            | Required | Description                                                  |
| --------- | --------------- | -------- | ------------------------------------------------------------ |
| `type`    | `string`        | ✓        | The type of the input form. Currently accepts: `simple`      |
| `fields`  | `array[object]` | ✓        | An array of field objects representing the input form fields |

### Fields

| Parameter      | Type            | Required | Description                                                      |
| -------------- | --------------- | -------- | ---------------------------------------------------------------- |
| `key`          | `string`        | ✓        | The key for the field                                            |
| `label`        | `string`        | ✓        | The label for the field                                          |
| `type`         | `string`        | ✓        | Field type: `string`, `textarea`, `number`, `boolean`, or `enum` |
| `defaultValue` | `string`        | -        | The default value for the field                                  |
| `description`  | `string`        | -        | A description of the field                                       |
| `config`       | `object`        | -        | Configuration options for the field (required for `enum` type)   |
| `constraints`  | `array[object]` | -        | An array of constraints for the field                            |

### Config (for enum fields)

| Parameter | Type            | Required | Description                       |
| --------- | --------------- | -------- | --------------------------------- |
| `options` | `array[object]` | ✓        | An array of options for the field |

### Options

| Parameter     | Type     | Required | Description                                         |
| ------------- | -------- | -------- | --------------------------------------------------- |
| `value`       | `string` | ✓        | The value or ID of the option                       |
| `label`       | `string` | -        | A visual label for the option                       |
| `description` | `string` | -        | A short description of the option                   |
| `meta`        | `object` | -        | An arbitrary JSON object associated with the option |

### Field Constraints

| Parameter | Type     | Required | Description                                           |
| --------- | -------- | -------- | ----------------------------------------------------- |
| `type`    | `string` | ✓        | The type of constraint. Currently accepts: `required` |

### Usage

First, configure your action to have an `inputForm` on your Blueprint. The form will appear once the action button is clicked.
```javascript
actions: [
  {
    operation: "submitActionFg",
    mode: "foreground",
    label: "Submit data elsewhere",
    type: "string",
    description: "Submit this data to a webhook.",
    primary: true,
    inputForm: {
      type: "simple",
      fields: [
        {
          key: "priority",
          label: "Priority level",
          description: "Set the priority level.",
          type: "enum",
          defaultValue: "80ce8718a21c",
          config: {
            options: [
              {
                value: "80ce8718a21c",
                label: "High Priority",
                description:
                  "Setting a value to High Priority means it will be prioritized over other values",
              },
            ],
          },
          constraints: [
            {
              type: "required",
            },
          ],
        },
      ],
    },
  },
];
```

Next, listen for a `job:ready` and filter on the `job` you'd like to process. Grab the data entered in the form from the job itself and leverage it as required for your use case.

```typescript
import api from "@flatfile/api";

export default async function (listener) {
  listener.on(
    "job:ready",
    { job: "workbook:submitActionFg" },
    async (event) => {
      const { jobId } = event.context;
      try {
        await api.jobs.ack(jobId, {
          info: "Acknowledging job",
          progress: 1,
        });

        // retrieve the input entered by the user
        const { data: job } = await api.jobs.get(jobId);
        const input = job.input;
        console.log({ input });

        // do something with the input...

        await api.jobs.complete(jobId, {
          outcome: {
            message: "Action was successful",
          },
        });
        return;
      } catch (error) {
        console.error(error);
        await api.jobs.fail(jobId, {
          outcome: {
            message: "Action failed",
          },
        });
        return;
      }
    }
  );
}
```

## Constraints

### Usage

#### Workbook & Sheet Actions

1. Adding a `hasAllValid` constraint on an Action will disable a Workbook Action when there are invalid records.
2. Adding a `hasData` constraint on an Action will disable a Workbook Action when there are no records.
```javascript actions: [ { operation: 'submitActionFg', mode: 'foreground', label: 'Submit data elsewhere', description: 'Submit this data to a webhook.', primary: true, constraints: [{ type: 'hasAllValid'},{ type: 'hasData' }] }, {...} ], ``` #### Sheet Actions Only Adding a constraint of `hasSelection` on an Action will disable a Sheet Action when no records in the Sheet are selected. ```javascript sheets : [ { name: "Sheet Name", actions: [ { operation: 'duplicate', mode: 'background', label: 'Duplicate selected names', description: 'Duplicate names for selected rows', constraints: [{ type: 'hasSelection' }], primary: true, }, {...} ] } ] ``` ## Messages Add custom messages to actions, tailored according to their state: * Error * Info These messages will be displayed as tooltips when users hover over an action, providing context-specific text that corresponds to the action's current state. When an error message is present on an action, the action will be disabled. ### Usage Simply add a messages property to your action configuration. This property should be an array of objects, each specifying a message type and its content. ```javascript actions: [ { operation: 'duplicate', mode: 'background', label: 'Duplicate selected names', description: 'Duplicate names for selected rows', messages: [ { type: 'error', content: 'This is an error message' }, ], primary: true, }, {...} ] ``` # Using Record Hooks Source: https://flatfile.com/docs/guides/using-record-hooks Process data with Record Hooks **Hooks** are concise functions that automatically **re-format**, **correct**, **validate**, and **enrich** data during the data import process. These hooks can be executed on a complete record, or row, of data using methods on the `FlatfileRecord` class. Record-level hooks have access to all fields in a row and should be utilized for operations that require access to multiple fields or when creating a new field. 
## Getting started The `FlatfileRecord` class provides methods to manipulate and interact with a record in the Flatfile format, including setting values, retrieving values, adding information messages, performing computations, validating fields, and converting the record to JSON format. Use the [Record Hook plugin](/plugins/record-hook) to listen for updates to data records and respond with three types of record hooks: `compute`, `computeIfPresent`, and `validate`. These hooks are available as methods on the `FlatfileRecord` class. For complete installation instructions, configuration options, and detailed examples, see the [Record Hook plugin documentation](/plugins/record-hook). ## Record Hook Types Record hooks provide three main operations for data processing: ### `compute` Transforms field values based on the original value or other fields in the record. Always runs, even when no initial value is set. Use this for generating new values or creating computed fields. ### `computeIfPresent` Similar to `compute`, but only runs when the field has an initial value. Ideal for transformations that might fail on empty values, such as string operations or mathematical calculations. ### `validate` Validates field values against custom conditions and marks fields as invalid with error messages when they fail validation. Use this for complex validation rules beyond built-in field constraints. For detailed syntax, parameters, and implementation examples of these record hooks, see the [Record Hook plugin documentation](/plugins/record-hook). 
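As a minimal sketch of the `validate` pattern, the check itself can live in a plain function, which keeps it easy to test outside a Listener. The sheet slug and field name in the commented wiring below are assumptions, not part of your actual Blueprint:

```typescript
// Pure validation logic: returns an error message, or null if the
// value is acceptable. Purely illustrative, not part of the SDK.
function validateEmail(value: string | null): string | null {
  if (value === null || value.trim() === "") return "Email is required";
  if (!value.includes("@")) return "Invalid email address";
  return null;
}

// Hypothetical wiring with the Record Hook plugin:
//
//   import { recordHook } from "@flatfile/plugin-record-hook";
//
//   listener.use(
//     recordHook("contacts", (record) => {
//       const error = validateEmail(record.get("email") as string | null);
//       if (error) record.addError("email", error);
//       return record;
//     })
//   );
```

Factoring hooks this way also makes it straightforward to reuse the same rule in `compute`-style transformations or server-side checks.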
## Record Messages The `FlatfileRecord` class provides methods to attach contextual messages to fields: * **`addInfo()`**: Adds informational messages in a tooltip to help users understand transformations or provide context * **`addWarning()`**: Adds warning messages to alert users of potential issues * **`addError()`**: Marks fields as invalid with error messages, blocks [Actions](/core-concepts/actions) with the `hasAllValid` constraint These methods accept either a single field name or an array of field names, along with a message string. For detailed syntax and examples, see the [Record Hook plugin documentation](/plugins/record-hook). ## Accessing External APIs Record hooks can integrate with external APIs to enrich, validate, or transform data in real-time. This is useful for: * **Data Enrichment**: Fetching additional information from external services * **Real-time Validation**: Verifying data against external systems * **Data Transformation**: Using external services to process or format data * **Webhook Integration**: Sending processed data to external systems ### Common Use Cases * **GET Requests**: Retrieve data from external APIs to enrich records * **POST Requests**: Send record data to webhooks or external systems for processing * **Authentication**: Validate credentials or tokens against external services * **Geocoding**: Convert addresses to coordinates using mapping services For complete examples of API integration patterns, including error handling and async processing, see the [Record Hook plugin documentation](/plugins/record-hook). ## getLinks The `getLinks` method is a feature of the FlatfileRecord class. When a field in your record is of the Reference Field type and links to another record, getLinks can fetch those linked fields for you. ### Usage When processing a record, you may find references to a related record. To retrieve the fields from the related record, use the `getLinks` method. 
Provide the key of the Reference Field (the part of the record that holds the reference to the other record), like this:

```javascript
const relatedRecords = record.getLinks("referenceFieldKey");
```

The `getLinks` method returns an array containing all fields from the linked record associated with the provided `referenceFieldKey`. If there is no linked record associated with this field, an empty array is returned.

### Benefits

Using `getLinks` provides access to all related information. It's particularly useful when you want to perform operations similar to VLOOKUPs in spreadsheets, or when you need to compare data across referenced fields. For instance, you could use `getLinks` to fetch all the fields related to a particular record and enrich your data, or compare the related records for validation. With `getLinks`, processing related datasets becomes much more manageable in Flatfile. This method provides an effective way to navigate, enrich, and validate your data.

## Deleting records

There are primarily two use cases for deleting records:

1. Deleting a subset of records
2. Deleting all records in a sheet

### Deleting a subset of records

To delete a subset of records, first import the `@flatfile/api` package, then use the `api.records.delete()` helper method. This method takes in an array of record IDs and deletes them from the sheet.

```javascript
await api.records.delete(sheetId, { ids: [...] });
```

### Deleting all records in a sheet

To clear an entire sheet of its records, set up a bulk delete job. This task will comprehensively wipe out every record on the specified sheet. Check out our [jobs documentation](/core-concepts/jobs).
```javascript
await api.jobs.create({
  type: "workbook",
  operation: "delete-records",
  trigger: "immediate",
  source: workbookId,
  config: {
    sheet: sheetId,
    filter: "all",
  },
});
```

* [Clone the Handling Data example in TypeScript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/typescript/handling-data/index.ts)
* [Clone the Handling Data example in JavaScript](https://github.com/FlatFilers/flatfile-docs-kitchen-sink/blob/main/javascript/handling-data/index.js)

# Metadata

Source: https://flatfile.com/docs/guides/utilizing-metadata

Store descriptive information or data that provides additional context

Metadata allows you to store and retrieve additional data about any Flatfile resource without exposing it to end users. This key-value object provides a flexible way to attach custom information, track state, store references, or add contextual data to your resources.

## Universal Usage

Metadata can be added during resource creation or updated later using the Flatfile API. The metadata object accepts any valid JSON data and is accessible in all listeners and webhooks.

```javascript
await api.spaces.update(spaceId, {
  metadata: {
    userId: "user123",
    companyId: "company456",
    customField: "any value"
  }
});
```

## Resource Types

**Environment** - Store deployment details, version info, or environment state\
**Space** - Track user IDs, company information, or session data\
**Workbook** - Add expiration dates, processing flags, or workflow state\
**Record** - Store computed values, external IDs, or transformation flags\
**Field** - Define formatting rules, validation context, or display preferences

# Welcome to Flatfile

Source: https://flatfile.com/docs/index

If you're ready to dive right in, start here:

* Build your first data import experience in minutes using our powerful AI tooling
* Build your first data import experience with code using our robust, event-driven SDK

## What is Flatfile?
Flatfile is an AI-powered platform that eliminates the weeks and months typically spent migrating customer data, cutting project timelines and labor hours by over 70%. Build seamless data onboarding experiences with universal file support, effortless mapping, intelligent formatting, and collaborative resolution tools.