The DevXP engineering team hosts office hours every Thursday at 11 a.m. Pacific Time where we answer your questions live and help you get up and running with Flatfile. Join us!

Usage

A key component of Flatfile’s headless automation workflow is the automapping functionality provided by our Automap Plugin.

When a file is uploaded and extraction is complete, the data needs to be mapped to a destination Workbook. In a non-headless environment, a human might do this manually; in a headless environment, we'll automate the process.

We’ll use Automap to do this for us.

Install

First, add the plugin to your project.

npm install @flatfile/plugin-automap

Configure

Next, configure the plugin. Choose an accuracy level, a default target sheet, and a regular expression to match incoming filenames.

accuracy: either confident or exact
defaultTargetSheet: the name of the sheet you want to map to
matchFilename: a regular expression matched against incoming filenames

import { automap } from "@flatfile/plugin-automap";

export default function flatfileEventListener(listener) {
  listener.use(
    automap({
      accuracy: "confident",
      defaultTargetSheet: "Contact",
      matchFilename: /^.*contacts\.csv$/,
      onFailure: (event) => {
        console.error(
          `Failed to automap! Please visit https://spaces.flatfile.com/space/${event.context.spaceId}/files?mode=import to manually import file.`
        );
      },
    })
  );
}

Note that you can also set debug to true for helpful error messages during development.

Deploy and watch it work

Add your configured listener to your project and deploy it to Flatfile.

> npx flatfile deploy

✔  Code package compiled to .flatfile/build.js
✔  Code package passed validation
✔  Environment "Default" selected
✔  Event listener deployed and running on your environment "Default". us_ag_1234


Automate with headless

Learn how to configure Flatfile to deliver a completely automated data import experience.


File feeds are one of the most common ways to exchange batches of data on a regular basis. One key challenge for a traditional feed, however, is its inflexibility. If anything changes about the way source data is extracted, or the destination schema is configured, the feed is all but guaranteed to break.

With Flatfile’s powerful headless data import capabilities, you can seamlessly integrate a headless but adaptable data connection into your system.

How it works

You can achieve a fully automated data exchange between systems in just three steps:

Automate ingress
Implement a flexible data pipeline that seamlessly uploads files to Flatfile without manual intervention, streamlining the process of data intake.

Flatfile processing
Within the Flatfile Platform, build a comprehensive configuration that enables automated extraction, mapping, validation, and transformation of your data.

Automate egress
Finally, leverage Flatfile's event-driven system to automatically export the processed and transformed data from Flatfile to the designated destination system, eliminating the need for manual exporting and ensuring a continuous flow of data to the intended location.

Automate ingress

Data comes from many sources, and Flatfile can handle them all. Set up automatic feeds via cron job, cloud function, or any other method that works for your business.

All ingress processes will leverage either our API or our SDKs.
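For example, a nightly feed could be scheduled with a cron-style job. The sketch below assumes the node-cron package and a hypothetical uploadToFlatfile helper (sketched in the upload step below); adapt it to whatever scheduler your system already uses.

import cron from "node-cron";
// Hypothetical helper that streams a local file into Flatfile (see the upload step below).
import { uploadToFlatfile } from "./uploadToFlatfile";

// Push the latest export to Flatfile every night at 2:00 a.m.
cron.schedule("0 2 * * *", async () => {
  await uploadToFlatfile("./exports/contacts.csv");
});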

First, configure an ingress destination. Our files will be contained in a Space. You can think of a Space as a micro-application; each one has its own database and configuration.


Our deployed agent will automatically configure a Space. For more information on how to configure a Space, see our Dynamic Configurations.
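
As a minimal sketch, assuming the @flatfile/plugin-space-configure plugin and a single placeholder sheet, a Space-configuring listener might look like this:

import { configureSpace } from "@flatfile/plugin-space-configure";

export default function flatfileEventListener(listener) {
  listener.use(
    configureSpace({
      workbooks: [
        {
          name: "Contacts",
          sheets: [
            {
              name: "Contact",
              slug: "contact",
              fields: [
                { key: "firstName", type: "string", label: "First Name" },
                { key: "lastName", type: "string", label: "Last Name" },
                { key: "email", type: "string", label: "Email" },
              ],
            },
          ],
        },
      ],
    })
  );
}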

Next, we’ll upload our file to Flatfile.
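
A minimal sketch, assuming the @flatfile/api client (authenticated via the FLATFILE_API_KEY environment variable) and placeholder Space and Environment IDs:

import fs from "fs";
import api from "@flatfile/api";

// Hypothetical helper: stream a local file into a Flatfile Space.
// Replace the placeholder IDs with your own Space and Environment IDs.
export async function uploadToFlatfile(filePath) {
  const stream = fs.createReadStream(filePath);

  await api.files.upload(stream, {
    spaceId: "us_sp_your_space_id",
    environmentId: "us_env_your_environment_id",
  });
}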


Flatfile Processing

Extract incoming data

Incoming CSV files are automatically extracted to a Workbook.

However, if you’re sending a different file type, you’ll need to configure a listener to handle extraction.

See our Extractor Plugins for plug-and-play options, or build your own.
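
For example, to handle incoming Excel files you might register the XLSX extractor (a sketch assuming the @flatfile/plugin-xlsx-extractor package):

import { ExcelExtractor } from "@flatfile/plugin-xlsx-extractor";

export default function flatfileEventListener(listener) {
  // Automatically extract sheets from incoming .xlsx uploads.
  listener.use(ExcelExtractor());
}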

Automate mapping

Once our files are extracted, we'll need to automate the mapping of our data according to our schema. Our Automap Plugin will automatically map the data, matching based on configurable options.

Validate & Transform

Similarly, validation and transformation operations can run immediately on records:* events with Record Hooks. See our Data Handling guide for more information.
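
As a minimal sketch, assuming a sheet with the slug contact and an email field, a Record Hook could normalize and validate each record as it is processed:

import { recordHook } from "@flatfile/plugin-record-hook";

export default function flatfileEventListener(listener) {
  listener.use(
    recordHook("contact", (record) => {
      // Normalize the email field and flag missing values.
      const email = record.get("email");
      if (!email) {
        record.addError("email", "Email is required");
      } else {
        record.set("email", String(email).toLowerCase());
      }
      return record;
    })
  );
}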

Automate egress

Now that we’ve leveraged the power of the Flatfile platform to extract, map, validate, and transform our data, we can automate the egress of our data to our destination system.

We’ll have our listener filter and run our egress logic on commit:completed. This is an event specifically designed to signal the end of all processing tasks.

To enable the commit:completed event, add a settings parameter in your Sheet that includes trackChanges: true. This has the effect of disabling actions on both Sheets and Workbooks until any pending commits have been completed. See Blueprint sheet settings for more information.
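
For example, the relevant portion of a Sheet definition might look like the sketch below (the field list is a placeholder):

// Sheet definition with change tracking enabled, so commit:completed fires
// once all pending commits on the sheet have finished.
const contactSheet = {
  name: "Contact",
  slug: "contact",
  settings: {
    trackChanges: true,
  },
  fields: [{ key: "email", type: "string", label: "Email" }],
};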

Here’s an example where we automate emailing the processed data.
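
The sketch below assumes the @flatfile/api client and a hypothetical emailProcessedData helper; swap the helper for your own delivery logic (nodemailer, SFTP, a webhook, and so on).

import api from "@flatfile/api";
// Hypothetical helper module for your delivery mechanism.
import { emailProcessedData } from "./emailProcessedData";

export default function flatfileEventListener(listener) {
  // Run egress only after all processing for a commit has finished.
  listener.on("commit:completed", async (event) => {
    const { sheetId } = event.context;

    // Fetch the fully processed records from the sheet.
    const { data } = await api.records.get(sheetId);

    // Hand the records off to the destination system.
    await emailProcessedData("feeds@example.com", data.records);
  });
}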


Egress Jobs may need to leverage Flatfile Secrets as credentials for outgoing connections. See our Secrets Guide for more information.
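
For instance, a sketch assuming a secret named EMAIL_API_KEY has already been created in your Space or Environment:

export default function flatfileEventListener(listener) {
  listener.on("commit:completed", async (event) => {
    // Retrieve a stored credential at runtime instead of hardcoding it.
    const emailApiKey = await event.secrets("EMAIL_API_KEY");
    // ...use emailApiKey to authenticate the outgoing connection
  });
}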

To pull a working demo of a headless integration, check out our Example Projects.

Additional paths

To find the additional Flatfile integration paths that may work better for your business, explore our other use cases: